Play anywhere in AR. On both Android and iOS.

Built with AR Foundation, bARpong lets you play your favorite beer-related cup and ball game anywhere you want, on both Android and iOS.


Accurate physics. Simulated in real time.

Making use of Unity's new data-oriented tech stack, we can simulate realistic physics and collisions. The ball behaves just as it does in real life.
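
To give a sense of what this looks like in code, here is a minimal sketch (not the project's actual source) of how a throw could be expressed with Unity Physics, the physics engine in the data-oriented tech stack: write a velocity into the ball's PhysicsVelocity component and let the engine handle gravity, bounces, and collisions. The ThrowRequest component and the system name are hypothetical.

    using Unity.Collections;
    using Unity.Entities;
    using Unity.Mathematics;
    using Unity.Physics;

    // Hypothetical component: the throw velocity computed from the player's swipe.
    public struct ThrowRequest : IComponentData
    {
        public float3 Velocity;
    }

    public partial class ThrowBallSystem : SystemBase
    {
        protected override void OnUpdate()
        {
            var ecb = new EntityCommandBuffer(Allocator.Temp);

            // Copy the requested throw velocity into the Unity Physics velocity
            // component; from here on, the engine integrates gravity and resolves
            // collisions against the table and cups on its own.
            Entities.ForEach((Entity entity, ref PhysicsVelocity velocity, in ThrowRequest request) =>
            {
                velocity.Linear = request.Velocity;
                ecb.RemoveComponent<ThrowRequest>(entity);
            }).Run();

            ecb.Playback(EntityManager);
            ecb.Dispose();
        }
    }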


Project goals

The concept

Our goal for bARpong was to create an engaging implementation of beer pong in augmented reality (AR). When the COVID-19 pandemic prevented us from playing the real thing with our friends, we wanted to create a version that could be played without the risk of getting sick. The fact that there's no mess in AR is just a bonus.

The technology

The main technical aspects of the project are the AR graphics and the physics simulation. The lighting of the AR scene adapts to the light around you in the real world, and on iOS devices, the AR content is occluded by people passing in front of it. The physics are simulated in real time, enabling realistic interactions between the ball, the table, and the cups.
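
As an illustration, light estimation with AR Foundation roughly follows the pattern below: subscribe to the camera manager's frame events and copy the estimated brightness and colour temperature onto a directional light. This is a simplified sketch rather than bARpong's actual code; the field names come from AR Foundation's ARCameraFrameEventArgs API.

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class EstimatedLight : MonoBehaviour
    {
        [SerializeField] ARCameraManager cameraManager;
        [SerializeField] Light mainLight;

        void OnEnable()
        {
            mainLight.useColorTemperature = true;
            cameraManager.frameReceived += OnFrameReceived;
        }

        void OnDisable()
        {
            cameraManager.frameReceived -= OnFrameReceived;
        }

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            // Each estimate is nullable: the platform only fills in the values
            // it supports, so check before applying them to the scene light.
            if (args.lightEstimation.averageBrightness.HasValue)
                mainLight.intensity = args.lightEstimation.averageBrightness.Value;

            if (args.lightEstimation.averageColorTemperature.HasValue)
                mainLight.colorTemperature = args.lightEstimation.averageColorTemperature.Value;
        }
    }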


The team

These are the people behind bARpong, and what each of them worked on.

Fredrik Lundkvist

Shaders, Physics, AR Onboarding, Light Estimation, Project Architecture, Project Management.

Hannes Bennet

Modeling, Materials, Audio Programming, Collision Detection, Networking, Android Testing.

Jesper Lundqvist

Mobile Interactions, AR Graphics, iOS Testing, Visual Effects.

Mattias Larsson

User Interface, Networking, Game State Management, Game Design, Android Testing.


Challenges and obstacles

Multiplayer

From the outset, we intended to eventually make bARpong a multiplayer game. To simplify development, we decided to begin with a single-player implementation and add online multiplayer once the core game was done. Unfortunately, adding networking turned out to be a much bigger task than we had expected, and we decided to use the little time we had left to polish the single-player experience instead.

Data-oriented tech stack

As with any mobile game, we knew that performance would be critical for this project; no matter how good a game is, playing at 15 fps is never fun. Because of this, we decided to use Unity's data-oriented tech stack (DOTS), whose promise of better thread utilisation suggested better performance on weaker devices. Sadly, the technology was much less mature than our initial exploration had led us to believe. Many things, such as audio systems and framerate-independent physics, became much harder to implement than they would have been with the standard Unity workflow, demanded a new way of thinking from us, and took more time than expected.
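
For the curious, the usual approach to framerate-independent logic in DOTS looks roughly like this: place the systems that touch the simulation in the fixed-step group, so they tick at a fixed timestep instead of once per rendered frame. The system below is a hypothetical sketch, not code from the project.

    using Unity.Entities;

    // Systems in FixedStepSimulationSystemGroup run at the group's fixed
    // Timestep regardless of rendering framerate, so the ball's trajectory
    // is the same at 15 fps as at 60 fps.
    [UpdateInGroup(typeof(FixedStepSimulationSystemGroup))]
    public partial class BallGameplaySystem : SystemBase
    {
        protected override void OnUpdate()
        {
            // Inside this group, the reported delta time is the fixed timestep,
            // not the variable frame time.
            float dt = World.Time.DeltaTime;
            // ... apply spin, check for cup hits, etc. ...
        }
    }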


Lessons learned

Over the course of this project, we've learned a great deal. We've learned how to develop augmented reality applications in Unity with AR Foundation, how to simulate physics with the data-oriented tech stack, how to design and program mobile interactions, and how to work with shader programming, networking, and advanced AR techniques such as light estimation and reflection probes. And that's just the tip of the iceberg! We've also learned some non-technical things, mostly related to managing and planning software projects; we've gotten hands-on experience with agile development methods, user testing, and iterative development.