
From PC to Mobile: Development in Unity

Updated: Apr 17, 2020




Roughly a year ago, I started working on a game that takes Tinder and puts you in the role of the algorithm behind the screen, matching users together. Even before starting the project, before the mechanic was fully fleshed out, I knew that this game eventually had to be converted to mobile. It was the most logical step for the game, seeing as its "source material" is a mobile dating app. It was also a perfect opportunity to expand my UI/UX skills.


Pre-Production


Since I knew from the beginning that I wanted this game converted to mobile, I had limitations to consider when I first started developing it for PC. Only the mouse could be used for gameplay; the keyboard would hold only debug controls, to allow for faster testing throughout the creation process. I had to take into account the constraints of mobile: a single "button" (the touch), dragging in place of mouse movement, and no right-click. Establishing what I had to work with allowed me to create a mechanic that would convert over easily. I then spent the next three months developing the game, making sure that the experience was complete and the abstract narrative could be understood. Once that was done, I was able to focus on moving my PC game to my smartphone.


Conversion


Converting the game from PC to mobile is expected to take three months, from September to the beginning of December, as all the features are moved over and then given further polish.


Mechanics

As my mechanics were built with mobile in mind, the mechanical conversion went very smoothly. The most intensive part was installing the correct Android SDK and development tools. I had planned for it to take significantly more time, as I was under the impression that I would have to rewrite my interaction scripts to use Unity's mobile touch functions. Surprisingly, the mouse functions I had used during production worked on my phone automatically once the platform was switched over in the Build Settings.
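As a minimal sketch of why this works: Unity maps a single touch to the left mouse button by default (Input.simulateMouseWithTouches), so mouse-driven scripts like the one below carry over unchanged. The class and behavior here are illustrative, not the project's actual code, and assume a 2D scene with an orthographic main camera and a Collider on the draggable object.

```csharp
using UnityEngine;

// Illustrative mouse-based drag script of the kind that survived the port.
// On Android, a tap fires OnMouseDown and Input.mousePosition tracks the touch.
public class DragUser : MonoBehaviour
{
    private bool dragging;

    void OnMouseDown() { dragging = true; }   // fires on tap as well as click
    void OnMouseUp()   { dragging = false; }

    void Update()
    {
        if (!dragging) return;

        // Follow the pointer; with an orthographic camera the x/y are correct.
        Vector3 world = Camera.main.ScreenToWorldPoint(Input.mousePosition);
        transform.position = new Vector3(world.x, world.y, transform.position.z);
    }
}
```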

The UI buttons also worked automatically with the UI OnClick() system, which was a very pleasant surprise; I had thought they would have to be converted to the Event System, an involved process that would have taken days to complete.
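For context, this is roughly how such buttons are wired; the same OnClick() listener fires for a mouse click on PC and a tap on Android with no platform-specific code. The field and method names below are illustrative, not the project's own.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of standard Unity UI button wiring that ports to mobile as-is.
public class MenuWiring : MonoBehaviour
{
    [SerializeField] private Button uiButton; // assigned in the inspector

    void Awake()
    {
        // The listener responds to taps exactly as it does to clicks.
        uiButton.onClick.AddListener(OnButtonPressed);
    }

    void OnButtonPressed()
    {
        Debug.Log("Button pressed");
    }
}
```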


Landscape to Portrait

Admittedly, the most tedious part of this process was changing the game from a landscape view to a portrait view. Since the Unity 2018.2 Prefab system does not save the transform's position, most items in the gameplay layer had to be moved manually throughout all of the levels. The Copy/Paste Component Values function was very useful for this move, as it allowed me to maintain consistent level layouts between scenes. In practice, only the frame needed to be rotated; everything gameplay-related had to be pulled together, slightly enlarged, and in some cases deleted.
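Beyond repositioning in the editor, the build itself has to be held in portrait. The usual way is a one-time setting in Player Settings > Resolution and Presentation; the sketch below is simply the runtime equivalent, shown here as an assumption about how one might lock it in code.

```csharp
using UnityEngine;

// Locks the app to portrait at startup; mirrors the Player Settings option.
public class OrientationLock : MonoBehaviour
{
    void Awake()
    {
        Screen.orientation = ScreenOrientation.Portrait;

        // Prevent auto-rotation into landscape so gameplay stays vertical.
        Screen.autorotateToLandscapeLeft = false;
        Screen.autorotateToLandscapeRight = false;
    }
}
```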



Haptic Feedback

Initially, the plan for haptic feedback was to have a variety of patterns and strengths for different situations, but I quickly discovered that Unity's built-in haptic system is very limited. There is a single function that takes no parameters and provides one fixed vibration.

I attempted to use it in my UI when a button was selected, but the vibration was far too strong for that situation. Since I effectively had only one vibration to work with, I considered all the different areas of interaction that could benefit from additional feedback and chose the one that needed it most.

Even before I started the conversion process, I knew that negative feedback was lacking in the game; players had a hard time knowing when they had triggered the lose conditions. Because of this, I decided to reserve the vibration for negative feedback, since it is strong enough to connote an important event and rare enough not to be distracting.
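The single parameterless call in question is Handheld.Vibrate(). A sketch of hooking it to a lose condition might look like the following; the class and method names are illustrative, not the game's actual event hooks.

```csharp
using UnityEngine;

// Unity's built-in haptics are just this one call; there is no way to vary
// strength or duration without a native plugin.
public static class NegativeFeedback
{
    // Illustrative hook, called when the player triggers a lose condition.
    public static void TriggerLoseFeedback()
    {
#if UNITY_ANDROID || UNITY_IOS
        Handheld.Vibrate(); // one fixed, fairly strong vibration
#endif
        // Visual and audio feedback would be layered on alongside this.
    }
}
```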

So far, testing has shown this to be successful, as players now visibly react to the negative feedback.


Visuals

One of the key lessons I learned about mobile visuals is that simplicity reigns supreme. When I was initially developing the project, I thought the visuals were already incredibly simple: flat shapes, stark colors, and minimal gradients reserved exclusively for the UI. During testing for the conversion, it quickly became apparent that the visuals were still too complex and the gameplay too busy. On PC, the larger screen made it easy to overlook the complexity of the visuals, but once compressed onto the phone, even with everything crisp and clear, it was obvious they needed to be simplified further. Eliminate small, sharp edges; simplify the background; decrease the number of neutral particles; simplify as much as possible.


Slimming down the UI

Continuing the trend of simplification, the UI had to undergo the most radical update.



The first change was the elimination of the "debug log." Throughout each scene, the log kept a running list of the users' actions: when they logged on, when they logged off, when they matched correctly, when they got rejected, and when users were reported. It was an extra layer of feedback that was visually unnecessary, so it was cut from the mobile version.

The next change was moving the "Re-Pair" button. This feature is used whenever the player makes a mistake, whether that is pairing two users who would have been better matched with others or reporting an innocent user; it restarts the scene and gives the player another chance. Initially, it was grouped with the other UI elements at the top of the screen. On mobile, that became too crowded, so it was moved to the bottom of the screen on its own. The inspiration came from Pokémon Go, which places the most-used actions towards the bottom of the screen where the thumb naturally rests.
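Under the hood, a scene-restart button like Re-Pair can be as simple as reloading the active scene. The sketch below is an assumption about the implementation, with illustrative names; it is not the project's actual script.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Reloads the current scene to give the player a fresh attempt.
public class RePairButton : MonoBehaviour
{
    // Wired to the button's OnClick() in the inspector.
    public void OnRePair()
    {
        SceneManager.LoadScene(SceneManager.GetActiveScene().buildIndex);
    }
}
```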

With the simple vertical layout and split UI, the game started to feel more like a mobile game and less like a copy-paste of a PC game.


Testing

There are two avenues for mobile testing with Unity. The fastest is Unity Remote, which streams the game live from your computer to your mobile device over a wired connection. This allows quick tweaks to be tested without creating a full build and installing it onto the device. The downside is that, since it is a stream of the game, the visual quality drops significantly, and since the game is actually running on the PC, you are unable to test haptic feedback. It was fantastic to use during the initial conversion, as it let me quickly verify that the mobile input was working.

For public testing, the best method is to create a full build for the device, which allows complete testing of all the project's features. Since the haptics are limited and people tend to play mobile games on silent, visuals are key, so making sure they are at their highest quality in testing will give the best results.
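Since full-device testing means producing a new APK for every iteration, a small editor script can take some of the friction out of it. The sketch below is one possible convenience, not part of the project; the menu path, scene path, and output path are all hypothetical placeholders.

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Adds a one-click menu item that builds a test APK for the device.
public static class MobileBuild
{
    [MenuItem("Tools/Build Android Test APK")] // hypothetical menu path
    public static void BuildAndroid()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // hypothetical scene
            locationPathName = "Builds/TestBuild.apk",      // hypothetical output
            target = BuildTarget.Android,
            options = BuildOptions.None
        };
        BuildPipeline.BuildPlayer(options);
    }
}
#endif
```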


Going Forward


As the project is not complete yet, there are still many aspects of mobile development I will have to explore before releasing the game to the public. The first is development for different phones. The current project has been developed solely for a Samsung Galaxy S9, by virtue of that being my phone. I plan to support current models back to the Galaxy S7, as well as the S3. This will be an interesting lesson in how to properly develop for different phone screens, and potentially tablet screens.
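One common approach to supporting multiple screen sizes is to drive the UI from a single reference resolution via Unity's Canvas Scaler. The sketch below assumes the Galaxy S9's 1440x2960 screen as the reference, since that is the device the project was built on; the match value is a tuning assumption, not a figure from the project.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Scales the canvas from one reference resolution so the UI holds up
// across different phone screens.
[RequireComponent(typeof(CanvasScaler))]
public class UiScaling : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1440, 2960); // Galaxy S9
        scaler.matchWidthOrHeight = 1f; // favor height in a portrait game
    }
}
```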

The goal is to finish the project for a December public release.
