// transmission.log

Data Feed

> Intercepted signals from across the network — tech, engineering, and dispatches from the void.

1695 transmissions indexed — page 56 of 85

[ 2022 ]

20 entries
1101|blog.unity.com

Advanced Editor scripting hacks to save you time, part 1

On most of the projects I’ve seen, there are a lot of tasks developers go through that are repetitive and error-prone, especially when it comes to integrating new art assets. For instance, setting up a character often involves dragging and dropping many asset references, checking checkboxes, and clicking buttons: set the rig of the model to Humanoid, disable sRGB on the SDF texture, set the normal maps as normal maps, and the UI textures as sprites. In other words, valuable time is spent and crucial steps can still be missed.

In this two-part article, I’ll walk you through hacks that can help improve this workflow so that your next project runs smoother than your last. To further illustrate this, I’ve created a simple prototype – similar to an RTS – where the units of one team automatically attack enemy buildings and other units. With each scripting hack, I’ll improve one aspect of this process, whether that be the textures or the models. Here’s what the prototype looks like:

The main reason developers have to set up so many small details when importing assets is simple: Unity doesn’t know how you are going to use an asset, so it can’t know what the best settings for it are. If you want to automate some of these tasks, this is the first problem that needs to be addressed. The simplest way to find out what an asset is for and how it relates to others is by sticking to a specific naming convention and folder structure, such as:

Naming convention: We can append things to the name of the asset itself, so Shield_BC.png is the base color while Shield_N.png is the normal map.
Folder structure: Knight/Animations/Walk.fbx is clearly an animation, while Knight/Models/Knight.fbx is a model, even though they both share the same format (.fbx).

The issue with this is that it only works well in one direction. So while you might already know what an asset is for when given its path, you can’t deduce its path if only given information on what the asset does.
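Classifying an asset from its path under such a convention is plain string processing. A minimal sketch – the folder names, suffixes, and category names here are illustrative assumptions, not something the article prescribes:

```csharp
using System.IO;

public static class AssetConventions
{
    // Classifies an asset from its path under a convention like the one
    // above. Folder names and suffixes are assumptions for illustration.
    public static string Classify(string assetPath)
    {
        string name = Path.GetFileNameWithoutExtension(assetPath);
        string ext = Path.GetExtension(assetPath).ToLowerInvariant();

        if (ext == ".fbx")
            return assetPath.Contains("/Animations/") ? "animation" : "model";

        // Texture suffixes: _BC = base color, _N = normal map.
        if (name.EndsWith("_N")) return "normal-map";
        if (name.EndsWith("_BC")) return "base-color";
        return "texture";
    }
}
```

For example, `Classify("Knight/Animations/Walk.fbx")` yields "animation", while `Classify("Knight/Models/Knight.fbx")` yields "model".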
Being able to find an asset – for example, the material for a character – is useful when trying to automate the setup for some aspects of the assets. While this can be solved by using a rigid naming convention to ensure that the path is easy to deduce, it’s still susceptible to error. Even if you remember the convention, typos are common.

An interesting approach to solve this is by using labels. You can use an Editor script that parses the paths of assets and assigns them labels accordingly. As the labels are automated, it’s possible to figure out the exact label that an asset will have. You can even look up assets by their label using AssetDatabase.FindAssets.

If you want to automate this sequence, there is a very handy class called the AssetPostprocessor. The AssetPostprocessor receives various messages when Unity imports assets. One of those is OnPostprocessAllAssets, a method that’s called whenever Unity finishes importing assets. It gives you all the paths to the imported assets, providing an opportunity to process those paths. You can write a simple method, like the following, to process them:

In the case of the prototype, let’s focus on the list of imported assets – both to try and catch new assets, as well as moved assets. After all, as the path changes, we might want to update the labels.

To create the labels, parse the path and look for relevant folders, prefixes, and suffixes of the name, as well as the extensions. Once you have generated the labels, combine them into a single string and set them to the asset. To assign the labels, load the asset using AssetDatabase.LoadAssetAtPath, then assign its labels with AssetDatabase.SetLabels.

Remember, it’s important to only set labels if they have actually changed.
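A sketch of what that postprocessor could look like – ParseLabelFromPath is a hypothetical helper standing in for the project’s own convention:

```csharp
using System.Linq;
using UnityEditor;
using UnityEngine;

public class LabelPostprocessor : AssetPostprocessor
{
    static void OnPostprocessAllAssets(
        string[] importedAssets, string[] deletedAssets,
        string[] movedAssets, string[] movedFromAssetPaths)
    {
        foreach (string path in importedAssets)
        {
            // Hypothetical helper that parses folders/suffixes into a label.
            string[] labels = { ParseLabelFromPath(path) };

            var asset = AssetDatabase.LoadAssetAtPath<Object>(path);
            if (asset == null)
                continue;

            // Only write labels if they changed: SetLabels triggers a reimport.
            if (!AssetDatabase.GetLabels(asset).SequenceEqual(labels))
                AssetDatabase.SetLabels(asset, labels);
        }
    }

    // Convention-specific; returns e.g. "knight-animation" or "shield-basecolor".
    static string ParseLabelFromPath(string path) { return "art"; }
}
```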
Setting labels will trigger a reimport of the asset, so you don’t want this to happen unless it’s strictly necessary. If you add this check, the reimport won’t be an issue: Labels are set the first time you import an asset and saved in the .meta file, which means they’re also saved in your version control. A reimport will only be triggered if you rename or move your assets. With the above steps complete, all assets are automatically labeled, as in the example pictured below.

Importing textures into a project usually involves tweaking the settings for each texture. Is it a regular texture? A normal map? A sprite? Is it linear or sRGB? If you want to change the settings of an asset importer, you can use the AssetPostprocessor once more. In this case, you’ll want to use the OnPreprocessTexture message, which is called right before importing a texture. This allows you to change the settings of the importer.

When it comes to selecting the right settings for every texture, you need to verify what type of textures you’re working with – which is exactly why labels are key in the first step. With this information, you can write a simple TexturePreprocessor:

It’s important to ensure that you only run this for textures that have the art label (our own textures). You’ll then get a reference to the importer so that you can set everything up – starting with the texture size. The AssetPostprocessor has a context property from which you can determine the target platform. As such, you can make platform-specific changes, like setting the textures to a lower resolution for mobile:

Next, check the label to see if the texture is a UI texture, and set it accordingly:

For the rest of the textures, set the values to a default.
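A hedged sketch of such a TexturePreprocessor – the HasLabel helper, label names, and suffix checks are assumptions standing in for the project’s conventions:

```csharp
using UnityEditor;
using UnityEngine;

public class TexturePreprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        // HasLabel is a hypothetical helper that checks the labels
        // assigned in the previous step.
        if (!HasLabel(assetPath, "art"))
            return;

        var importer = (TextureImporter)assetImporter;
        importer.isReadable = true; // needed later for channel packing

        // Platform-specific settings via the import context.
        importer.maxTextureSize =
            context.selectedBuildTarget == BuildTarget.Android ? 1024 : 2048;

        if (HasLabel(assetPath, "ui"))
        {
            importer.textureType = TextureImporterType.Sprite;
        }
        else if (assetPath.EndsWith("_N.png"))
        {
            importer.textureType = TextureImporterType.NormalMap;
            importer.sRGBTexture = false;
        }
        else
        {
            importer.textureType = TextureImporterType.Default;
            // Only the albedo keeps sRGB enabled.
            importer.sRGBTexture = assetPath.EndsWith("_BC.png");
        }
    }

    static bool HasLabel(string path, string label) { return true; } // project-specific
}
```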
It’s worth noting that Albedo is the only texture that will have sRGB enabled:

Thanks to the above script, when you drag and drop the new textures into the Editor, they will automatically have the right settings in place.

“Channel packing” refers to the combination of diverse textures into one by using the different channels. It is common and offers many advantages. For instance, the value of the Red channel is the metallic and the value of the Green channel is its smoothness. However, combining all textures into one requires some extra work from the art team. If the packing needs to change for some reason (e.g., a change in the shader), the art team will have to redo all the textures that are used with that shader. As you can see, there’s room for improvement here.

The approach that I like to use for channel packing is to create a special asset type where you set the “raw” textures and generate a channel-packed texture to use in your materials. First, I create a dummy file with a specific extension and then use a Scripted Importer that does all the heavy lifting when importing that asset. This is how it works:

The importers can have parameters, such as the textures you need to combine.
From the importer, you can set the textures as a dependency, which allows the dummy asset to be reimported every time one of the source textures changes. This lets you rebuild the generated textures accordingly.
The importer has a version. If you need to change the way that textures are packed, you can modify the importer and bump the version. This will force a regeneration of all the packed textures in your project, and everything will be packed in the new way immediately.
A nice side effect of generating things in an importer is that the generated assets only live in the Library folder, so they don’t fill up your version control.

To implement this, create a ScriptableObject that will hold the created textures and serve as the result of the importer.
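The result holder can be tiny. One way TexturePack might look (the field names are assumptions):

```csharp
using UnityEngine;

// Holds the textures generated by the scripted importer. Instances only
// live in the Library folder, as products of the import.
public class TexturePack : ScriptableObject
{
    public Texture2D albedo; // RGB: albedo, A: player-color mask
    public Texture2D mask;   // R: metallic, G: smoothness
}
```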
In the example, I called this class TexturePack. With this created, you can begin by declaring the importer class and adding the ScriptedImporterAttribute to define the version and extension associated with the importer:

In the importer, declare the fields you want to use. They will appear in the Inspector, just as they do for MonoBehaviours and ScriptableObjects:

With the parameters ready, create new textures from the ones you have set as parameters. Note, however, that in the Preprocessor (from the previous section), we set isReadable to true to make this possible. In this prototype, you’ll notice two textures: the Albedo, which has the albedo in the RGB and a mask for applying the player color in the Alpha, and the Mask texture, which includes the metallic in the Red channel and the smoothness in the Green channel.

While this is perhaps outside the scope of this article, let’s look at how to combine the Albedo and the player mask as an example. First, check to see if the textures are set, and if they are, get their color data. Then set the textures as dependencies using AssetImportContext.DependsOnArtifact. As mentioned above, this will force the object to be recalculated if any of the textures end up changing.

You also need to create a new texture. To do this, get the size from the TexturePreprocessor that you created in the previous section so that it follows the preset restrictions:

Next, fill in all the data for the new texture. This could be massively optimized by using Jobs and Burst (but that would require an entire article on its own). Here we’ll use a simple loop:

Set this data in the texture:

Now you can create the method for generating the other texture in a very similar way. Once this is ready, create the main body of the importer.
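A sketch of the importer declaration and the packing loop described above – the extension, field names, and packing layout are illustrative assumptions:

```csharp
using UnityEditor;
using UnityEditor.AssetImporters;
using UnityEngine;

[ScriptedImporter(1, "texturepack")]
public class TexturePackImporter : ScriptedImporter
{
    // Source textures; shown in the Inspector like ScriptableObject fields.
    public Texture2D albedo;
    public Texture2D playerMask;

    public override void OnImportAsset(AssetImportContext ctx)
    {
        if (albedo == null || playerMask == null)
            return;

        // Reimport this asset whenever a source texture changes.
        ctx.DependsOnArtifact(AssetDatabase.GetAssetPath(albedo));
        ctx.DependsOnArtifact(AssetDatabase.GetAssetPath(playerMask));

        Color[] albedoPixels = albedo.GetPixels(); // requires isReadable
        Color[] maskPixels = playerMask.GetPixels();

        // Pack: RGB from the albedo, A from the mask's red channel.
        var packed = new Color[albedoPixels.Length];
        for (int i = 0; i < packed.Length; i++)
        {
            packed[i] = albedoPixels[i];
            packed[i].a = maskPixels[i].r;
        }

        var result = new Texture2D(albedo.width, albedo.height);
        result.SetPixels(packed);
        result.Apply();
        // The result is then registered with ctx.AddObjectToAsset /
        // ctx.SetMainObject in the main body of the importer.
    }
}
```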
In this case, we’ll only create the ScriptableObject that holds the results, create the textures, and set the result of the importer through the AssetImportContext. When you write an importer, all of the assets generated must be registered using AssetImportContext.AddObjectToAsset so that they appear in the Project window. Select a main asset using AssetImportContext.SetMainObject. This is what it looks like:

The only thing left to do is to create the dummy assets. As these are custom, you can’t use the CreateAssetMenu attribute. You must add them manually instead. Using the MenuItem attribute, specify the full path within the asset creation menu, Assets/Create. To create the asset, use ProjectWindowUtil.CreateAssetWithContent, which generates a file with the content you’ve specified and allows the user to input a name for it. It looks like this:

Finally, create the channel-packed textures.

Most projects use custom shaders. Sometimes they’re used to add extra effects, like a dissolve effect to fade out defeated enemies; other times, the shaders implement a custom art style, like toon shaders. Whatever the use case, Unity will create new materials with the default shader, and you will need to change them to use the custom shader. In this example, the shader used for units has two added features: the dissolve effect and the player color (red and blue in the video prototype). When implementing these in your project, you must ensure that all the buildings and units use the appropriate shader.

To validate that an asset matches certain requirements – in this case, that it uses the right shader – there is another useful class: the AssetModificationProcessor. With AssetModificationProcessor.OnWillSaveAssets, in particular, you’ll be notified when Unity is about to write an asset to disk.
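The registration and menu steps might be sketched like this – the menu entry beyond Assets/Create and the default file name are assumptions:

```csharp
using UnityEditor;
using UnityEngine;

public static class TexturePackMenu
{
    // Inside the importer's OnImportAsset, every generated object is
    // registered so it shows up in the Project window, e.g.:
    //   var pack = ScriptableObject.CreateInstance<TexturePack>();
    //   pack.albedo = packedAlbedo;
    //   ctx.AddObjectToAsset("pack", pack);
    //   ctx.AddObjectToAsset("albedo", packedAlbedo);
    //   ctx.SetMainObject(pack);

    // CreateAssetMenu doesn't apply to custom extensions, so add the
    // menu entry manually.
    [MenuItem("Assets/Create/Texture Pack")]
    static void CreateTexturePack()
    {
        // Generates the dummy file and lets the user type a name for it.
        ProjectWindowUtil.CreateAssetWithContent(
            "New Texture Pack.texturepack", string.Empty);
    }
}
```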
This will give you the opportunity to check whether the asset is correct and fix it before it’s saved. Additionally, you can “tell” Unity not to save the asset, which is useful when the problem you detect cannot be fixed automatically. To accomplish this, create the OnWillSaveAssets method:

To process the assets, check whether they are materials and whether they have the right labels. If they match, make sure they use the correct shader, as in the code below:

What’s convenient here is that this code is also called when the asset is created, meaning the new material will have the correct shader.

As a new feature in Unity 2022, we also have Material Variants. Material Variants are incredibly useful when creating materials for units. In fact, you can create a base material and derive the materials for each unit from there – overriding the relevant fields (like the textures) and inheriting the rest of the properties. This allows for solid defaults for our materials, which can be updated as needed.

Importing animations is similar to importing textures. There are various settings that need to be established, and some of them can be automated. Unity imports the materials of all FBX (.fbx) files by default. For animations, the materials you want to use will either be in the project or in the FBX of the mesh. The extra materials from the animation FBX appear every time you search for materials in the project, adding quite a bit of noise, so it’s worth disabling them.

To set up the rig – that is, choosing between Humanoid and Generic and, in cases where we are using a carefully set up avatar, assigning it – apply the same approach that was applied to textures. But for animations, the message you’ll use is AssetPostprocessor.OnPreprocessModel. This will be called for all FBX files, so you need to distinguish animation FBX files from model FBX files. Thanks to the labels you set up earlier, this shouldn’t be too complicated.
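One possible shape for this validator – the "unit" label and the shader name are placeholders for the project’s own conventions:

```csharp
using System.Collections.Generic;
using UnityEditor;
using UnityEngine;

public class ShaderValidator : AssetModificationProcessor
{
    static string[] OnWillSaveAssets(string[] paths)
    {
        var toSave = new List<string>(paths.Length);
        foreach (string path in paths)
        {
            var material = AssetDatabase.LoadAssetAtPath<Material>(path);
            if (material != null && HasLabel(path, "unit"))
            {
                // Fix the shader before the material hits disk.
                Shader expected = Shader.Find("Custom/Unit");
                if (expected != null && material.shader != expected)
                    material.shader = expected;
            }
            // Returning fewer paths than received "tells" Unity not to
            // save the omitted assets.
            toSave.Add(path);
        }
        return toSave.ToArray();
    }

    static bool HasLabel(string path, string label) { return true; } // project-specific
}
```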
The method starts much like the one for textures:

Next up, you’ll want to use the rig from the mesh FBX, so you need to find that asset. To locate it, use the labels once more. In the case of this prototype, animations have labels that end with “animation,” whereas meshes have labels that end with “model.” You can do a simple replacement to get the label for your model. Once you have the label, find your asset using AssetDatabase.FindAssets with “l:label-name.”

When accessing other assets, there’s something else to consider: It’s possible that, in the middle of the import process, the avatar has not yet been imported when this method is called. If this occurs, LoadAssetAtPath will return null and you won’t be able to set the avatar. To work around this issue, set a dependency on the path of the avatar. The animation will be imported again once the avatar is imported, and you will be able to set it then. Putting all of this into code will look something like this:

Now you can drag the animations into the right folder, and if your mesh is ready, each one will be set up automatically. But if there isn’t an avatar available when you import the animations, the project won’t be able to pick it up once it’s created. Instead, you’ll need to reimport the animation manually after creating it. This can be done by right-clicking the folder with the animations and selecting Reimport. You can see all of this in the sample video below.

Using exactly the same ideas from the previous sections, you’ll want to set up the models you are going to use. In this case, employ AssetPostprocessor.OnPreprocessModel to set the importer settings for each model. For the prototype, I’ve set the importer to not generate materials (I will use the ones I’ve created in the project) and checked whether the model is a unit or a building (by verifying the label, as always).
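Putting the animation setup into one hedged sketch – HasLabel and GetLabel stand in for project-specific label helpers:

```csharp
using UnityEditor;
using UnityEngine;

public class AnimationPreprocessor : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        // Labels like "knight-animation" / "knight-model" follow this
        // project's convention from the earlier labeling step.
        if (!HasLabel(assetPath, "animation"))
            return;

        var importer = (ModelImporter)assetImporter;
        importer.materialImportMode = ModelImporterMaterialImportMode.None;
        importer.animationType = ModelImporterAnimationType.Human;

        // "knight-animation" -> "knight-model", then look it up by label.
        string modelLabel = GetLabel(assetPath).Replace("animation", "model");
        string[] guids = AssetDatabase.FindAssets("l:" + modelLabel);
        if (guids.Length == 0)
            return;

        string avatarPath = AssetDatabase.GUIDToAssetPath(guids[0]);

        // The avatar may not be imported yet; depending on its path makes
        // this animation reimport once it is.
        context.DependsOnSourceAsset(avatarPath);

        var avatar = AssetDatabase.LoadAssetAtPath<Avatar>(avatarPath);
        if (avatar != null)
            importer.sourceAvatar = avatar;
    }

    static bool HasLabel(string path, string label) { return true; }   // project-specific
    static string GetLabel(string path) { return "knight-animation"; } // project-specific
}
```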
The units are set to generate an avatar, but avatar creation for the buildings is disabled, as the buildings aren’t animated.

For your project, you might want to set the materials and animators (and anything else you want to add) when importing the model. This way, the Prefab generated by the importer is ready for immediate use. To do this, use the AssetPostprocessor.OnPostprocessModel method. This method is called after a model has finished importing. It receives the generated Prefab as a parameter, which lets us modify the Prefab however we want.

For the prototype, I found the material and Animation Controller by matching the label, just as I located the avatar for the animations. With the Renderer and Animator in the Prefab, I set the material and the controller just as I would during normal gameplay. You can then drop the model into your project, and it will be ready to drop into any scene – except we haven’t set any gameplay-related components, which I’ll address in the second part of this blog.

With these advanced scripting tips, you’re just about game ready. Stay tuned for the next installment in this two-part Tech from the Trenches article, which will cover hacks for balancing game data and more. If you would like to discuss the article, or share your ideas after reading it, head on over to our Scripting forum. You can also connect with me on Twitter at @CaballolD.
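A sketch of such a postprocessor, with FindByLabel standing in for a project-specific lookup built on AssetDatabase.FindAssets:

```csharp
using UnityEditor;
using UnityEngine;

public class ModelPostprocessor : AssetPostprocessor
{
    void OnPostprocessModel(GameObject go)
    {
        // FindByLabel is a hypothetical helper wrapping
        // AssetDatabase.FindAssets("l:...") + LoadAssetAtPath.
        var material = FindByLabel<Material>(assetPath);
        var controller = FindByLabel<RuntimeAnimatorController>(assetPath);

        var renderer = go.GetComponentInChildren<Renderer>();
        if (renderer != null && material != null)
            renderer.sharedMaterial = material;

        var animator = go.GetComponentInChildren<Animator>();
        if (animator != null && controller != null)
            animator.runtimeAnimatorController = controller;
    }

    // Project-specific lookup; returns null when no labeled asset exists.
    static T FindByLabel<T>(string path) where T : Object { return null; }
}
```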

>access_file_
1102|blog.unity.com

It’s here: The complete overview of Unity toolsets and workflows for technical artists

As longtime Unity creators know, we regularly share updates and feature improvements, alongside tips and best practices, across multiple channels: on our blog, in the forums, and at events. This open, multichannel dialogue is a central part of our community’s roots. Sometimes, however, it’s nice to have a complete overview, or inventory, of what’s available for your specific expertise or area of interest. That’s what our new e-book, Unity for technical artists: Key toolsets and workflows, aims to provide for experienced and technical artists alike. While the original version of the e-book was based on the 2020 LTS, this latest iteration reflects what’s available in 2021 LTS.

The first of its kind, this new guide compiles detailed summaries of all Unity systems, features, and workflows for experienced technical artists. Use it as both a source of inspiration and a reference for accessing more advanced creator content to expand your skill set. Through compact yet visually rich sections, Unity for technical artists highlights the vast possibilities for graphical quality and breadth of style that you can realize with Unity. But inspiration is not our only goal here. Each section includes links to instructional, in-depth resources, so you can learn how to use the toolsets that are most important to you, your work, and your career path.

Based on feedback from the individual creators and professional teams we’ve worked with, technical artists are expected to have a broad understanding of what’s possible to achieve on various target platforms with the Digital Content Creation (DCC) tools and game engine they’re using. In light of this knowledge, they inform the art director and other artists of any limitations and opportunities surrounding the target hardware. Many technical artists address their team’s most complex artistic needs, from character rigging to writing shaders, or proposing new workflows and creation tools to accelerate their processes.
Overall, they play a critical role in ensuring that the visual quality of a game or other application meets the standard set by their team. Unity for technical artists spans a multitude of toolsets, pipelines, and workflows, reflecting the range of expertise required of technical artists.

The e-book serves as a useful resource for users who want to expand their skills in Unity. Maybe you’re a programmer looking to specialize in graphics programming, a designer who wants to refine game content by scripting interactivity in Unity, or an artist learning to create shaders either through scripting or with Shader Graph. Keep this e-book handy for onboarding new team members – those who have worked with Unity in a limited way previously, or those who’ve worked with a different engine entirely. This guide will help them pinpoint the Unity tools and related learning resources that can benefit their creative work.

Let’s take a look at some of the major sections in the e-book. The chapters on assets cover topics such as building a non-destructive asset pipeline, importing assets, roundtripping with DCC tools, and the Asset Database.

In this guide, we review the latest capabilities of the Universal Render Pipeline (URP) and the High Definition Render Pipeline (HDRP), as well as pointers on how to choose the best rendering path for your particular project. Other topics covered include dynamic resolution and upscaling methods. Additionally, we unpack the lighting workflows used to simulate Global Illumination (GI) with the Progressive CPU and GPU Lightmappers, as well as the differences between real-time GI, ray-traced GI, and Enlighten.

Unity provides a complete set of tools for building and designing rich and scalable 3D and 2D worlds.
These chapters dive into key workflows for grey-boxing levels with ProBuilder and Polybrush, while showcasing the latest iteration of the Terrain sculpting tools and sharing details on how to create sky, cloud, and fog visuals in URP and HDRP.

Visual Scripting comprises visual, node-based graphs that non-programmers can use to design final logic and create quick prototypes. An introduction to Unity’s Visual Scripting system explains how you can use it to define game logic for your Unity projects without writing traditional code.

An extensive appendix outlines the process of creating digital humans for the Unity demos The Heretic and Enemies. From data capture and processing to creating the skin, eyes, and hair visuals, this section discloses how such effects were achieved. There’s much more to discover in the e-book, including sections on the animation system, creating cutscenes and cinematics, and the 2D toolset.

We hope that you enjoy this latest technical guide. We encourage you to share your feedback on the forum. For more advanced content, you can browse our recently published How-to hub, which gathers Unity e-books, instructional articles, documentation, and more, all in one place.

>access_file_
1103|blog.unity.com

Level up your code with game programming patterns

If you have experience with object-oriented programming languages, then you’ve likely heard of the SOLID principles, MVP, singleton, factory, and observer patterns. Our new e-book highlights best practices for using these principles and patterns to create scalable game code architecture in your Unity project.

For every software design issue you encounter, a thousand developers have been there before. Though you can’t always ask them directly for advice, you can learn from their decisions through design patterns. By implementing common game programming design patterns in your Unity project, you can efficiently build and maintain a clean, organized, and readable codebase, which in turn creates a solid foundation for scaling your game, development team, and business.

In our community, we often hear that it can be intimidating to learn how to incorporate design patterns and principles, such as SOLID and KISS, into daily development. That’s why our free e-book, Level up your code with game programming patterns, explains well-known design patterns and shares practical examples for using them in your Unity project. Written by internal and external Unity experts, the e-book is a resource that can help expand your developer’s toolbox and accelerate your project’s success. Read on for a preview of what the guide entails.

Design patterns are general solutions to common problems found in software engineering. These aren’t finished solutions you can copy and paste into your code, but extra tools that can help you build larger, scalable applications when used correctly. By integrating patterns consistently into your project, you can improve code readability and make your codebase cleaner. Design patterns not only reduce refactoring and the time spent testing, they speed up onboarding and development processes. However, every design pattern comes with tradeoffs, whether that means additional structures to maintain or more setup at the beginning.
You’ll need to do a cost-benefit assessment to determine whether the advantage justifies the extra work required. Of course, this assessment will vary based on your project.

KISS stands for “keep it simple, stupid.” The aim of this principle is to avoid unnecessary complexity in a system, as simplicity helps drive greater levels of user acceptance and interaction. Note that “simple” does not equate to “easy.” Making something simple means making it focused. While you can create the same functionality without the patterns (and often more quickly), something fast and easy doesn’t necessarily result in something simple.

If you’re unsure whether a pattern applies to your particular issue, you might hold off until it feels like a more natural fit. Don’t use a pattern because it’s new or novel to you. Use it when you need it. It’s in this spirit that the e-book was created. Keep the guide handy as a source of inspiration for new ways of organizing your code – not as a strict set of rules for you to follow.

Now, let’s turn to some of the key software design principles. SOLID is a mnemonic acronym for five core fundamentals of software design. You can think of them as five basic rules to keep in mind while coding, to ensure that object-oriented designs remain flexible and maintainable. The SOLID principles were first introduced by Robert C. Martin in the paper Design Principles and Design Patterns.
First published in 2000, the principles described are still applicable today, including to C# scripting in Unity:

Single responsibility states that each module, class, or function is responsible for one thing and encapsulates only that part of the logic.
Open-closed states that classes must be open for extension but closed for modification; that means structuring your classes to allow new behavior without modifying the original code.
Liskov substitution states that derived classes must be substitutable for their base class when using inheritance.
Interface segregation states that no client should be forced to depend on methods it does not use. Clients should only implement what they need.
Dependency inversion states that high-level modules should not import anything directly from low-level modules. Both should depend on abstractions.

In the e-book, we provide illustrated examples of each principle with clear explanations for using them in Unity. In some cases, adhering to SOLID can result in additional work up front. You may need to refactor some of your functionality into abstractions or interfaces, but there is often a payoff in long-term savings.

The principles have dominated software design for nearly two decades at the enterprise level because they’re so well suited to large applications that scale. If you’re unsure about how to use them, refer back to the KISS principle. Keep it simple, and don’t try to force the principles into your scripts just for the sake of doing so. Let them work themselves into place organically, through necessity. If you’re interested in learning more, check out the SOLID presentation from Unite Austin 2017 by Dan Sagmiller of Productive Edge.

What’s the difference between a design principle and a design pattern? One way to answer that question is to consider SOLID as a framework for, or a foundational approach to, writing object-oriented code.
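As a small, hypothetical illustration of the open-closed principle in C# (not an example taken from the e-book) – new behavior is added by extending the abstraction, never by editing existing classes:

```csharp
using UnityEngine;

// Open-closed: callers depend on the abstraction, so adding a new damage
// effect means adding a class, not modifying existing code.
public abstract class DamageEffect
{
    public abstract float Modify(float baseDamage);
}

public class CriticalHit : DamageEffect
{
    public override float Modify(float baseDamage) => baseDamage * 2f;
}

public class ArmorReduction : DamageEffect
{
    // Flat reduction, clamped so damage never goes negative.
    public override float Modify(float baseDamage) => Mathf.Max(0f, baseDamage - 5f);
}
```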
While design patterns are solutions or tools you can implement to avoid everyday software problems, remember that they’re not off-the-shelf recipes – or, for that matter, algorithms with specific steps for achieving specific results. A design pattern can be thought of as a blueprint. It’s a general plan that leaves the actual construction up to you. For instance, two programs can follow the same pattern but involve very different code.

When developers encounter the same problem in the wild, many of them will inevitably come up with similar solutions. Once a solution is repeated enough times, someone might “discover” a pattern and formally give it a name. Many of today’s software design patterns stem from the seminal work Design Patterns: Elements of Reusable Object-Oriented Software by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. This book unpacks 23 such patterns identified in a variety of day-to-day applications.

The original authors are often referred to as the “Gang of Four” (GoF), and you’ll also hear the original patterns dubbed the GoF patterns. While the examples cited are mostly in C++ (and Smalltalk), you can apply their ideas to any object-oriented language, such as C#. Since the Gang of Four originally published Design Patterns in 1994, developers have established dozens more object-oriented patterns in a variety of fields, including game development.

While you can work as a game programmer without studying design patterns, learning them will help you become a better developer. After all, design patterns are labeled as such because they’re common solutions to well-known problems. Software engineers rediscover them all the time in the normal course of development. You may have already implemented some of these patterns unwittingly. Train yourself to look for them. Doing this can help you:

Learn object-oriented programming: Design patterns aren’t secrets buried in an esoteric StackOverflow post.
They are common ways to overcome everyday hurdles in development. They can inform you of how many other developers have approached the same issue – remember, even if you’re not using patterns, someone else is.
Talk to other developers: Patterns can serve as a shorthand when trying to communicate as a team. Mention the “command pattern” or “object pool” and experienced Unity developers will know what you’re trying to implement.
Explore new frameworks: When you import a built-in package or something from the Asset Store, inevitably you’ll stumble onto one or more patterns discussed here. Recognizing design patterns will help you understand how a new framework operates, as well as the thought process involved in its creation.

As indicated earlier, not all design patterns apply to every game application. Don’t go looking for them with Maslow’s hammer; otherwise, you might only find nails. Like any other tool, a design pattern’s usefulness depends on context. Each one provides a benefit in certain situations and also comes with its share of drawbacks. Every decision in software development comes with compromises. Are you generating a lot of GameObjects on the fly? Does it impact your performance? Can restructuring your code fix that? Be aware of these design patterns, and when the time is right, pull them from your gamedev bag of tricks to solve the problem at hand.

In addition to the Gang of Four’s Design Patterns, Game Programming Patterns by Robert Nystrom is another standout resource, currently available for free as a web-based edition. The author details a variety of software patterns in a no-nonsense manner.

In our new e-book, you can dive into the sections that explain common design patterns, such as the factory, object pool, singleton, command, state, and observer patterns, plus Model View Presenter (MVP), among others.
Each section explains the pattern along with its pros and cons, and provides an example of how to implement it in Unity so you can optimize its usage in your project. Unity already implements several established gamedev patterns, saving you the trouble of writing them yourself. These include:

Game loop: At the core of all games is an infinite loop that must function independently of clock speed, since the hardware that powers a game application can vary greatly. To account for computers of different speeds, game developers often need to use a fixed timestep (with a set frames-per-second) and a variable timestep where the engine measures how much time has passed since the previous frame. Unity takes care of this, so you don’t have to implement it yourself. You only need to manage gameplay using MonoBehaviour methods like Update, LateUpdate, and FixedUpdate.
Update: In your game application, you’ll often update each object’s behavior one frame at a time. While you can manually recreate this in Unity, the MonoBehaviour class does it automatically. Use the appropriate Update, LateUpdate, or FixedUpdate methods to modify your GameObjects and components on each tick of the game clock.
Prototype: Often you need to copy objects without affecting the original. This creational pattern solves the problem of duplicating and cloning an object to make other objects similar to itself. This way you avoid defining a separate class to spawn every type of object in your game. Unity’s Prefab system implements a form of prototyping for GameObjects. This allows you to duplicate a template object complete with its components. Override specific properties to create Prefab Variants, or nest Prefabs inside other Prefabs to create hierarchies. Use the special Prefab editing mode to edit Prefabs in isolation or in context.
Component: Most people working in Unity know this pattern. Instead of creating large classes with multiple responsibilities, build smaller components that each do one thing.
If you use composition to pick and choose components, you can combine them for complex behavior. Add Rigidbody and Collider components for physics, or a MeshFilter and MeshRenderer for 3D geometry. Each GameObject is only as rich and unique as its collection of components.

Both the e-book and a sample project on the use of design patterns are available now to download for free. Review the examples and decide which design pattern best suits your project. As you gain experience with them, you’ll recognize how and when they can enhance your development process. As always, we encourage you to visit the forum thread and let us know what you think of the e-book and sample.
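Of the patterns named above, the object pool is compact enough to sketch inline. The following is a minimal illustrative sketch in Unity C# – not the e-book’s implementation – where the `ProjectilePool` class name and `projectilePrefab` field are placeholder names:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal object pool sketch: reuse inactive instances instead of
// paying the cost of Instantiate/Destroy every frame.
public class ProjectilePool : MonoBehaviour
{
    [SerializeField] private GameObject projectilePrefab; // placeholder asset reference
    private readonly Stack<GameObject> inactive = new Stack<GameObject>();

    public GameObject Get()
    {
        // Reuse a pooled instance if one exists; otherwise create one.
        GameObject go = inactive.Count > 0 ? inactive.Pop()
                                           : Instantiate(projectilePrefab);
        go.SetActive(true);
        return go;
    }

    public void Release(GameObject go)
    {
        go.SetActive(false);   // hide it rather than destroying it
        inactive.Push(go);     // return to the pool for later reuse
    }
}
```

A caller would pair each `Get()` with a later `Release()` (for instance, when a projectile hits something or leaves the screen), which is what turns allocation churn into cheap reuse.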

>access_file_
1104|blog.unity.com

5 ways to level up your game development skills

Are you looking to improve your gamedev skills? You’ve come to the right place. Below, we’ve curated a selection of courses and projects that are popular with our developer community and guide you through engaging, gamified learning experiences. Read on to discover five ways to reach new heights with your next project, no matter your skill level.

This course is an exciting learning experience for intermediate creators, guiding you through the creation of your own game while emphasizing an inclusive design process focused on accessibility. We created Out of Circulation – a vertical slice of a point-and-click adventure game – as a case study, and we’ll explore how it came together as you work through the course tutorials. Take the course.

This Unity Learn project is the second part of Introduction to Visual Scripting, so complete that first if you haven’t already! Here, you’ll develop several visual scripts to add a new level to Clive the Cat, a Sokoban-style game that follows the titular cat as he navigates a maze-like crypt in search of his food dish. Along the way, you’ll learn intermediate and advanced techniques in visual scripting, including super units, events, and State Machines. Start the project.

In the John Lemon’s Haunted Jaunt project, you’ll discover how to create a stealth game as you work through 10 tutorials that explain the principles behind every step. This project is a great way for beginners to gain confidence in the Unity Editor and learn the foundations of game design. Start the project.

This intermediate course on Unity Learn guides you through the essentials of C# scripting.
Skills you’ll learn along the way include application scripting, programming theory, and code optimization, all of which will allow you to take your creativity to the next level. Take the course.

If you understand the Unity basics and want to try out VR development, the VR Beginner: The Escape Room project gives you the opportunity to explore and create simple immersive experiences supported by the XR Interaction Toolkit package. Take the project.

Looking for more ways to boost your gamedev know-how? Check out the full Unity Learn catalog and keep your eyes peeled for future courses offering guidance on new skills.

>access_file_
1105|blog.unity.com

Learn, connect, and celebrate at the Unity for Humanity Summit

The Unity for Humanity Summit is back for its third year on Wednesday, November 2, 2022. Join us for a virtual celebration of real-time 3D creators using Unity to change the world – registration is free and open to all.

This year’s Summit includes an inspiring keynote conversation with actor and activist Rosario Dawson, who will discuss her sustainable fashion and nonprofit work, why positivity breeds innovation, and how technology can bring us all together around the virtual fire.

In addition, you can attend a wealth of sessions led by artists, game developers, educators, nonprofit leaders, activists, and other changemakers across the core themes of sustainability, education, health and wellbeing, and inclusivity. Here are just a few of the exciting sessions on the agenda:

Players first: Accessibility insights for creators features Dave Evans, Jazmin Cano, and Lukáš Hosnedl in a conversation about prioritizing accessibility in game development and the future of accessible gaming.

From imagining to creating better worlds in RT3D brings together visionary creators Binh Minh Herbst, Katerina Cizek, Kathryn Evans, and Tony Patrick to discuss the process of translating ambitious visions into real-world action and impact.

Create with AR Live and Unity Essentials Live showcase hands-on tools and resources for getting started in Unity and developing mobile AR experiences, led by Unity Learn team members Joy Horvath and Thomas Winkley.

Activate your players: Nudging sustainable behavior and driving engagement provides insight into how game developers Hunter Bulkeley, Sheila Ndungu, Jude Ower, Galina Fedulova, and Jens Isensee are driving sustainability and empowering players to protect the planet through their games.

Leveraging digital twins for sustainability in the built environment spotlights how organizations across the nonprofit, public, and private sectors are using real-time 3D for positive impact, hosted by Dr. Max Mallia-Parfitt, Dr.
Sarah Whateley, and Ursula Smolka.

Crafting Heroes: Students building the future using HoloLens 2 with Deidre LaCour and Sean Wybrant takes you straight into the inspiring XR work students are creating.

Check out the full schedule and register now.

Conversations with Creators are unique opportunities to meet fellow creators, ask them your burning questions, and immerse yourself in their stories, projects, and practical advice. Topics for these live Q&A sessions include:

Addressing the youth mental health crisis with Cleo Barnett, co-creative director of Amplifier

Immersive storytelling and impact with Gone to Water team Cat Ross and Marin Vesely

The world of immersive theater with Ferryman Collective cofounders Deirdre V. Lyons and Stephen Butchko

Unity for Humanity technical support Q&A with Andy Ellis, software development consulting manager at Unity

Digital health and wellbeing with Gabriel G. Torres, creator of Haus of Dust

AR-tivism with Damien McDuffie, creator of Black Terminus

Personal storytelling for impact with Ondřej Moravec, Hana Blaha Šilarová, and Robin Pultera of Darkening

Applications for the Unity for Humanity 2023 Grant will open during the Summit, and we’ll be sharing tips for applying for the $500,000 available to bring your real-time 3D projects to life. Be sure to tune into the Unity for Humanity Grant Q&A session during the Summit to ask our program leads all of your questions, and sign up for the Social Impact mailing list to be notified of future grant announcements.

Register now and join us at the Unity for Humanity Summit on November 2 to connect with social impact creators from across the globe and be part of the movement working towards a more equitable, inclusive, and sustainable world.

>access_file_
1106|blog.unity.com

Metaverse Minute: 4 sales and marketing trends you can jump on now

With each evolution of communication, marketing and sales tactics have changed. The rise of print media gave us print ads, radio gave us radio ads, and this trend continued with television, podcasts, etc. So what does this mean for the metaverse? Think immersive, interactive, and customizable. In this edition of the Metaverse Minute, we break down some applications of Unity that could be a window into the future of marketing and sales techniques. Let’s see if we can sell you on them.

The challenge of convincing customers to buy increases with price, making big-ticket items a harder sell. For this reason, we see experiential metaverse technologies empowering luxury items and travel to sell themselves.

If the price is $450,000, it had better be a good pitch! This is one of the reasons Virgin Galactic hired global innovation company Seymourpowell to design the world’s first commercial spaceship. Since design and development of the spaceship took place during the pandemic, virtual reality was critical to the project. A digital twin of the cabin was created so prototyping and production of the spaceship could stay on schedule.

The usage of the digital twin didn’t end there. As Virgin Galactic began prepping for their marketing campaign, they realized the best way to sell seats was to let people sit in them. The team from Seymourpowell converted the digital twin into an experience, and this sales tactic did not disappoint.

Cars aren’t something that people typically buy more than one of, and once purchased they’re used for a long time before being replaced. This makes building a personal connection with customers a priority for automotive manufacturers. Visionaries 777 Ltd. created an experience for INFINITI that did just that. Customers start at an iPad station where they answer a set of personality questions and create a photo-based avatar of themselves. After choosing a setting, their avatars are placed next to a configured car best suited to their personality.
Users picture themselves in their future car with a 360º camera, snapping a selfie.

As a kid, did you imagine your “dream house”? Remember how fun it was to try and visualize the space? Ordering kitchen cabinets and furniture as an adult is much more complicated. Taking the right measurements and working around physical boundaries adds a dose of reality to what you once thought of as a fun game of imagination. Virtual technologies have the potential to recapture that childhood experience of limitless creativity for homeowners. For this reason, we love what Volumiq and VOXBOX are doing.

Imagine making that dream house plan once again and then converting it to a 3D blueprint with color and texture. Utopiq is working to make this possible. The team wants to simplify the process of home renovations into deciding on a shopping list, visualizing in 3D, and then placing your order.

The VOX brand has created their own solution for the products their sales teams are using in their showrooms. Customers are bringing their homes to the store. This makes recommending design decisions and products much easier and creates a compelling reason for purchases.

Games offer a different, and sometimes more compelling, way for brands to build loyalty. Rather than being static, games offer an interactive way for marketers to engage with customers and give them an experience that sticks with them.

This unique experience put together by Groove Jones demonstrates the power of Medtronic’s GI Genius™ and PillCam™ systems. The GI Genius™ intelligent endoscopy module is the first-to-market, computer-aided polyp detection system powered by artificial intelligence (AI). The PillCam™ COLON capsule endoscopy system enables direct visualization of the colon with a noninvasive capsule endoscopy procedure, supporting early detection of polyps.
This game educates potential customers on how AI-powered systems can improve the early detection of precancerous polyps along the lower digestive tract.

In an era where anyone can order anything online, what differentiates one brand from another? Online shopping has put immense pressure on retailers to develop a competitive edge. This is why Mitchell Harvey of Deckers told us his team turned to real-time 3D for product rendering. “There are no limitations to what you can explore to communicate your brand and offer an unforgettable experience to your consumer.”

The team from Smartpixels found the same thing when Church’s reached out to them to create a configurator for the 75th anniversary of their Consul shoe. If a product looks photorealistic online, you’re far more likely to buy it. If you’re able to customize a product to your tastes, that increases the likelihood of purchase even further. This held true as real-time 3D and photorealistic rendering allowed Church’s to make 35% more per purchase.

Are you using Unity for sales and marketing? If you’re using Unity to get people excited about your products, give us a shout on Twitter. Follow Unity for Digital Twins on Twitter, LinkedIn, and Instagram. For more details on Unity for Digital Twins, check out our recent demo or our new kickstarter package.

>access_file_
1107|blog.unity.com

Announcing the Higher Ed XR Innovation Grant recipients

The demand for extended reality (XR) talent is increasing rapidly, opening countless new doors for the next generation of metaverse creators. To adequately prepare tomorrow’s real-time 3D workforce, educators and schools need to be teaching these desirable skill sets to their students today. In pursuit of this goal, Unity Social Impact and Meta Immersive Learning have partnered to increase access to AR/VR hardware, high-quality educational content, and other resources that will help educators create or enhance innovative XR programs.

The Higher Ed XR Innovation Grant is one of the core components of this partnership, providing over $1 million in awards to higher-education institutions leveraging real-time 3D and immersive technology to make advances in teaching and learning, XR creation, and workforce development. Grantees were selected based on their proposals’ attention to inclusion, impact, viability, and innovation. Special consideration was given to institutions and programs that cater to or design innovative educational content for underserved learners.

Today, we’re thrilled to introduce the eight recipients of the Higher Ed XR Innovation Grant. A team of over 60 judges from Unity and Meta selected the winners from among 276 submissions.

“Now, more than ever, we have a responsibility to equip young people with the skills necessary for future jobs – providing them with learning that translates to earning,” says Jessica Lindl, vice president of social impact at Unity.
“I’m thrilled with the winners of the Higher Ed XR Innovation Grant and am confident that these institutions will continue to provide equitable access to education and workforce opportunities.”

Read on to learn how these forward-thinking projects are increasing access to quality real-time 3D education.

Arizona State University’s Center for Narrative and Emerging Media (NEM) in Los Angeles will open as a best-in-class teaching and research facility, focused on diversifying who can create and distribute narratives using emerging media technologies in the areas of arts, culture, and nonfiction. NEM will train and support storytellers, artists, journalists, entrepreneurs, and engineers who will build the stories, technologies, and policies of the future. This fall, ASU launched their flagship MA Narrative and Emerging Media program, a collaborative effort between the Herberger Institute for Design and the Arts and the Walter Cronkite School of Journalism and Mass Communication, centered around the development of a creative practice and critical understanding of emerging storytelling and immersive content creation. Funding from the Higher Ed XR Innovation Grant will support student production, virtual production, staff training, and research.

Country: United States of America

The vision of California Community Colleges is to ensure students from all backgrounds succeed in reaching their education and career goals, with emphasis on improving families’ incomes and communities’ workforces. To achieve this, California Community Colleges aim to provide educational programs that highlight inclusivity, diversity, and equity while minimizing logistical and financial barriers to success. Cañada College will partner with various employers and California Community College districts to enhance XR apprenticeship programs, K–14 curriculum development, and XR job training programs designed for dislocated workers, workforce board clients, and underemployed individuals.
Funds from the Higher Ed XR Innovation Grant will aid Cañada College in designing and sharing workforce readiness models for county education offices, community colleges and universities, and workforce training entities in California and throughout the U.S.

Country: United States of America

The Clarkson University Psychology Department plans to use their grant funding to develop a novel instructional tool that leverages both VR for accurate neuroanatomical renderings and modern pedagogical principles (such as social interaction and embodiment) to build an innovative and engaging neuroscience learning experience. By using this tool to enhance their psychology program’s neuroscience instruction and open-sourcing the tool for use at other institutions, Clarkson University hopes to positively impact psychology students – especially those from underrepresented backgrounds. And, since student training is integrated throughout the project, the development process will involve students from multiple departments, providing them with opportunities to work in VR, engage in usability testing, and learn about neuroscience.

Country: United States of America

The LED (Digital Experience Lab) at the Federal University of Ceará (UFC) Department of Architecture, Urbanism and Design (DAUD) aims to be a place for students and professors to explore new digital technologies and develop innovative solutions for real-world problems. The LED plans to use their Higher Ed XR Innovation Grant funding to outfit eight digital studios with hardware for running prototyping experiments in virtual environments. They’ll also acquire peripherals for interacting with AR and MR experiences, including projectors for SAR (spatial AR), Kinect sensors, and motion trackers, with the goal of exploring ways that XR technology can improve design education and project solutions.
Finally, their team will drive development of LEDed, a free platform for sharing educational content and experiences in XR within the department’s community and beyond.

Country: Brazil

NorQuest College is Alberta’s largest community college, serving more than 21,000 students annually. Housed within the Research Office at NorQuest College, Autism CanTech! (ACT!)’s vision is to remove barriers that hinder meaningful and sustainable employment within the digital economy for individuals on the autism spectrum. Through job-specific training in technical skills, employability skills, career coaching, and Work Integrated Learning (WIL) opportunities, ACT! works to fill industry gaps. ACT! also offers participants additional support through new assistive technology that allows users, career coaches, and supervisors to manage tasks, schedule work-related activities, and live chat. Funds from the Higher Ed XR Innovation Grant will support the development of a road map and team to adapt educator resources and XR courses for a neurodiverse audience.

Country: Canada

Ethọ́s Lab is a Black-led, nonprofit innovation academy for teens based in Vancouver, British Columbia, and accessible from anywhere in the world. To build toward a more inclusive future, Ethọ́s Lab takes a holistic, partnered, community-based approach to teaching S.T.E.A.M. with a long-term vision. The organization provides pathways to applied learning, mentorship, and access to emerging tech through weekly collaborative workshops, creative projects, and events. As participants, youth develop core skills for post-secondary admissions, future careers, and being the leaders of innovation.

Centre for Digital Media (CDM) was established in 2007 through a ground-breaking partnership between the University of British Columbia, Simon Fraser University, British Columbia Institute of Technology, and Emily Carr University of Art + Design.
Anchored by the flagship, multidisciplinary Master of Digital Media program, CDM is a mixed-use campus, home to Canada’s first Metastage studio as well as game studios and innovative startups in the healthcare and cloud-computing sectors. With the support of the Higher Ed XR Innovation Grant, Ethọ́s Lab and CDM aim to increase the representation of Black youth and girls in XR-based digital futures through development of an XR Media Lab program. The funding will enable the program to serve 300+ underrepresented youth and 190+ high school educators over five years.

Country: Canada

Universidad de los Andes was founded in 1948 as an autonomous and innovative institution pursuing pluralism, tolerance, and respect. It strives to raise consciousness about students’ social and civic responsibilities as well as their relationship to and stewardship of the environment. The XR Incubator Program (named “Vivero Virtual” in Spanish) is a two-year program focused on both workforce development and education innovation in XR. With Higher Ed XR Innovation Grant funding, Universidad de los Andes will launch three Massive Open Online Courses (MOOCs) in Spanish to promote learning throughout Ibero-America. Funds will also help implement a week-long XR Camp offering Colombian educators access to a variety of XR technologies, and an XR Mobile Lab that will allow those educators to show XR technologies to the public at their own institutions.

Country: Colombia

The University of Johannesburg (UJ) and the Swiss Distance University of Applied Sciences (FFHS) aspire to design an innovative, immersive tool that addresses challenges faced by teachers in underrepresented communities.
With Higher Ed XR Innovation Grant funding, their multinational team will develop and test a VR prototype for pre-service teachers (student teachers working towards their teacher certification) in South Africa. The tool will allow future teachers to have authentic teaching experiences in a safe environment, aided by learning analytics that provide opportunities to reflect on their lesson delivery and prepare them for actual teaching. The university also intends the tool to help mitigate language barriers for students whose first language is not English. Broadly, the project will empower pre-service teachers to be agents in transforming science teaching, leveraging the potential of immersive technologies and preparing students from marginalized communities with 21st-century digital skills.

Country: South Africa

On behalf of Unity Social Impact and Meta Immersive Learning, congratulations to all of our grant recipients, and thank you to everyone who applied for the Higher Ed XR Innovation Grant. Learn more about educator resources and tools for propelling real-time 3D in the classroom and Meta’s $150 million investment to transform the way we learn through Meta Immersive Learning.

>access_file_
1109|blog.unity.com

Games Focus: Profiling and performance optimization

This is the third blog in our Games Focus series, which highlights key product development initiatives for the year ahead and beyond. Here, we cover the status, upcoming release plans, and future vision for profiling and performance at Unity.

My name is Marika. I’ve worked in the video game industry for nearly a decade, and I’m currently the senior technical product manager for profiling tools and performance optimization at Unity.

When we think about performance at Unity, there are three main areas where we believe we can help:

Insight: Empowering you to dig into your game’s performance, identify bottlenecks, and pinpoint areas that would benefit from optimization strategies

Experience: Ensuring that creating your projects feels seamless by raising the performance of the Unity Editor and runtime

Innovation: Guiding you in new programming techniques and paradigms that have performance in mind from the ground up

Today’s post focuses on the first two areas, covering recent updates to our suite of profiling tools and how customer feedback is shaping our roadmap for 2023 and beyond. It ends with a recap of best practices for optimizing projects. We’ll dive into the third area – performance-focused programming techniques and workflows, with a focus on DOTS-based projects – in an upcoming post.

We like to think of the profiling features in Unity as detective tools that help you unravel the mysteries of why performance in your application is slow or glitchy, or why code is allocating excess memory. They help you understand what’s going on under the hood of the Unity game engine. Our goal is to raise the performance of Unity tools and runtime through profiling and optimization, helping you to deliver smooth performance for your players across a broad range of platforms and devices.

Memory Profiler is designed to make it easier to keep track of memory usage and composition.
I’m happy to share that Memory Profiler 1.0.0 is now a verified package for the Unity 2022.2 beta release and above (find instructions for download here). Thank you to everyone who shared their invaluable feedback on how to create a better workflow for this important feature.

Many of you contend with the challenge of working within the memory restrictions of each of your target platforms. The Memory Profiler helps solve this challenge by providing you with a clear overview of the memory impact of assets and objects in one view. It also shows you detailed contextual information on which objects and systems the memory relates to. You can dig deeper into the capture through breakdown views and compare memory snapshots to identify potential leaks and unnecessary allocations that negatively impact memory usage.

Refining the existing profiling toolset

In 2022.1, we added the Frame Timing Manager, which enables you to capture and access frame timing data across multiple frames. If performance is lagging, use this feature to assess frames and analyze why your application isn’t meeting performance targets. Learn more about this in our documentation.

If you’re looking to monitor low-level GPU metrics in the Unity Profiler, you can use the new System Metrics Mali package, which we released in 2022.1 through a partnership with Arm. This package allows you to access low-level system or hardware metrics on mobile devices. If you’re curious to learn more about how to ensure your content runs smoothly on mobile devices powered by Arm CPUs and Mali GPUs, this 2021 blog can guide you.

Performance optimization

The profiling tools highlighted so far are a great start to helping you identify areas where performance can be improved. On my team, however, “performance optimization” applies not only to your games’ runtime performance on their target devices, but also to how your team works – your productivity.
We’re aiming to provide you with faster iteration times, fewer interruptions, and greater efficiency in the Editor. In Unity 2021 LTS, importing your assets is three to four times faster, and opening imported projects is up to 8.7% faster compared to Unity 2020 LTS.

I’m excited to share some of the improvements that are available in the 2022 releases as well:

Improved material reimport for the Universal Render Pipeline and High Definition Render Pipeline

Editor workflow improvements, including:
Faster save times for large scenes
Reduced stall time in scene picking
Improved performance in the Scene view when there are many LOD Groups
An optimized animation rigging package
Better Hierarchy scrolling
Improved save workflow for large Prefabs
Improved iteration time when working inside the Editor through optimizations on domain reloads

Play mode improvements, including:
Improved static batching performance
Optimized process for how Addressables finds resource directories
Prewarming particle systems

Again, your feedback has played a vital role in many of these improvements. Please continue sharing your feedback on our future roadmap here, or contact the team on the forums. We’re particularly interested in performance-related issues, which we’re capturing here.

For over a decade, we’ve stayed focused on ensuring that you can achieve the best performance possible using our profiling tools, and we will continue refining the toolset. This development has taken many forms, including all of the functionality we’ve already mentioned here. Another area where we’ve worked to provide significant performance gains is the Data-Oriented Technology Stack, or DOTS. Two of the core features integral to the successful delivery of DOTS are the Burst compiler and the C# Job System. These were leveraged in our own internal engine performance work to great results, and they’re available to all today.
Coming soon, we’ll deliver on the third critical feature, Entities, which will turbo-boost project performance in areas like networking, physics, and more. This is such an important aspect of our commitment to game development that we will dedicate a standalone Games Focus article to DOTS in this series, coming soon.

For now, let’s take a look at a few improvements planned for upcoming releases. We’re working to cut time spent starting the Editor, to improve start-up time and help you stay in flow. Today, when you connect a target device to the Editor, you might experience instability such as disconnection or an inability to recover. We’re working on making Editor connections to mobile platforms more reliable and performant in the 2023.1 release.

Additionally, we want to make it more efficient for you to identify bottlenecks with the Profiler and to know what to do next once you’ve spotted them. Our goal is to quickly direct you to the areas of optimization that will yield the greatest performance gains. We’re also looking to add memory insights based on the device you’re building for, so you can get platform-specific performance gains. This is in the early stages, and we’re actively looking for your feedback on this new feature, which you can provide on our roadmap page.

Expertise with Unity’s suite of profiling tools is one of the most useful skills you can add to your game development toolbox.
That’s why we’re working on creating more advanced content about best practices to help you get the most out of our tools. Several of my teammates recently put together our most extensive guide to date about profiling in Unity, in partnership with expert engineers from the Unity Integrated Success team.

I also suggest you download these additional advanced e-books, which offer extensive platform-specific optimization best practices: Optimize your mobile game performance and Optimize your console and PC game performance. Another handy reference is this flowchart, which provides a recommended approach to identifying bottlenecks in your project. Finally, this Profiling and optimization reading list, created by our content and marketing teams, includes key blog posts that will help you understand profiling concepts and methods, from basic to advanced.

My team is working hard to bring you the solutions you need for your most ambitious projects, and we’re always eager to understand how we can help you better. Stay tuned for updates on our public roadmap page. This is also the best place for you to share feedback directly with the product team.

Watch the blog for our next Games Focus update, which will focus on what Unity is doing to help you target more platforms and form factors with your game content. And, as always, share your feedback with us on the forums.
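To give a concrete sense of what instrumenting code for the Unity Profiler looks like, here is a minimal sketch using the ProfilerMarker API from the Unity.Profiling namespace. The class and marker names (EnemyAI, "EnemyAI.Update") are placeholders for illustration, not part of the post above:

```csharp
using Unity.Profiling;
using UnityEngine;

// Minimal sketch: wrap a hot code path in a ProfilerMarker so it
// appears as a named sample in the Unity Profiler timeline.
public class EnemyAI : MonoBehaviour
{
    // Markers are cheap to keep around; declare them once, statically.
    static readonly ProfilerMarker s_UpdateMarker =
        new ProfilerMarker("EnemyAI.Update"); // placeholder sample name

    void Update()
    {
        using (s_UpdateMarker.Auto()) // Begin()/End() around the scoped block
        {
            // ...the AI logic you want to measure goes here...
        }
    }
}
```

With this in place, the sample shows up by name in the CPU Usage module, which makes it much easier to attribute frame time to your own systems rather than digging through anonymous call stacks.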

>access_file_
1110|blog.unity.com

4 things multiplayer gamers want in 2022 and beyond

Unity surveyed over 1,400 multiplayer gamers in the US, UK, Japan, and Korea to find out what multiplayer gamers want from their experiences, so you can get a head start on planning your next project.

Developing multiplayer games can require more up-front work to get running – like specialized expertise, ongoing service, and up-front capital and maintenance costs. In that sense, multiplayer game development is more of an investment than single-player game creation – so you need to design the player experience right in order to find success. But what’s most important to multiplayer gamers? We surveyed players across the globe to find out.

In each country (US, UK, Japan, Korea), we focused on two categories of multiplayer gamers: casual and core.

Casual gamers: Those who spend at least two hours gaming per week, of which at least 30 minutes is playing multiplayer

Core gamers: Those who spend at least four hours playing multiplayer games in any combination of the following traditionally multiplayer genres: shooter (battle royale, FPS, third-person shooter), MOBA, MMO, racing, sports, or fighting

We collected responses from approximately 1,400 gamers, split about 50/50 between the core and casual groups. Here’s what we found.

One thing we found is that demand for multiplayer games is massive, and people around the world are playing a lot of multiplayer titles without walled gardens.
More than half the global population (52%) play games, and of those gamers, 77% play multiplayer games. Additionally, crossplay is a powerful tool that is helping fuel deeper engagement with multiplayer gaming – those who spend the most time playing crossplay titles are also the ones spending the most time playing multiplayer games.

Features that make playing with others easier are significant factors when gamers are choosing a new multiplayer title to invest their time in, and they also impact enjoyment of the games they’re actively playing. Players want to easily connect with their friends through shared gaming experiences. When choosing a new multiplayer game, the most important in-game feature is the ability to chat with friends. When it comes to how they like to chat, 52% of respondents prefer creating chat parties with menus inside their games, 25% prefer a separate software solution, and only 15% prefer using a separate device. This indicates that investing in a smart in-game player comms solution is a good move for your multiplayer titles. Also ranking highly among must-have features for a multiplayer game is a short wait to join a match, with 29% of gamers ranking this in their top three.

Tip 💡 Invest in social features – like friends lists, parties, and in-game comms – for your multiplayer title to give players a reason to choose your game and build a community within it.

Most multiplayer gamers stay engaged with games post-launch and will spend on content to keep their experiences fresh: over half of multiplayer gamers (61%) reported purchasing some amount of multiplayer game DLC content in the last thirty days. When it comes to the breakdown between core and casual gamers, there are a few key differences in how they spend on extra content. Core gamers are almost twice as likely to spend 20 or more USD/GBP on DLC, while casual gamers are slightly more likely to spend in the sub-20 USD/GBP band.

Tip 💡 Keep fresh content and experiences rolling into your game updates to increase the longevity of your game and keep your players coming back for more. They’re willing to pay if you’re willing to provide.

Casual gamers aren’t relegated to specific genres like card and puzzle games. Aside from FPS, fighting, and sports – where more core gamers report having played a title within the last week – there’s a relatively even split of core and casual gamers enjoying games in every genre. In fact, on average, across all genres, there’s a less than 10 percentage point difference between core and casual gamers who have played a title within the genre in the last week. For example, 51% of core gamers played a role-playing game in the last week, versus 46% of casual gamers. As mentioned, FPS, fighting, and sports titles have the biggest difference in popularity between core and casual audiences, with a 23, 16, and 16 percentage point difference respectively. Card games have the most even split of casual versus core gamers (43% vs 44%), with racing, simulation, puzzle, and RPGs tied for second place.

Tip 💡 Don’t let your planned genre box you into thinking you’ll only reach a certain audience. Knowing that there are people out there for all kinds of games, across core and casual gamers, might just open up the opportunity for you to build your dream game for an audience you hadn’t yet considered.

Dive deeper into the full findings of our 2022 multiplayer report by downloading the PDF. Starting on a new multiplayer game project? Get everything you need to create your game, connect your players, and empower communication from our Multiplayer suite of products. Unity is looking to equip studios of all sizes with both the knowledge and tools needed to deliver on these experiences for years to come. Get involved with us and the community by heading over to our Unity Gaming Services forum.

>access_file_
1111|blog.unity.com

5 common lightmapping problems and tips to help you fix them

I recently developed a guide to Progressive Lightmapper troubleshooting in order to help developers get the most out of Baked Global Illumination (GI) in the Unity Editor. Here, I unpack five of the most common lightmapping problems and their solutions, supported by images and links to pages in the Unity Manual. For the full guide, visit the forums.

If certain prerequisites are not met, the Progressive Lightmapper might fail to generate lighting in your scene. These conditions include, but are not limited to:

- No objects marked as GI Contributors
- No baked lights in the scene
- Shader issues

To correct this, I recommend trying one of the fixes laid out below.

Mark objects you want to lightmap as GI Contributors by following these steps:

1. Select your GameObject.
2. Navigate to the Mesh Renderer component.
3. Unfold the Lighting header.
4. Check the Contribute to Global Illumination checkbox.

Doing so will enable the Receive Global Illumination parameter underneath. It contains two options:

- Lightmaps: Meant for static lightmapped objects – the GameObject will receive and contribute GI to lightmaps.
- Light Probes: Meant for small props and objects not fit for lightmapping – the GameObject will receive GI from Light Probes and will contribute GI to the surrounding lightmaps.

Only Mixed and Baked lights can contribute to Baked GI. Select the lights in your scene, then set the Mode to either Mixed or Baked in the Light component. Other properties worth checking include:

- Color: Dark colors will have low or no GI contribution. Choose bright colors for lights and use the Intensity property to boost or dim them.
- Intensity: The higher the intensity, the brighter the light. Ensure that your lights are bright enough for meaningful GI contribution.
- Indirect Multiplier: This property controls the intensity of the indirect bounce. Make sure that it is not set to zero; otherwise, the light will have no contribution to GI at all.
Note that setting this value above one will make the lighting in your scene non-compliant with the physically based rendering (PBR) standard.

Visit the No Baked Global Illumination in the scene forum.

In the Lighting window (accessible via Window > Rendering > Lighting), make sure that the Lighting Settings Asset field is not left blank. If there is no asset assigned, click on the New Lighting Settings button. This will create and assign an asset, unlocking the properties in the window for editing. Once this is complete, verify that:

- You have ticked the Baked Global Illumination checkbox, which enables Baked GI computations. This checkbox will also expose the Lighting Mode drop-down list.
- The Max Bounces value is not set to zero. The higher this value, the more the light will bounce around the environment.
- The Indirect Intensity slider is not set to zero. Setting this slider to zero will diminish all indirect lighting in the scene.

Custom shaders could be the reason why GI computation has failed. For debugging purposes, use the built-in shaders that come with the Unity Editor:

- Standard Shader: Available in the Built-in Render Pipeline
- Lit Shader: Available in the Universal Render Pipeline (URP)
- Lit Shader: Available in the High Definition Render Pipeline (HDRP)

If Unity generates lighting after switching to one of the shaders outlined above, the problem might be with the custom shaders. In this case, make sure that the surface shaders contain the LIGHTMAP_ON shader keyword. Check out the Meta Pass page for details on how to further customize the Baked GI output using shaders.

Other potential fixes

If the above steps have not solved your problem, consider trying these potential fixes:

- Select a different lightmapping backend in the Lighting window. If lighting fails to bake when using the Progressive GPU backend, but succeeds when baking with the Progressive CPU backend, this might be the result of a hardware or driver problem.
- Update the GPU drivers.
Please refer to the GPU manufacturer’s page for the correct drivers for your system. (For Linux machines, check the Linux driver setup section in this forum thread.)

- Ensure that your GPU meets the minimum requirements. Please refer to this forum thread.
- Clear the GI Cache. To clear it, navigate to Preferences > GI Cache and click on the Clean Cache button. Keep in mind that this will delete all lighting data present in the scene, requiring you to regenerate lighting.

Certain objects appearing unlit or out of place might indicate a problem with the scene setup, which often occurs when dynamic objects have no Light Probes to sample lighting from. Furthermore, any glossy metallic material in the scene might appear black if no local Reflection Probes are present. To correct this, I recommend trying one of the fixes laid out below.

Dynamic objects – that is, GI Contributors receiving GI from Light Probes – need Light Probes to sample indirect lighting data. If none are present, objects will fall back to sampling the Ambient Probe (i.e., the Light and Reflection Probe that is always present in the scene). To mitigate this, set up a Light Probe network in the scene, adding more probes in areas of high importance. Make sure that there are enough Light Probes to encompass all affected objects, and generate lighting again to see the effect.

Reflective metallic objects might still render as black, even after placing a dense network of Light Probes. To shade such objects, you need to place a Reflection Probe that encompasses the affected object. Generate the lighting again, or re-bake the probe in the Reflection Probe component by clicking the Bake button. If you observe black areas in the reflections, try increasing the Bounces count. This will increase the number of bounces, thus creating reflections within reflections.
You can access this property in Lighting > Environment > Environment Lighting.

Visit the Objects are missing lighting forum.

If performing the previous steps still does not solve the issue, inspect the Mesh Renderer component of the affected object. Under the Probes section, make sure that the Light Probes and Reflection Probes properties are set to anything other than Off.

Pure black materials will absorb all direct and indirect light. This is physically correct behavior: in real life, no naturally occurring material is completely black. For example, one of the darkest natural materials, coal, measures around “50, 50, 50” on an RGB luminosity scale. Adjust your material color values to follow physically based shading standards. In the Built-in Render Pipeline, you can use the Validate Albedo Scene View Draw Mode to determine whether Albedo values are PBR-compliant. You can use the Rendering Debugger in URP and HDRP to do the same.

If you are working with multiple scenes, check that the scene containing lighting is set as the Active Scene. By default, Unity sets the first loaded scene as the Active Scene, which might have a detrimental effect in standalone player builds.

There are two types of issues related to emissive material rendering:

- Emissive materials do not appear as “glowing,” which indicates a post-processing issue.
- Emissive materials are not contributing to Global Illumination, which indicates an issue with object or material setup.

To correct either of these issues, I recommend trying one of the fixes laid out below.

To create the impression of a glowing material, enable Bloom in your post-processing stack of choice. Refer to the Built-in RP, URP, or HDRP documentation for tips on how to do this. If you intend to use emissive objects for lightmapping, make sure that:

- You have marked the GameObject in question as a GI Contributor. Due to the self-illuminating nature of emissive objects, you can set their Receive Global Illumination property to Light Probes.
This will save space in the lightmap atlas.

- The Global Illumination property is set to Baked in the Material Inspector. This property is available under the Emission input. Refer to the Built-in RP, URP, or HDRP documentation for more details.
- In the Lighting window, the Indirect Intensity property is not set to zero. Setting it to zero will disable all indirect lighting, including the contribution from baked emissive objects.

Visit the Emissive materials not rendering forum.

When baking in Non-Directional mode, the Unity Editor will not create a separate texture to hold directionality information. This will result in objects looking flat after baking. It is worth noting that low-frequency normal maps are hard to capture using directionality textures. Such textures will appear flat when generating lighting using fully baked lights. To correct this, I recommend trying one of the fixes laid out below.

In the Lighting window, set the Directional Mode property to Directional. This mode generates a secondary texture that stores the dominant light direction. Normal maps will have a good representation of relief, but will lack specular response.

Visit the Flat normal maps forum.

Mixed lights provide real-time specular and normal response, while the Progressive Lightmapper bakes indirect lighting into a lightmap. This combination ensures the highest-quality material response when using baked lighting. If your project allows for it, switch the light Mode to Mixed in the Light component. Note that Mixed lights have the same performance cost as real-time lights. Depending on the Lighting Mode used, Mixed lights will cast real-time shadows but not baked soft shadows.

Probe-lit GameObjects will often have a better material response than those lit by Baked lights. If your art direction allows for it, set their Receive Global Illumination property to Light Probes in the Mesh Renderer component.
Note that you can also use a Light Probe Proxy Volume (LPPV) to add a spatial gradient to probe-lit objects.

One of the inherent limitations of Baked lights is that they do not provide real-time specular response to materials. This means that glossy materials will lack specular highlights after generating lighting. To correct this, I recommend trying one of the fixes laid out below.

Unlike Baked lights, Mixed lights provide real-time direct specular response to materials. If specular highlights are important in your scene, switch the light Mode to Mixed in the Light component.

Visit the Missing specular response forum.

It is possible to imitate specular response from lights by using emissive objects. To do so, follow these steps:

1. Place a Reflection Probe in your scene.
2. Right-click in the Hierarchy panel and select 3D Object > Sphere. Select the newly created object and set its Static Editor Flag to Reflection Probe Static.
3. In the Project panel, create a new material by right-clicking and selecting Create > Material.
4. Select the newly created material and enable the Emission checkbox. Set the Global Illumination property to None.
5. Drag and drop the material onto the sphere to assign it.
6. Place the sphere in the same position as your light.
7. Generate lighting.

After following the above steps, you should be able to see the emissive objects captured in the Reflection Probe cubemap. You can hide those objects after baking or set up a Culling Mask in the Camera component.

For more tips on troubleshooting the Progressive Lightmapper, check out the full guide in the forums. If you’d like to discuss this article or share other solutions, feel free to connect with me there or here. Finally, be sure to watch for new technical blogs from other Unity developers as part of the ongoing Tech from the Trenches series.
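The manual emissive-sphere workaround described in this post can also be scripted. The following is a minimal editor sketch, not part of the official guide: the menu path, class name, and the emission-color heuristic (light color times intensity) are my own assumptions, and it assumes the Built-in Render Pipeline’s Standard shader.

```csharp
using UnityEditor;
using UnityEngine;

// Hypothetical helper: automates the emissive-proxy steps for a selected Light.
public static class FakeSpecularSetup
{
    [MenuItem("Tools/Create Emissive Proxy For Selected Light")]
    static void CreateEmissiveProxy()
    {
        var selected = Selection.activeGameObject;
        var light = selected != null ? selected.GetComponent<Light>() : null;
        if (light == null)
        {
            Debug.LogWarning("Select a GameObject with a Light component first.");
            return;
        }

        // Step 1: a Reflection Probe to capture the proxy.
        var probeGo = new GameObject("FakeSpecularProbe");
        probeGo.AddComponent<ReflectionProbe>();
        probeGo.transform.position = light.transform.position;

        // Steps 2 and 6: a sphere flagged Reflection Probe Static, placed at the light.
        var sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        GameObjectUtility.SetStaticEditorFlags(sphere, StaticEditorFlags.ReflectionProbeStatic);
        sphere.transform.position = light.transform.position;

        // Steps 3 to 5: an emissive material whose GI contribution is set to None.
        var mat = new Material(Shader.Find("Standard"));
        mat.EnableKeyword("_EMISSION");
        mat.globalIlluminationFlags = MaterialGlobalIlluminationFlags.None;
        mat.SetColor("_EmissionColor", light.color * light.intensity);
        sphere.GetComponent<MeshRenderer>().sharedMaterial = mat;

        // Step 7 is left to you: generate lighting from the Lighting window.
    }
}
```

Place the script in an Editor folder; it only creates the objects, so you can still tweak the probe bounds and emission color before baking.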

>access_file_
1112|blog.unity.com

Get your game on at Unite 2022

Are you ready? Unite is back and free for everyone this fall – and that means you. It’s all happening on Tuesday, November 1. Whether you join in person or online, the event will be a chance to celebrate accomplishments alongside fellow game developers and learn from the Unity community. Read on for more information about the event (and to snag your spot by registering).

No matter where you’re located, this year’s Unite has an option for you to participate. Join the event in person – in Austin, Brighton, Copenhagen, Montreal, or San Francisco – or virtually from anywhere in the world. As an attendee, you’ll have access to exciting education and a full schedule of unique experiences.

By registering for Unite 2022, you’ll take your Unity network and knowledge to the next level. In addition to incredible relationship-building opportunities, some top reasons to participate include:

- 20+ expert-led, outcome-based breakout sessions from developers – Discover technical deep dives on new features, product roadmaps, performance tips, code structures, advice from studios for better workflows, and more.
- A keynote packed with product updates and creator stories – Join Unity leaders and respected developers to hear inspiring stories, watch exciting tech demos, and find out how Unity is evolving to meet your needs.
- A unique virtual concert, made with Unity – The excitement amps up with a top-notch, live musical performance in an interactive world ready for you to explore. Come take a look around, mingle, and join the party – you won’t want to miss it.

View the full agenda online. We can’t wait to connect with our community both in person and virtually at Unite 2022. If you want to talk to us in the meantime, we’re always available in the forums, or you can join the conversation on Twitter and Instagram. Plus, don’t forget to register for the Unity for Humanity Summit on November 2.

>access_file_
1113|blog.unity.com

Programmatic video advertising: the complete guide

Programmatic video advertising is a complex, multi-layered topic. Multiple networks serving different audiences – advertisers on the demand side and publishers on the supply side – work together with third-party mediators to serve video ads to billions of end users in the blink of an eye. It’s also a multi-billion dollar industry under constant, rapid evolution, and understanding it is key to reaching your audience and maximizing ROAS. Wherever you sit on the demand or supply side of the mobile ad stack, you should be aware of how the programmatic ad serving process works and where things are headed.

Let’s dive into programmatic video advertising: we’ll go over what it is, identify several key trends for the future, and break down how to choose a programmatic video platform that’s right for you. In this post, we’ll cover the following:

- What is programmatic video advertising?
- A timeline of growth for programmatic video advertising
- Programmatic video trends to watch
- How to choose a programmatic video advertising platform
- Make programmatic video easy

What is programmatic video advertising?

For years, direct media buying – where advertisers made direct deals with publishers to determine exactly where their ads would run – ruled the landscape. And while direct media buying still works for static media, the interactivity introduced by the internet and smartphones allows users to be catered to based on their individual interests. That’s where programmatic video advertising comes in.

But what’s a good programmatic video definition? Essentially, as ads are served to users in mobile games, connected TV experiences like Roku, or social media platforms, they are bid on in real time by ad networks. The value of the bids is determined by characteristics like user demographics, what game they’re playing or app they’re using, and the bid ceiling set by the advertiser.
The winner of the bid has their video served to the user. What makes programmatic video advertising so mind-blowing is that all of these bids happen automatically in the fraction of a second between the moment an ad is called (like when a user taps a button to double their reward in a game) and when it’s served. Market forces determine the cost of an ad, and data like user acquisition and ROAS can be monitored in real time. And while the whole market cycle can seem intimidating and highly technical, platforms like the ironSource Exchange make it easier than ever to integrate and display ads.

To learn more about the benefits of programmatic video advertising and the channels best suited for it, read our guide, What Is Programmatic Video Advertising?

A timeline of growth for programmatic video advertising

The programmatic video ad market has seen an explosion of growth over the last few years, and it shows no signs of slowing down. In 2019, surveys showed that programmatic video ad spending grew to more than $24B. Only two years later, that number shot up to $52B according to a 2021 report by eMarketer, with ad spend potentially reaching $62.96B in 2022. Mobile ad spend accounts for two-thirds of all programmatic video spending, with connected TV following behind and steadily growing. And that’s just in the United States.
Global programmatic video ad spending reached $112.9B in 2019, eventually hitting $155B in 2021 and accounting for 34% of all global programmatic ad spend. As smart device and mobile game adoption grows, so too will programmatic video, to meet the demand. Read more about programmatic video growth and learn about the top programmatic video advertising players in Your Guide to the 2022 Programmatic Video Advertising Market.

Programmatic video trends to watch

Mobile advertising technology moves fast, and it’s important to have a sense of where things are headed so you can stay a step ahead of the competition. In terms of ad content, the rise of TikTok as a social media platform has caused a shift in the kinds of ads companies are creating for the programmatic video marketplace. Ads themselves are starting to look like TikToks, often being repurposed from TikTok itself to suit the more specific needs of mobile games and other apps. Videos are shorter, filled with dances or direct conversations, and often embrace subtitles or other text alternatives, like written signs, to present information.

Privacy moves by Google and Apple have made waves throughout the industry, as the removal of third-party cookies and updated app-tracking policies force mobile ad companies to get creative with data solutions. Advertisers are looking to first-party data to replace the information third-party cookies would normally provide, and are implementing solutions like rewarded ads and the offerwall to paint a better picture of user data.
It’s also leading to a wider move toward consolidation, as agencies look across the demand- and supply-side stack for opportunities to shore up weaknesses and strengthen their offerings.

How to choose a programmatic video advertising platform

Programmatic video advertising platforms integrate with the demand side and supply side of the mobile advertising stack, and determining which platform is best for you will depend on which side of the stack you’re on. Advertisers looking for the best way to increase their return on ad spend (ROAS) will want to find a platform that offers the following:

- A large audience of engaged users
- Full-screen video offerings that won’t go unnoticed
- Easy testing for a variety of creative and placement options
- SDK integration that provides access to thousands of connected apps

If you’re an app publisher looking to maximize user engagement and revenue, you’ll want to find a platform that can:

- Connect your app with a large network of advertisers
- Provide rewarded videos and offerwalls
- Easily integrate via industry-standard SDKs
- Provide support for programmatic mediation so you can ensure impartiality on ad bids

Check out our guide, How to Choose a Programmatic Video Advertising Platform: 8 Considerations, for more details on how to pick a platform that suits your needs.

Make programmatic video easy

There are a lot of moving parts in the programmatic video advertising world, and it can seem like a lot to take in. You don’t have to sweat the hard stuff, though. The ironSource Programmatic Marketplace makes putting your ads in front of billions of engaged users in brand-safe environments, with complete transparency, easier than ever. Contact us today to learn more.
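To build intuition for the real-time bidding flow described earlier, here is a deliberately simplified sketch: a first-price auction where the highest bid that respects its advertiser’s ceiling wins. All names (Bid, RunAuction) are invented for illustration; real exchanges use far richer signals than this.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Bid
{
    public string Network = "";
    public decimal Amount;      // what the network offers for this impression
    public decimal BidCeiling;  // the maximum the advertiser is willing to pay
}

public static class AdExchange
{
    // First-price auction: highest bid within its own advertiser ceiling wins.
    public static Bid RunAuction(IEnumerable<Bid> bids)
    {
        return bids.Where(b => b.Amount <= b.BidCeiling)
                   .OrderByDescending(b => b.Amount)
                   .FirstOrDefault();
    }

    public static void Main()
    {
        var winner = RunAuction(new[]
        {
            new Bid { Network = "NetA", Amount = 0.04m, BidCeiling = 0.05m },
            new Bid { Network = "NetB", Amount = 0.07m, BidCeiling = 0.06m }, // exceeds ceiling
            new Bid { Network = "NetC", Amount = 0.05m, BidCeiling = 0.10m },
        });
        Console.WriteLine(winner != null ? winner.Network : "no fill"); // highest valid bid wins
    }
}
```

In production this whole exchange runs server-side in the fraction of a second between the ad call and the impression; the sketch only captures the selection rule.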

>access_file_
1114|blog.unity.com

Unity Pro and Unity Enterprise plans: New pricing coming soon

Starting October 13, we’re changing our pricing for Unity Pro, Unity Enterprise, and the Unity Industrial Collection – our first price increase in almost three years. This change does not affect Unity Personal or Unity Plus pricing.

Our per-seat subscription pricing will be adjusted as follows:

- Unity Pro annual prepaid pricing will be $2,040/yr, and annual monthly-paid plans will be $185/mo.
- Unity Enterprise annual prepaid pricing will be $3,000/yr.
- Unity Industrial Collection (UIC) annual prepaid pricing will be $2,950/yr.

The new pricing goes into effect on October 13, 2022 for Unity Pro, Unity Enterprise, and UIC. Online customers can visit their account before this date to renew or switch to an annual plan for greater savings. Customers with account managers should reach out to them directly to discuss options based on their custom agreements. Find more details in the FAQs below. If you have further questions, contact Unity support or talk to your account manager.

Who is affected by the new pricing?

The new pricing applies to both new and existing Unity Pro, Unity Enterprise, and Unity Industrial Collection (UIC) customers on monthly or annual plans. Unity Personal and Unity Plus customers are not affected.

When does the new pricing take effect?

For new Unity Pro, Unity Enterprise, and UIC customers, the new pricing is effective on October 13, 2022, 13:00 UTC. For existing customers, the new pricing also goes into effect on October 13, 2022. Online customers can visit their account before this date to renew or switch to an annual plan for greater savings.
Customers with account managers should reach out to them directly to discuss options based on their custom agreements.

Which plans are included in the price change?

Prices are changing for Unity Pro, Unity Enterprise, UIC, and bundles containing these plans, as well as Starter Success, Build License, Pro Build Server, and Enterprise Build Server. Unity Personal and Unity Plus plans are not included in the price change. Each seat of Unity Pro will be $2,040/year, Unity Enterprise will be $3,000/year, and UIC will be $2,950/year for annual prepaid plans. We offer other options, such as monthly payment, for certain plans.

Per-seat pricing:

Unity Plan                   | Current List Price | New Price on October 13, 2022
Unity Pro, prepaid yearly    | $1,800/year        | $2,040/year
Unity Pro, paid monthly      | $150/month         | $185/month
Unity Enterprise             | $2,400/year        | $3,000/year
Unity Industrial Collection  | $2,520/year        | $2,950/year

For more details, please see the extended FAQ.

What other changes are being made to the plans?

The changes below will take effect starting on October 13, 2022.
For more details, please see the extended FAQ.

- Unity Pro – New and existing plans will include Unity Mars, as well as Havok Physics for Unity with the release of Unity 2022.2 this fall.
- Unity Enterprise – New and existing plans will receive an additional year of Long Term Support (from two years to three years) starting with 2021 LTS. New and existing customers, or those upgrading from Unity Pro to Unity Enterprise, will receive access to read-only source code on request. Enterprise plans will also include new support offerings: 1–19 seat accounts receive Starter Success, a technical support package to help you overcome issues with assistance from Unity engineers; 20+ seat accounts get a Partner Relations Manager, an internal advocate and strategic advisor to accelerate projects; and 100+ seat customers will receive Bug Handling and LTS backporting at no additional charge. This plan will also include Unity Mars, as well as Havok Physics for Unity with the release of Unity 2022.2 this fall.
- Unity Industrial Collection – Starter Success is an entry-level technical support package that helps you overcome issues with assistance from Unity engineers. New customers, or those upgrading from Unity Pro to UIC, will receive access to Starter Success. Existing UIC customers with a contract expiry date after October 13, 2022 will receive Starter Success upon renewal. This plan will also include Unity Mars, as well as Havok Physics for Unity with the release of Unity 2022.2 this fall.

We are updating our Terms of Service for all Unity subscription plans, effective October 13, 2022, to create a more streamlined, user-friendly set of terms. Please review them here.

Why is the price of my Unity plan changing?

The new price reflects the value of our products today, and it’s our first increase in almost three years. In that time, we’ve expanded our R&D resources by 172%, with the Unity Editor being our largest focus for R&D investment at Unity.
This continued investment has helped us deliver Unity 2021 LTS, with powerful improvements to workflows, rendering capabilities, and supported platforms. We will continue delivering improvements based on your feedback in every release to enhance your productivity, performance, and stability.

What is happening to the free Unity Personal plan?

Unity Personal will remain free to creators with revenue or funding (raised or self-funded) below USD $100K in the past year. We are committed to ensuring Unity continues to be accessible to the millions of students, hobbyists, and indies starting out on their game development journeys.

I still have more questions!

Don’t see your question here? Get more details on Unity Pro, Unity Enterprise, Unity Industrial Collection, Unity Mars, Havok Physics for Unity, and how this change will impact your specific payment plan in our extended FAQ.

>access_file_
1116|blog.unity.com

Clean up your code: How to create your own C# code style

While there’s more than one way to format Unity C# code, agreeing on a consistent code style for your project enables your team to develop a clean, readable, and scalable codebase. In this blog, we provide some guidelines and examples you can use to develop and maintain your own code style guide. Please note that these are only recommendations, based on those provided by Microsoft. This is your chance to get inspired and decide what works best for your team.

Ideally, a Unity project should feel like it’s been developed by a single author, no matter how many developers actually work on it. A style guide can help unify your approach for creating a more cohesive codebase. It’s a good idea to follow industry standards wherever possible and browse through existing style guides as a starting point for creating your own. In partnership with internal and external Unity experts, we released a new e-book for inspiration, Create a C# style guide: Write cleaner code that scales, based on Microsoft’s comprehensive C# conventions. The Google C# style guide is another great resource for defining guidelines around naming, formatting, and commenting conventions. Again, there is no right or wrong method, but we chose to follow Microsoft standards for our own guide.

Our e-book, along with an example C# file, is available for free. Both resources focus on the most common coding conventions you’ll encounter while developing in Unity. These are all, essentially, a subset of the Microsoft Framework Design Guidelines, which include an extensive number of best practices beyond what we cover in this post. We recommend customizing the guidelines provided in our style guide to suit your team’s preferences; those preferences should take priority over our suggestions and the Microsoft Framework Design Guidelines if they’re in conflict.

The development of a style guide requires an upfront investment, but it will pay dividends later.
For example, managing a single set of standards can reduce the time developers spend ramping up if they move onto another project. Of course, consistency is key. If you follow these suggestions and need to modify your style guide in the future, a few find-and-replace operations can quickly migrate your codebase.

Concentrate on creating a pragmatic style guide that fits your needs by covering the majority of day-to-day use cases. Don’t overengineer it by attempting to account for every single edge case from the start. The guide will evolve organically over time as your team iterates on it from project to project. Most style guides include basic formatting rules, while specific naming conventions, policy on the use of namespaces, and strategies for classes are somewhat abstract areas that can be refined over time. Let’s look at some common formatting and naming conventions you might consider for your style guide.

The two common indentation styles in C# are the Allman style, which places the opening curly brace on a new line (also known as the BSD style, from BSD Unix), and the K&R style, or “one true brace style,” which keeps the opening brace on the same line as the previous header. In an effort to improve readability, we picked the Allman style for our guide, based on the Microsoft Framework Design Guidelines. Whatever style you choose, ensure that every programmer on your team follows it.

A guide should also indicate whether braces should be included in nested multiline statements. While removing the braces in such cases won’t throw an error, it can be confusing to read. That’s why our guide recommends applying braces for clarity, even if they are optional.

Something as simple as horizontal spacing can enhance your code’s appearance onscreen.
While your personal formatting preferences can vary, here are a few recommendations from our style guide to improve overall readability:

- Add spaces to decrease code density: The extra whitespace can give a sense of visual separation between the parts of a line.
- Use a single space after a comma, between function arguments.
- Don’t add a space between the parenthesis and function arguments.
- Don’t use spaces between a function name and parenthesis.
- Avoid spaces inside brackets.
- Use a single space before flow control conditions: add a space between the flow control keyword and the opening parenthesis.
- Use a single space before and after comparison operators.

Variables typically represent a state, so try to use clear and descriptive nouns for their names. For variables that indicate a true or false value, prefix them with a verb to clarify their meaning: they are often the answer to a question such as, is the player running? Is the game over? Pair the verb with a description or condition, e.g., isPlayerDead, isWalking, hasDamageMultiplier, etc.

Since methods perform actions, a good rule of thumb is to start their names with a verb and add context as needed, e.g., GetDirection, FindTarget, and so on, based on the return type. If the method has a bool return type, it can also be framed as a question: much like boolean variables themselves, prefix such methods with a verb so they read as a question, e.g., IsGameOver, HasStartedTurn.

Several conventions exist for naming events and event handlers. In our style guide, we name the event with a verb phrase, similar to a method, choosing a name that communicates the state change accurately. Use the present or past participle to indicate events “before” or “after.” For instance, specify OpeningDoor for an event before opening a door and DoorOpened for an event afterward.

We also recommend that you don’t abbreviate names.
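The spacing rules above can be condensed into one small example (the class and values are illustrative, not from the guide itself):

```csharp
public class SpacingExamples
{
    // Single space after the comma between arguments;
    // no spaces just inside the parentheses.
    public int Add(int left, int right)
    {
        // Spaces around arithmetic and assignment operators.
        int sum = left + right;

        // Single space between the flow control keyword and the condition;
        // spaces before and after the comparison operator.
        if (sum >= 100)
        {
            sum = 100;
        }

        // No space between a function name and its parenthesis at call
        // sites, and no spaces inside the brackets of an array access.
        int[] clamped = { sum };
        return clamped[0];
    }
}
```

None of this changes what the compiler sees; it only keeps lines visually consistent across the codebase.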
While saving a few characters can feel like a productivity gain in the short term, what is obvious to you now might not be in a year’s time to another teammate. Your variable names should reveal their intent and be easy to pronounce. Single-letter variables are fine for loops and math expressions, but otherwise you should avoid abbreviations. Clarity is more important than any time saved by omitting a few vowels.

At the same time, use one variable declaration per line; it’s less compact, but also less error-prone, and it enhances readability.

Avoid redundant names. If your class is called Player, you don’t need to create member variables called PlayerScore or PlayerTarget. Trim them down to Score or Target. In addition, avoid too many prefixes or special encoding. A practice highlighted in our guide is to prefix private member variables with an underscore (_) to differentiate them from local variables. Some style guides go further and use prefixes for private member variables (m_), constants (k_), or static variables (s_), so the name reveals more about the variable.

However, it’s good practice to prefix interface names with a capital “I” and follow this with an adjective that describes the functionality. You can also prefix the event-raising method with “On”: the subject that invokes the event usually does so from a method prefixed with “On,” e.g., OnOpeningDoor or OnDoorOpened.

Camel case and Pascal case are the common standards in use, as opposed to snake case, kebab case, or Hungarian notation. Our guide recommends Pascal case for public fields, enums, classes, and methods, and camel case for private variables, as this is common practice in Unity.

There are many additional rules to consider outside of what’s covered here. The example guide and our new e-book, Create a C# style guide: Write cleaner code that scales, provide many more tips for better organization.

The concept of clean code aims to make development more scalable by conforming to a set of production standards.
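Pulling these naming conventions together, here is a hedged sketch: the Door class and its members are hypothetical, chosen only to show the rules in one place (it reuses the OpeningDoor/DoorOpened pattern mentioned above):

```csharp
using System;

public class Door                       // Pascal case for classes
{
    bool _isOpen;                       // private member: underscore prefix,
                                        // boolean reads as a question

    public event Action OpeningDoor;    // "before" event: present participle
    public event Action DoorOpened;     // "after" event: past participle

    public bool IsOpen => _isOpen;      // bool property framed as a question

    public void Open()                  // methods start with a verb
    {
        OnOpeningDoor();
        _isOpen = true;
        OnDoorOpened();
    }

    // The subject raises its own events from methods prefixed with "On".
    void OnOpeningDoor() => OpeningDoor?.Invoke();
    void OnDoorOpened() => DoorOpened?.Invoke();
}
```

A subscriber would hook up in the usual way, e.g. `door.DoorOpened += PlayCreakSound;`, where the handler name describes what the listener does.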
A style guide should remove most of the guesswork developers would otherwise have regarding the conventions they should follow. Ultimately, this guide should help your team establish a consensus around your codebase and grow your project into a commercial-scale production.

Just how comprehensive your style guide should be depends on your situation. It’s up to your team to decide if they want their guide to set rules for more abstract, intangible concepts. This could include rules for using namespaces, breaking down classes, or implementing (or avoiding) directives like #region. While #region can help you collapse and hide sections of code in C# files, making large files more manageable, it’s also an example of something that many developers consider a code smell or anti-pattern. You might therefore want to avoid setting strict standards for these aspects of code styling. Not everything needs to be outlined in the guide – sometimes it’s enough to simply discuss and make decisions as a team.

When we talked to the experts who helped create our guide, their main piece of advice was to prioritize code readability above all else. Here are some pointers on how to achieve that:

- Use fewer arguments: Arguments can increase the complexity of your method. By reducing their number, you make methods easier to read and test.
- Avoid excessive overloading: You can generate an endless permutation of method overloads. Select the few that reflect how you’ll call the method, and then implement those. If you do overload a method, prevent confusion by making sure that each method signature has a distinct number of arguments.
- Avoid side effects: A method only needs to do what its name advertises. Avoid modifying anything outside of its scope. Pass in arguments by value instead of by reference when possible. When sending back results via the out or ref keyword, verify that’s the one thing you intend the method to accomplish.
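As one illustration of the side-effect pointer, compare a method that silently mutates state it doesn’t advertise with a pure version plus an explicitly named mutator (ScoreKeeper and its members are hypothetical names, not from the guide):

```csharp
public class ScoreKeeper
{
    public int Score { get; private set; }

    // Side effect: the name promises a calculation, but the method
    // also silently writes to Score. Callers can't predict the change.
    public int CalculateBonusWithSideEffect(int kills)
    {
        int bonus = kills * 10;
        Score += bonus;      // hidden write: more than the name advertises
        return bonus;
    }

    // No side effect: same computation, but the caller decides
    // what to do with the result.
    public int CalculateBonus(int kills)
    {
        return kills * 10;
    }

    // When the state change is wanted, it gets its own clearly named method.
    public void AddToScore(int points)
    {
        Score += points;
    }
}
```

With the second pair of methods, reading a call site is enough to know whether the score changes; with the first, you have to open the implementation.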
Though side effects are useful for certain tasks, they can lead to unintended consequences. Write a method without side effects to cut down on unexpected behavior.

We hope that this blog helps you kick off the development of your own style guide. Learn more from our example C# file and brand-new e-book, where you can review our suggested rules and customize them to your team’s preferences. The specifics of individual rules are less important than having everyone agree to follow them consistently. When in doubt, rely on your team’s own evolving guide to settle any style disagreements. After all, this is a group effort.

>access_file_
1117|blog.unity.com

Made with Unity Monthly: August 2022 roundup

With so many interesting and varied projects being shared with us at any given moment, whether via direct tags or the #MadeWithUnity and #UnityTips hashtags, it can be hard to keep up with all that’s happening within our community. Don’t worry, we’ve got you covered. Sit back, relax, and enjoy this roundup of highlights that showcase everything you need to know about what the community got up to last month.

Each Monday, we celebrate a milestone hit by one of our creators. From new game launches to awards won, we are constantly in awe of your achievements! Throughout August we celebrated the launch of Lost in Play by Happy Juice Games, the launch of As Dusk Falls by INTERIOR/NIGHT, and Dot’s Home by Rise-Home Stories Project winning big at the Games for Change Awards!

Tuesdays are dedicated to #UnityTips on Twitter, and last month Massive Monster, makers of Cult of the Lamb, took over @UnityGames to share tips on how to make your 2D Sprites work harder (1, 2, 3, 4, 5). Some other great tips that were shared include Ehsan Ehrari’s quick and easy way to neaten up your workflow if your layers end up getting messy, and Sunny Valley Studio’s tip on how to fix holes in your shadows. We share tips on our Twitter channels every Tuesday, so keep tagging us and using the #UnityTips hashtag!

We love seeing your works in progress, and each Friday we share the projects that have us searching for a playable demo. August was no exception! Rollerdrome by Roll 7 had us mesmerized with a peek under the hood at the prototype-to-final-game transition, Grimm Tales made a pool of water look so refreshing that all we could think about was going for a swim, and François Martineau sent our imagination racing about what other great foes we’d encounter on our hero’s journey.

Each time we browse through the #MadeWithUnity hashtag across social media, we’re always amazed at the creativity and love you bring to your projects.
Keep sharing!

Last month, the Unity YouTube channel saw a flurry of activity with two new entries in our Meet the Creator series and a sizzle reel for gamescom 2022. Want more? We have you covered with two amazing YouTube playlists you need to keep your eye on:

- We love #MadeWithUnity
- Made with Unity Game Trailers

Last month on Twitch we:

- Sat down with INTERIOR/NIGHT, the team behind As Dusk Falls, to gain insights into their development: unique character pipelines that blend 2D and 3D, the use of Unity Timeline, their companion app, and more.
- Skated into the Rollerdrome with Roll_7 to look at level design, shaders, physics, and more.
- Spoke to Thomas Brush, creator of Father, who shared his approach to pitching to publishers, crowdfunding, and demoing.

Don’t forget to follow us on Twitch and hit the notification bell so you never miss a stream. If you miss us live, no sweat – we upload full streams to our YouTube playlist.

We kicked off August with 40% off select Asset Store tools and packages to spark your imagination. Then, Daniel Zeller took over the @AssetStore Twitter to share tips on the Fluffy Grooming Tool, and so did Renaud Forestié from More Mountains, who offered insights on how you can use FEEL to level up your game. For even more on assets, see how the smaller teams behind the popular game Sable and indie mobile business Tinytouchtales have used Asset Store resources to speed up game development and save time.

Each Friday, we also feature beautiful assets made by our publisher community. Whether it’s something new or a popular tool that has been around for a while, we love to see the assets in action. Check out some of what was shared in August below!

- InfiniGRASS by ArtnGame
- GPU Grass by Dan Philips
- Screen Ripple by Steven Gerrard
- Elemental Animations by Kevin Iglesias

Do you want to show off your work? Make sure to use the #AssetStore hashtag!

And finally, here’s a non-exhaustive list of Made with Unity games released in August.
See any on the list that have already become new favorites?

- Big Ambitions, Hovgaard Games (August 1)
- The Mortuary Assistant, DarkStone Digital (August 2)
- Two Point Campus, Two Point Studios (August 9)
- Stereo Boy, Main Gauche Games (August 9)
- Farthest Frontier, Crate Entertainment (August 9)
- Lost in Play, Happy Juice Games (August 10)
- Arcade Paradise, Nosebleed Interactive (August 11)
- Cult of the Lamb, Massive Monster (August 11)
- Rollerdrome, Roll7 (August 16)
- Cursed to Golf, Chuhai Labs (August 18)
- Midnight Fight Express, Jacob Dzwinel (August 23)
- I Was a Teenage Exocolonist, Northway Games (August 25)
- ORX, johnbell (August 30)
- Immortality, Sam Barlow / Half Mermaid (August 30)
- Tinykin, Splashteam (August 30)

That’s a wrap for August! Want more as it happens? Don’t forget to follow us on social media.

>access_file_
1119|blog.unity.com

App Analytics: 3 ways to optimize performance by combining player and growth data

Optimizing monetization and UA is key for growth - but they only get you so far if the app itself isn’t optimized. In fact, the best way to maximize growth is by tying monetization and user acquisition data to user behavior. The more granular you can get, the more you can do to strengthen your business. Using ironSource’s App Analytics solution to break down the data, here are 3 ways you can better optimize for growth.

#1: Reconciling impressions vs playtime

There’s typically a tradeoff between the number of impressions you serve users and their effect on user engagement (playtime). That’s because game designers are determined to improve the user experience in the game, while monetization managers are focused on maximizing the revenue per user. The key to reconciling these pain points is to dive deep into each metric to find the sweet spot that maximizes both.

For example, you can compare playtime versus impressions per user, side by side. Let’s say you want to increase the impressions cap from 3 to 5. The first step is to assess whether this affects playtime, and make the best adjustments to ensure a positive user experience. If you see a significant drop in playtime, it’s a sign that users weren’t happy, and a lesson to return to the original impressions cap. However, if increasing the cap has no effect on playtime, you can generate revenue without affecting user experience. Now your data-informed conclusions can improve both user experience and profits. It’s not always easy to find the optimal point between them, but looking at them side by side with a tool like App Analytics can help to reconcile multiple interests.

#2: Improving user engagement with cohorts

Traditionally, retention analysis focuses on user churn.
Going one step further - not just looking at whether users return, but also understanding their behavior when they return - paints a much clearer picture of how to better retain users and find more places to monetize them.

Assessing how users play

To encourage your players to keep going, first understand what’s motivating them in the first place. If you have in-app purchase API data, simply split paying versus non-paying users and see how they compare on different metrics (playtime, impressions per user, sessions, session length, etc.). Now you can better understand your paying users and how they differ from non-paying users - for example, if you see that session length is higher for paying or non-paying users, you can use this information to your advantage.

Let’s say App Analytics shows there was a spike in your revenue on a cohorted day. You also see that your users were running out of the initial resources from their starter pack and hadn’t utilized ways to earn more, like clans and special events. Your profits spiked because users only had one choice: to keep playing, they had to use in-app purchases to gain more resources.

Offering just the right reward amount

To determine a reward amount that continuously engages users, you need to first understand your users. Start by looking at impressions per user. The figure should be fairly consistent, especially if you’re capping impressions, but it’s still important to monitor this number for any technical issues. Let’s say you see a drop in impressions on day 5 on the App Analytics platform. You can dive deeper to understand why - it turns out that on day 5, users received a special daily bonus with a lot of gems.
However, they acquired so much currency that they didn’t need to watch a rewarded video to move forward - which is why impressions were low that day. The best practice would be to decrease the reward on day 5, so that users stay in the sweet spot: engaged in the game, but needing extra gems to progress.

Maximizing your strengths

To increase your user retention, you want to maximize exposure to events that engage users, like clans. To do that, look for areas in the game with increased playtime and determine whether it’s connected to an event. For example, if you see increased playtime on day 8 and know that users usually unlock the clan on day 8, you can infer that’s why you saw a spike. Great! Let’s put our focus here. Try moving the clan date earlier in the user experience. If you can identify key engagement moments in the game and offer those experiences before the typical drop in retention, you have a better chance of retaining users for a longer period. You can track the success of this decision on the App Analytics cohorts page by using both the retention and playtime metrics for users who were exposed to this change.

#3: Determine your milestones with the funnel page

To boost user playtime and revenue, we need to optimize how users move from one milestone in the game to another. This is where funnels come in: with a custom events API on App Analytics, you can create custom funnels and immediately see how users progress through your milestones, or fall into the pitfalls between them. For example, you can create a level funnel that shows how many users started level 1, then progressed to level 2, and so on.

Optimizing your in-app purchase flow

To best understand your user journey in action, define exactly what you’re interested in observing and create clear funnels based on that path. For example, if you’re looking to convert more users into paying users, you might want to understand user purchase patterns.
You can create funnels that filter users who make a specific purchase, and follow whether users continue making purchases after that. Informed by your new funnels, you can start using trial and error to adjust your in-app purchases’ rewards or placements and increase your conversion rate.

Optimizing engagement (level drop)

The majority of users drop off in the first days of playing, so to improve user engagement, it’s essential to monitor the onboarding process. Using funnels, define each of your onboarding milestones and order them chronologically - for example, “registration,” “privacy policy,” “select avatar,” and more. This way, you can view the dropoff rate between them. These insights can inform exactly the tweaks your game needs to improve retention. We recommend you test your changes with A/B testing.

In fact, you can dive even deeper. If you created a “select avatar” milestone, you can focus on specific avatars to gain more insight. If the conversion rate is much higher for players who chose a specific avatar, you can use this information to update your game accordingly.

You can also assess behavior by level. For example, if suddenly on level 3 there’s a major drop in conversion rate, it’s an indication to dig deeper and understand the cause. Create a level funnel to understand how conversion changes incrementally. Try to find the origin - did this drop happen before level 3, or during it? All of this information will bring you closer to making the adjustments needed to improve engagement and boost game performance.

With any app, understanding user engagement is the key to understanding how to scale up your business. App Analytics makes this process easy and intuitive by providing different pages (explore, cohorts, and funnels) to zoom in on user behavior and understand what’s affecting it. Learn more about App Analytics here.
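The core funnel idea - counting how many users reach each ordered milestone and where they drop off - can be sketched independently of any analytics product. This is an illustrative sketch only: the Funnel class, milestone names, and event representation below are hypothetical and not part of the App Analytics API.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Funnel
{
    // usersEvents: for each user, the set of milestone names they reached.
    // steps: the funnel milestones in chronological order.
    // Returns, per step, how many users completed it and the
    // conversion rate from the previous step.
    public static List<(string Step, int Users, double Conversion)> Analyze(
        IEnumerable<HashSet<string>> usersEvents, string[] steps)
    {
        var users = usersEvents.ToList();
        var result = new List<(string, int, double)>();
        int previous = users.Count;

        foreach (string step in steps)
        {
            // A user counts for a step only if they also hit every
            // earlier step (we filter cumulatively).
            users = users.Where(u => u.Contains(step)).ToList();
            double conversion = previous == 0 ? 0 : (double)users.Count / previous;
            result.Add((step, users.Count, conversion));
            previous = users.Count;
        }

        return result;
    }
}
```

Running this over onboarding milestones like “registration” and “select avatar” surfaces the same dropoff-between-milestones view the funnel page provides: a sharp conversion dip at one step tells you where to investigate.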

>access_file_