// transmission.log

Data Feed

> Intercepted signals from across the network — tech, engineering, and dispatches from the void.

1695 transmissions indexed — page 38 of 85

[ 2024 ]

20 entries
741|blog.unity.com

Break into real-time 3D industries with Unity’s Elevate program

There’s never been a better time to pursue a career in real-time 3D (RT3D). Across industries as diverse as gaming, architecture, automotive, medicine, aerospace, and film, new and lucrative career opportunities are emerging for the next generation of real-time developers, programmers, and artists.

However, despite the RT3D boom, preparing for these new roles and navigating a rapidly shifting hiring environment can be challenging for everyone involved. How are job seekers to know which skills they’ll need for which positions, and how best to get themselves up to speed? How should educators adapt their teaching so it stays relevant? And how can employers more easily identify and onboard the very best candidates for increasingly specific roles?

Unity’s latest initiative, Elevate, is here to address these challenges by transforming the way job seekers, educators, and employers interact within the dynamic RT3D landscape. Elevate’s mission is to equip job seekers with the knowledge they need to prepare for and secure opportunities in RT3D sectors, while also providing educators and employers with resources to streamline the talent cultivation and hiring cycle.

Introducing Universal Job Profiles

At the core of the Elevate initiative are Universal Job Profiles (UJPs): meticulously crafted guidebooks designed to eliminate the guesswork and inconsistencies plaguing RT3D job definitions. UJPs are industry-vetted, detailed overviews of specific roles within RT3D industries. They serve as comprehensive manuals detailing vital job elements such as role responsibilities, how positions fit within a studio structure, core skill requirements, commonly used tools, application prerequisites, interview processes, and learning resources. In essence, UJPs outline clear and structured career pathways tailored specifically to various roles within RT3D industries.

The Employer Advisory Board

The creation and refinement of UJPs is overseen by Elevate’s Employer Advisory Board (EAB), a diverse collective of experts representing industry-leading companies from all parts of the real-time landscape. These industry leaders provide nuanced insights and feedback, ensuring that the UJPs accurately reflect current and emergent RT3D employment trends and industry requirements, and equipping both job seekers and educators with the most relevant and actionable information.

Streamlining the path to employment

For aspiring professionals, UJPs act as a map for planning their career trajectory clearly and confidently. These profiles guide job seekers through the landscape of RT3D industries, ensuring that by the time they complete the recommended areas of study, they emerge as job-ready candidates. The UJPs’ actionable checkpoints let individuals track progress and acquire skills pertinent to their desired roles.

Enabling educators to craft future-ready curricula

Educational institutions are often challenged by the rapid pace of technological advancement in RT3D industries. Elevate alleviates this strain by providing educators with up-to-date UJPs, enabling the design of curricula that are synchronized with the market’s demands. UJPs aid educators in crafting learning experiences that align with real-world expectations, ultimately leading to a smooth and successful classroom-to-career transition.

Empowering employers to find the best talent

Employers also reap the benefits of UJPs by using them as benchmarks for job listings and candidate evaluations. In-depth overviews of roles facilitate a more efficient hiring process, helping employers find the best match for their team quickly. Employers seeking to influence the evolution of UJPs can also join the EAB, contributing to the program’s future direction and maintaining the initiative’s industry-leading status.

A collective effort for an inclusive future

The Elevate program is an ambitious and inclusive endeavor. It transcends tool preferences and industry segments, offering visibility and access to everyone interested in RT3D opportunities. Unity is committed to broadening the path to success in RT3D industries for all, and Elevate stands testament to that commitment.

Discover the full scope of the Elevate program and how it can transform your journey in the RT3D industry by visiting the Elevate landing page. Whether you’re a job seeker, educator, or employer, become a part of this groundbreaking initiative and contribute to shaping the future of our industry. With Elevate, we’re not just finding jobs; we’re elevating careers.

>access_file_
742|blog.unity.com

Games Made with Unity: July 2024 in review

It’s July. The sun is out. Summer is in full bloom. It’s a great time to stay inside and get your hands on the plethora of brand-new games made with Unity that released this month.

By the way, have you followed our new Steam Curator page? Last month, we kicked off a poll on X for the next theme for a list of games to put on the page. Unsurprisingly, cats won out. Check it out!

Working on a game in Unity? We’d love to help you spread the word. Be sure to submit your project.

Without further ado, to the best of our abilities, here’s a non-exhaustive list of games made with Unity and launched in July of 2024, either into early access or full release. Add to the list by sharing any that you think we missed.

Action and Casual
Metal Slug: Awakening, Tencent (July 16)
Hamster Playground, Mass Creation (July 11)
OutRage: Fight Fest, Hardball Games Ltd (July 16)

Comedy
Exhausted Man, Candleman Games (July 24)
Thought Experiment Simulator, HoHo Game Studio (July 22)
RAWMEN: Food Fighter Arena, ANIMAL (July 23)

FPS
Anger Foot, Free Lives (July 11)

Metroidvania
Bō: Path of the Teal Lotus, Squid Shock Studios, Christopher Stair, Trevor Youngquist (July 17)
Gestalt: Steam & Cinder, Metamorphosis Games (July 16)
Frontier Hunter: Erza’s Wheel of Fortune, IceSitruuna (July 26)

Narrative and Mystery
The Star Named EOS, Silver Lining Studio (July 23)
Vampire Therapist, Little Bat Games (July 18)
The Operator, Silver Lining Studio (July 22)

Platformer
SCHiM, Ewoud van der Werf, Nils Slijkerman (July 18)
Moen, Ambient Melancholy (July 12)
Valley Peaks, Tub Club (July 24)

Puzzle Adventure
Slider, boomo (July 24)
The Abandoned Planet, Dexter Team Games (July 14)
Linkito, Kalinarm (July 23)
Été, Impossible (July 23)
Arranger: A Role-Puzzling Adventure, Furniture & Mattress LLC (July 25)
Ogu and the Secret Forest, Moonlab Studio, Sinkhole Studio (July 29)

Roguelike/lite
Union of Gnomes, Hoolignomes (July 18 – early access)
Valefor: Roguelike Tactics, Valefor Ltd (July 19)
Little Scavenger, CodeRed Studio (July 27)
Deathless, OneTwoPlay (July 29 – early access)
Towerful Defense: A Rogue TD, Mini Fun Games (July 29)

RPG
Dungeons of Hinterberg, Microbird Games (July 18)
Zenless Zone Zero, miHoYo (July 4)
Yaoling: Mythical Journey, RAYKA STUDIO (July 16 – early access)
Minds Beneath Us, BearBoneStudio (July 31)

Simulation
Farlands, JanduSoft, Eric Rodríguez (July 24 – early access)
Contraband Police Mobile, PlayWay SA (July 9)
The Last Alchemist, Vile Monarch (July 12)
Critter Crops, Skyreach Studio (July 22)

Strategy
Cataclismo, Digital Sun (July 22 – early access)
Artisan TD, 4rtisans (July 22)
ARC SEED, Massive Galaxy Studios (July 30 – early access)

Survival
7 Days to Die, The Fun Pimps (July 25)

That’s a wrap for July 2024. Want more Made with Unity and community news as it happens? Don’t forget to follow us on social media: Twitter, Facebook, LinkedIn, Instagram, YouTube, or Twitch.

>access_file_
744|blog.unity.com

Unity 2021 Long Term Support Extended

To ensure customers have sufficient development time before upgrading, projects currently being built in Unity 2021 Long Term Support (LTS) will have extended LTS support until the release of Unity 6 later this year.

Officially, support for Unity 2021 LTS ended in May 2024, two years after its release in 2022, in line with the processes outlined in the documentation under which we release our Unity Engine versions.

As part of our ongoing efforts to ensure our users always have the right options for their project, we assessed the current state of support for all of our released (and soon-to-be-released) versions of the Unity Engine. Here’s the rundown:

Unity 2021 LTS: Fully supported up to the release of Unity 6, later in 2024. Enterprise customers will receive an additional year of support.
Unity 2022 LTS: No change. Fully supported until at least May 30, 2025. Enterprise customers will receive an additional year of support.
Unity 6 Preview: No change. Fully supported as a Tech stream release until the release of Unity 6, later in 2024.
Unity 6: Fully supported for at least two years from its release. Enterprise customers will receive an additional year of support.

When we announce the release date of Unity 6, we will update this list with specific dates for Unity 2021 LTS and Unity 6.

Providing peace of mind to our users is a top priority. This change in support for Unity 2021 LTS allows studios to continue working on their current project while looking forward to building their next project in Unity 6.

As always, we encourage feedback. Please visit Unity Discussions to keep the conversation going.

>access_file_
746|blog.unity.com

Implementing ads without cannibalizing subscription conversions: A brief guide by ad format

2024 has seen many premium subscription service apps expand their business models to incorporate an ad tier into their offerings.

At first glance, this shift makes sense: traditionally, only 3-4% of users are likely to subscribe to a premium subscription-based app. Ads offer premium apps and streaming services a way to monetize the remaining 96% of users who would otherwise not generate revenue. While converting users to subscribers still offers the highest ROI for these apps, without ads they would leave significant revenue on the table.

Still, some apps are hesitant to incorporate ads into their monetization strategy. Beyond more general concerns about ads causing churn due to a negative experience, there is also a concern that an ad-based tier would cannibalize subscription conversions. The reasoning is that if a user can access an app’s services through an ad tier, they won’t be incentivized to purchase a subscription.

But with savvy implementation of an ad-based tier, you can avoid subscription cannibalization while exposing an even greater cohort of users to the benefits of your app’s premium content or services, perhaps leading to more subscription conversions down the line. Below, we go over what you need to know about implementing ads without cannibalizing subscription conversions, broken down by ad format.

How to implement ads, by format

1. Display ads

Display ads are one of the most widely used ad formats, including banner, MREC, native, and splash ads (splash ads are pop-up ads that trigger when users open the app). Their popularity is often attributed to their ease of use and unobtrusiveness: display ads require minimal development work from publishers and do not overtly disrupt app usage. Users experience display ads as digital “posters” and can still use the app normally while display ads are on screen.

As minimally disruptive as display ads are, they’re system-initiated, so users can’t opt out. As a result, there is still some risk of users bouncing. To prevent this, users should be primed to expect ads with messaging related to the tiers: the ad tier, where they’ll receive some premium features for free with ads, and the premium tier, where they can access all the features of the app without ads.

Priming users that ads are present can help to avoid churn, since users who expect ads are less likely to see them as intrusive. Notifying users that they’re on an ad tier can also work to incentivize subscription conversions: ad-tier users get a taste of premium content, which may make them want to subscribe to unlock the full experience and receive all premium features. Another option for an ad tier is to give users full premium features, but with ads, along with the option to access an ad-free experience by subscribing.

Preferably, users should be primed with a notification from the start of their app experience. A good place is the sign-up flow, since this is when they’ll choose between a subscription and an ad-based tier for the first time. A sticky notification in their account settings is another great place for the message. There should be a CTA alongside the notification to become a subscriber, which can work to convert users who initially chose the ad tier of your app.

2. Interstitials

Interstitials offer even better revenue generation potential than display ads but can be more intrusive. Like display ads, interstitials are system-initiated, but unlike display ads, users can’t keep using the app until they have either completed or dismissed the ad. So, implementing them correctly is even more important.

As with display ads, priming users is essential. And since interstitials can interrupt the user experience of the app, it’s doubly vital to prime users that ads are present.

3. Rewarded videos

Rewarded videos (RVs) are one of the best ways to monetize users since, like some interstitial ads, they are a more engaging 15-30 second video, but unlike interstitial ads, RVs are user-initiated. In other words, users opt in to watch the ad to completion in return for access to in-app currency or content. This makes rewarded videos premium placements with high revenue generation potential: RVs incentivize higher engagement, so advertisers are willing to bid more for them.

Thanks to this, RVs can actually positively impact your conversion and retention rates. They enable you to give users a taste of premium content in exchange for watching ads. Some users will want more of the premium content and subscribe, while others, who may have otherwise churned, will stay for the premium content they received from the rewarded video.

The primary difficulty with RVs is that they come with some development needs. To implement them, you need a way to categorize content so that it can be exchanged for ads watched. With the right resources and expertise this is entirely possible (Unity has a dedicated in-house consulting team to help publishers accomplish this), but it does take some work.

4. Offerwall

Offerwalls take the value exchange-driven engagement of RVs one step further, offering users in-app currency or unlockable features not just for watching ads, but also for completing tasks in other apps. Users can be tasked with a range of offers, like downloading another app, making an in-app purchase there, or progressing far enough in terms of levels or engagement in that advertised app. Like RVs, offerwalls are an opt-in, user-initiated monetization strategy, meaning they are less likely to cause churn because users actively choose to engage with them.

However, just like RVs, there is some development work required. To implement an offerwall, you would also need to categorize your features and content. On top of that, offerwall implementation requires you to have some form of in-app currency that users can receive in exchange for completing tasks. Users also need a storefront in your app where they can spend the in-app currency they earn.

Though the requirements of offerwalls can be steep, a properly implemented offerwall is a key way to diversify your monetization strategy: it lets you monetize highly engaged users who are committed enough to engage with outside offers to access premium content in your app, but might still be on the fence about purchasing a subscription.

Ultimately, all ad formats have a lot to offer in terms of revenue generation and diversifying your monetization strategies. The right one, or the right combination, will depend on your app and audience. But regardless of which ad format is right for your app, all ad format implementations share one commonality: the importance of using segmentation to prevent cannibalization.

Segment users to prevent cannibalization

For a subscription app diversifying into ads, it’s critical to use a monetization platform that allows you to segment users, ideally by region, device model, OS, and more. These segmentation options enable you to tailor your ad implementation so that high-potential users get an app experience that drives them to convert, while users who are less likely to purchase a subscription are routed to an ad-based tier.

For example, users from a tier-1 region, like the US, are more likely to convert than those from tier-2 regions like LATAM, so segmenting tier-1 users out of the ad-based tier will help prevent losing high-quality users who might otherwise have become subscribers.

With a monetization platform that enables you to segment users in this way, you stand the best chance of reaping the rewards of an ad monetization strategy without the cost of cannibalization, especially when combined with priming.
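To make the routing idea concrete, here is a minimal sketch of segment-based tier routing. This is an illustrative assumption, not any Unity or LevelPlay API: the `UserSegment` fields, the tier-1 region list, and the `predicted_ltv` score are all hypothetical names standing in for whatever your analytics stack provides.

```python
# Hypothetical sketch of tier routing by user segment.
# All names (UserSegment, predicted_ltv, thresholds) are illustrative
# assumptions, not part of any real monetization SDK.
from dataclasses import dataclass

TIER_1_REGIONS = {"US", "CA", "GB", "JP"}  # assumed "tier-1" markets

@dataclass
class UserSegment:
    region: str
    device_model: str
    os: str
    predicted_ltv: float  # assumed conversion-likelihood score, 0..1

def route_tier(user: UserSegment) -> str:
    """Route high-potential users away from the ad tier to protect conversions."""
    if user.region in TIER_1_REGIONS and user.predicted_ltv > 0.5:
        # High conversion potential: keep the experience ad-free and
        # surface the subscription CTA instead.
        return "subscription-funnel"
    # Everyone else is monetized through the ad-based tier.
    return "ad-tier"

print(route_tier(UserSegment("US", "iPhone 15", "iOS", 0.8)))   # subscription-funnel
print(route_tier(UserSegment("BR", "Moto G", "Android", 0.2)))  # ad-tier
```

In practice, the routing decision would live server-side in a remote-config or A/B testing layer so thresholds can be tuned without shipping an app update.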

>access_file_
747|blog.unity.com

Pixyz: What's new

Pixyz is our data optimization solution designed to ingest, optimize, and convert CAD or 3D assets with tessellated meshes from almost any engineering or design software for use in any 3D staging, rendering, or visualization environment. The Pixyz portfolio has historically included Pixyz Plugin, Pixyz Studio, and Pixyz Scenario Processor. Read on to learn about significant improvements and a few licensing changes coming to the Pixyz portfolio soon.

Pixyz Scenario Processor was created to ease the deployment of large-scale data preparation workflows prepared and tested in Pixyz Studio. While Pixyz Scenario Processor has proven to be a valuable tool, it had to be accessed from the command line, which limited flexibility when it came to automation.

Today, we are announcing that Pixyz Scenario Processor will evolve into a new, more flexible and powerful toolset called the Pixyz SDK (Software Development Kit). As the primary offering for developers in all industries, Pixyz is now available as a standard library (Python, C# .NET NuGet) to be used in your favorite IDE (PyCharm, Visual Studio Code, etc.). This allows for faster, more efficient integrations within our users’ infrastructures and applications. The Pixyz SDK will also provide cloud-ready tools (a Docker image) to simplify the deployment of data pipelines in your private cloud or local server infrastructure. The purpose of this shift is to better serve our trusted Unity and Pixyz customers handling complex, on-premise data transformation pipelines.

Starting July 24, 2024, the Pixyz SDK will become available as the successor to Pixyz Scenario Processor. A minimum acquisition of two nodes will be required for the Pixyz SDK. There will also be a grace period for current Pixyz Scenario Processor subscribers: subscribers with fewer than two nodes will have six months after launch, from July 24, 2024 to January 24, 2025, to renew their license. After that date, all customers will be required to purchase two nodes at a company level, or consider transitioning to our Unity Cloud Automation offering.

Our experience working with Pixyz Scenario Processor customers over the years demonstrates that it takes effort to properly set up an on-premise data pipeline, and that users get more value when starting with at least two concurrent executions of Pixyz. For instance, many customers require both a staging and a production environment, in order to validate changes in staging before pushing to production.

Unity Cloud Automation and 3D Transformation services will be available later this year for customers who do not wish to purchase two nodes or do not require an on-prem solution.

For more information on pricing or your subscription, please contact our sales team. To better understand the differences between Pixyz Scenario Processor and the Pixyz SDK, how to migrate data pipelines, and how to access documentation, please visit the FAQ on the Pixyz SDK homepage.

As the Pixyz SDK is being launched along with a dedicated UI* as the companion app to support the scripting experience, Pixyz Studio naturally changes its position in the Pixyz product stack. From version 2024 onwards, Pixyz Studio will focus on interactive data preparation tasks.

Pixyz Studio 2024 launches today in beta and continues to feature a Python API interface. This enables the creation of simple scripts and plugins designed to enhance its out-of-the-box capabilities and create custom actions that help users work more efficiently. However, Pixyz Studio plugins are not recommended as compatible with the Pixyz SDK. Finally, PyQt native support has been discontinued, so plugins are limited to what the XML structure can offer. Customers wishing to create advanced interfaces should use the full power of the Pixyz SDK.

On September 3, 2024, Unity will launch a new version of the Pixyz Plugin package. While all main features for CAD and 3D import and data preparation remain unchanged, they have been completely revamped and simplified to provide a better user experience that remains faithful to the usual way of working with the Unity Editor.

With the latest release of the Pixyz Plugin package, Pixyz Plugin will be automatically entitled through Unity Industry and will therefore no longer require a separate license. Standalone Pixyz Plugin license purchases will be discontinued beginning September 3, 2024, and Pixyz Plugin will only be available as part of the Unity Industry bundle.

Unity aims to simplify the onboarding process for Industry customers with this change, particularly concerning seat management with Pixyz Plugin. This means access to the Pixyz Plugin package will be automatic for all Unity Industry seats. As the first of many steps to enhance the overall user experience for Industry customers, the Pixyz Plugin package will be continuously upgraded to offer more features that further unlock industrial use cases.

Find more information about this important upcoming change in our FAQ.

>access_file_
748|blog.unity.com

E-book update: More design patterns and SOLID principles

Back in the fall of 2022, we launched the e-book Level up your code with game programming patterns, together with a GitHub repository with sample code. We also released a five-part video tutorial series to accompany the e-book and sample project.

We’ve received great feedback from you on these resources, with many of you asking us to cover additional design patterns. Thank you for sharing your feedback. My team and I follow your comments closely and we really appreciate it.

Today, I’m excited to announce that an updated edition of the e-book, Level up your code with design patterns and SOLID, is now available, along with an updated version of the design patterns sample project, which you can download from the Unity Asset Store.

Both the e-book and the sample project are now based on Unity 6 and include more examples and patterns. The sample project also includes more features from UI Toolkit, including an example that demonstrates data binding, a popular request from the community.

Note: Unity 6 will be available later this year. If you want to follow along with the examples in the guide and the accompanying demo project, make sure to download Unity 6 Preview.

Before diving into the new content in the e-book, some of you who are less familiar with the concepts might wonder: Why should I learn about design patterns, and how do they fit into Unity game development?

Coming back to your feedback: while the fundamentals of object-oriented programming are familiar to many, applying these principles in your own code can sometimes feel abstract and overly academic.

Think of it this way: for every software design issue you face, countless developers have encountered similar challenges before you. Although you can’t always ask them directly for advice, you can learn from their solutions through design patterns.

Design patterns offer general solutions to common problems in software engineering. They aren’t ready-made templates to copy and paste into your code, but rather tools in your toolbox to draw upon when needed. Some patterns are more intuitive than others, but each one can be useful in the right context.

We created this guide for those who are new to design patterns or just need a refresher. It outlines common scenarios in game development where these patterns can be applied. If you’re transitioning from another object-oriented language like Java or C++ to C#, you’ll find practical examples of how to adapt these patterns specifically for Unity.

At their core, design patterns are simply ideas. They won’t apply to every situation, but when used correctly, they can help you build scalable applications. Integrating them into your projects will enhance code readability and maintainability. As you become more familiar with these patterns, you’ll identify opportunities to streamline your development process.

In short, our guide is designed to elevate your coding skills, help you create better Unity projects, and establish an understanding of general industry best practices that you can carry with you throughout your career.

Let’s look at the key new additions to the design patterns resources.

An expanded section on how to implement SOLID principles

The five core principles of SOLID now each have actionable code examples implemented in the sample project and explained in the e-book. SOLID is a mnemonic acronym for five core fundamentals of software design: think of them as five basic rules that can help you keep object-oriented designs understandable, flexible, and maintainable.

As a quick reminder, SOLID stands for:

Single-responsibility principle: A class should have only one reason to change, meaning it should only have one job or responsibility.
Open-closed principle: Classes should be open for extension but closed for modification, allowing them to be extended without changing existing code.
Liskov substitution principle: Objects of a superclass should be replaceable with objects of its subclasses without affecting the correctness of the program.
Interface segregation principle: Clients should not be forced to depend upon interfaces that they do not use. This promotes the creation of specific interfaces over a single, general-purpose interface.
Dependency inversion principle: High-level modules should not depend on low-level modules; both should depend on abstractions.

The key takeaway from diving into the examples is that following the principles can help you achieve the following benefits in your game development:

Readability: Clear and well-organized code facilitates efficient comprehension of project functionality. Adhering to SOLID principles can enhance code readability; when your code standards are consistent, you boost the chance of smooth collaboration between game programmers on a team.
Scalability: Implementing SOLID principles fosters maintainable code, which is crucial for projects that you want to scale. By adhering to these principles, changes made in one part of the codebase are less likely to introduce unexpected issues elsewhere. This approach ensures code remains flexible and adaptable to evolving requirements.
Velocity: Ultimately, SOLID principles contribute to improving game development workflows. Modular code, a key aspect emphasized by SOLID, involves breaking down systems into smaller, manageable components. This modular approach facilitates easier testing, debugging, and code reuse across projects, reducing development time and enhancing productivity.

The updated e-book and project include four new patterns, bringing the total to 11. Here’s a quick rundown of each one:

Factory pattern: A classic use case is when you have powerups (such as speed boosts, shields, or extra lives) that share several attributes yet have different functionality. Here, the factory pattern can be used to create instances of these different powerup classes derived from a common interface or base class, enabling flexible addition of new powerups without modifying existing client code.
Object pooling: Some would call this a performance optimization technique rather than a design pattern. In any case, think of it as a way to improve performance by reusing objects instead of creating and destroying them frequently. In our sample scene you’ll find a gun turret firing large amounts of bullets at rapid speed. Rather than instantiating each bullet (and cleaning it up once it has served its purpose, at significant performance cost), we use the pattern to recycle bullets over and over.
Singleton: The singleton is likely one of the most common patterns in game development; chances are you’re already using it today. It’s useful if you need one object that coordinates actions across the entire scene. For example, you might want one game manager in your scene to direct the main game loop. However, there are some pitfalls to watch out for when using the singleton pattern, which we explain in the guide.
Command pattern: You’ve likely seen the command pattern at work if you’ve played a game that offers undo/redo functionality or keeps your input history in a list. It’s a pattern you can leverage for a strategy game, for example, where the user can plan several turns before actually executing them in the order the input was given.
State pattern: This allows an object to change its behavior when its internal state changes, which simplifies the management of complex state-dependent behavior in game characters or UI elements. Think of an enemy NPC with different behaviors such as “idle,” “patrolling,” or “attacking,” depending on game scenarios such as where the player is on the map.
Observer pattern: This pattern helps you implement an efficient event system where objects can subscribe, and react, to events dynamically. One use case is a player collecting ammo in an action game, which triggers different events such as playing a sound, updating the UI, and playing an animation.
Model View Presenter (MVP): At its core, this pattern is about decoupling the display of state from the actual state, enabling a reactive design where views automatically update in response to model changes, making it a common pattern in UI programming. The model is the data, the view is the user interface, and the presenter is a mediator that handles the logic for the view and synchronizes the data from the model.
Model-View-ViewModel (new): As the name indicates, this one is related to the MVP pattern but expands it by adding runtime data binding, which simplifies how UI elements are updated. In our example, we leverage the new data binding feature in UI Toolkit and Unity 6 Preview.
Strategy pattern (new): This pattern defines a family of algorithms, encapsulating each one to make them interchangeable and allowing the algorithm to vary independently from the clients that use it. This is useful for implementing different movement behaviors in game AI, for example.
Flyweight pattern (new): Use this pattern to optimize memory usage by sharing as much data as possible with similar objects. The basic idea is that you centralize the data shared among objects.
Dirty Flag (new): This pattern is useful for optimizing performance by marking objects as “dirty” when they change, so they are only recalculated or updated when necessary. It can help you manage costly updates in game loops or in some UI rendering cases.

The sample project mirrors the e-book by demonstrating each of the 11 patterns in action. You can download the project from the Asset Store and follow along with the corresponding scenes to see these patterns applied in real-world scenarios. Note that the project requires Unity 6 Preview or later.

Before you jump into the project, there are a few helpful tips to keep in mind:

Start with the Bootstrap scene. This scene configures the demo and provides access to the main menu (you can learn more about the concept of SceneBootStrapper in the e-book). From the main menu, you can navigate to the appropriate sample. Each scene demonstrates a different SOLID principle or design pattern.
Note that there may be minor differences between the sample project and the code examples in the guide. To enhance clarity and readability, some examples feature simplified code, like public fields.
Your team might prefer a coding style different from the conventions used in this guide or the sample project. We recommend creating a C# style guide tailored to your specific needs and following it consistently across the team. Check out our e-book on how to create your own style guide to learn more.
Consider the examples provided and determine which design pattern aligns best with your project needs. As you familiarize yourself with these patterns, you’ll discover their potential to streamline and improve your development workflow.

Both the e-book and the sample project on design patterns are available to download for free. Happy coding!
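As a closing illustration of one of the patterns above, here is a minimal object-pool sketch. The e-book's samples are in C#; this language-agnostic Python version (class and method names are our own, not from the sample project) just shows the core mechanic of pre-allocating instances and recycling them instead of creating and destroying them:

```python
# Minimal object-pool sketch: bullets are allocated once, then reused.
class Bullet:
    def __init__(self):
        self.active = False
        self.position = (0.0, 0.0)

class BulletPool:
    def __init__(self, size):
        # Pre-allocate every bullet up front; nothing is created during play.
        self._pool = [Bullet() for _ in range(size)]

    def get(self):
        """Hand out an inactive bullet, or None if the pool is exhausted."""
        for bullet in self._pool:
            if not bullet.active:
                bullet.active = True
                return bullet
        return None

    def release(self, bullet):
        """Return a bullet to the pool instead of destroying it."""
        bullet.active = False

pool = BulletPool(2)
a = pool.get()
b = pool.get()
assert pool.get() is None  # pool exhausted, no new allocations
pool.release(a)
assert pool.get() is a     # the same instance is recycled
```

In Unity itself you would typically toggle a pooled GameObject's active state (or use the built-in `UnityEngine.Pool.ObjectPool<T>`) rather than a plain flag, but the reuse logic is the same.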

>access_file_
751|blog.unity.com

Multi-Game Strategies

During Project Reviews as a consultant for the Customer Success team, I often work with customers that create game-switching applications. These applications have one main menu, or theme menu, presenting multiple games for the player to choose from. In those setups, the main concerns are how to keep the time spent switching between games as short as possible and how to ensure optimal performance across the games. In this blog post we will explore different approaches based on project needs, as well as some best practices that can be useful for any game environment, with or without a game-switching setup.

When planning for a multi-application environment (whether for gaming, entertainment, or industrial simulation), the most important decision to make is how to manage game executables. There are many factors that can influence this decision:

How many games will the platform handle?
How big are the games?
Are the games made with the same Unity version?
What are the application’s bottlenecks?

Other factors are target hardware, memory and CPU, and disk speed (SSD vs HDD vs SD card). Answering these questions and deciding how to handle executables is crucial to understanding whether we need separate executables for each game, one shared executable for multiple games, or a combination of both to ensure the applications perform optimally.

Having multiple executables is a great option for handling games that are made with different Unity versions. With this approach it’s possible to reduce the time needed to switch between games by caching each executable in memory and leaving each instance in the background. However, keeping all executables in memory is not always the best choice, since it can strain memory. It should be avoided in cases where the individual games have a high memory footprint and/or when there are many games in the game-switching application. To ease memory constraints, it is possible for games to share a single executable.
The games can be in a single Unity project, or each have their own project, as long as the games share the same Unity version. Since Unity 2022 LTS, on Windows it’s possible to pass a variable path via the -datafolder command line argument, specifying the selected game’s data folder in order to switch games. One potential disadvantage of this approach is slower game-switching times; therefore it’s important to follow loading best practices to reduce this drawback.

No matter the nature of the game we’re developing, or the platform it targets, it’s important to spend as little time as possible from the moment of game selection until the game is fully loaded on screen. This goal becomes particularly important for game-switching applications.

A great way to handle loading is by using Addressables. With Addressables, content is downloaded and released on an as-needed basis. This deferred loading strategy is the most efficient way to reduce load times, since it limits the amount of data that has to be loaded during initial startup. Furthermore, it can help prevent CPU background activity related to background games, which can contribute to CPU bottlenecks. The Addressables: Planning and best practices blog post is a great starting point to learn more about Addressables and how they can help improve your game.

A great way to ensure faster loading, regardless of how many executables we’re using, is via the asynchronous loading APIs. When loading asynchronously, the Unity main thread executes a process called “main thread integration,” which is responsible for the initialization of native and managed objects in a time-sliced manner. Since this process performs some operations that are not thread-safe, it occurs on the main thread, and the time allowed for main thread integration is limited to prevent the game from freezing for long periods.
The amount of time that can be spent on these integrations is defined by the Application.backgroundLoadingPriority property. We recommend setting backgroundLoadingPriority to High (a 50 ms time slice) during loading screens, then returning it to BelowNormal (4 ms) or Low (2 ms) when loading is complete.

An additional way to speed up loading is via Asynchronous Texture Upload. Async texture upload can decrease load time by coordinating how much time and memory is used for uploading textures and meshes to the GPU. The Understanding Async Upload Pipeline blog post provides detailed information on how this process works.

These practices will help speed up loading times:

Minimize your scene content as much as possible. Use a bootstrap scene to load only what’s needed for the game to be in a playable state, then load additional scenes when needed.
Disable cameras during loading screens.
Disable UI Canvases while they are being populated during loading.
Parallelize network requests.
Avoid complex Awake/Start implementations and make use of worker threads.
Always use texture compression.
Stream large media files (like audio files and textures) instead of keeping them in memory.
Avoid the JSON serializer; use binary serializers instead.

As mentioned earlier, memory is not the only concern for multi-game environments; background CPU activity can also take a toll on the player’s gaming experience. When games are not actively being played, their CPU work continues, causing the active game to perform suboptimally through CPU starvation. A way to prevent CPU starvation for the active game, and for any other platform processes, is to set the Run in Background player setting to false in Player Settings. Run in Background causes the Unity game loop to stop while the game is not in focus.
The setting can also be changed dynamically via script.

One thing to note is that the Run in Background setting won’t stop any custom scripting threads from running, so it’s important to put the threads of non-playing games to sleep via the C# Thread.Sleep method. Remember that working with background threads in Unity requires careful programming. Since these threads don’t have direct access to Unity’s API, there is a greater chance of creating issues such as deadlocks and race conditions. Preventing this requires proper synchronization with the main Unity thread. To properly implement multithreading, review the Limitations of async and await tasks section of the Overview of .NET in Unity manual page and the MSDN article about using threads and threading. Unity 6 introduces the Awaitable class, which offers better support for async/await.

It can be difficult and time-consuming to identify and fix the causes of memory leaks, especially in the later stages of development. As cliché as it may sound, prevention is always better than the cure. Here are a few recommendations that can help prevent leaks in any game environment:

When creating new objects/assets in memory, make sure to delete them when they’re no longer needed. If using Addressables, make sure to release unused assets.

When loading/unloading scenes, assets should be properly removed from memory. Unity doesn’t automatically unload assets when a level is unloaded, so it’s important to remove any remaining references. The Resources.UnloadUnusedAssets API can help clean up assets. However, it can cause CPU spikes, since it returns an object that yields until the operation is complete, so it should be used in non-performance-sensitive places.

Avoid frequently using Instantiate and Destroy on GameObjects. Doing so can lead to unnecessary managed allocations, while also being a costly CPU operation.
However, in cases where using Destroy is necessary, make sure to remove all references to the object to avoid leaked shell objects. When an object is destroyed via Destroy, or the GameObject it is attached to (or one of its parents) is destroyed, or the Scene it resides in is unloaded, its native memory is unloaded. If C# code still holds a reference to the Unity object, however, the managed wrapper object (its managed shell) is kept in memory. Therefore, if something that was not unloaded still references a destroyed object, the managed memory may live on as a leaked shell object.

Be mindful when implementing events using singletons. A singleton instance holds references to all objects that have subscribed to its events. If those objects do not live as long as the singleton instance, and they do not unsubscribe from these events, they will remain in memory, causing a memory leak. If the event source gets disposed before the listeners, the reference gets cleared; and if the listeners are properly unregistered, there is also no reference remaining. To solve and prevent this problem, we recommend implementing the Weak Event Pattern or IDisposable in all objects that listen to singleton events, and making sure they are properly disposed of in your code. The Weak Event Pattern is a design pattern that helps you manage memory and garbage collection in event-driven programming, particularly when it comes to long-lived objects. It’s especially useful when you have short-lived subscribers but a long-lived publisher. Please keep in mind that these are C#-specific solutions: they work only with C# events and are not directly supported by UnityEvents or UI Toolkit.
As such, we recommend implementing these solutions only in your non-MonoBehaviour scripts.

Lastly, profiling, CI/CD testing, and stress testing from the early stages of development can be a real time-saver: detecting leaks as they arise allows you to address each issue promptly, saving debugging time and ensuring optimal performance.
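The loading-priority and Run in Background recommendations from this post could be gathered into a small helper. This is a hedged sketch: the `LoadingTuning` class and its method names are hypothetical, while `Application.backgroundLoadingPriority`, `ThreadPriority`, and `Application.runInBackground` are the Unity APIs discussed above.

```csharp
using UnityEngine;

// Hypothetical helper illustrating the recommendations in this post.
public static class LoadingTuning
{
    public static void EnterLoadingScreen()
    {
        // Give main thread integration a larger time slice (~50 ms)
        // while a loading screen hides any hitches.
        Application.backgroundLoadingPriority = ThreadPriority.High;
    }

    public static void ExitLoadingScreen()
    {
        // Return to a small time slice (~4 ms) so gameplay stays smooth.
        Application.backgroundLoadingPriority = ThreadPriority.BelowNormal;
    }

    public static void ConfigureBackgroundBehavior()
    {
        // Stop the game loop when this instance loses focus, so a
        // backgrounded game doesn't starve the active one of CPU.
        Application.runInBackground = false;
    }
}
```

In a game-switching setup, `ConfigureBackgroundBehavior` would typically run once at startup, while the loading-screen pair brackets each scene or Addressables load.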

>access_file_
752|blog.unity.com

The 16th Unity Awards are here

We are thrilled to announce that the Unity Awards are back, and this year is set to be a celebration unlike any other.

For the last 16 years, we’ve come together to recognize the phenomenal achievements of creators who have leveraged the power of Unity to bring their visions to life. This time around, things will be a bit different. We’re making this a global event and adding new categories, for a total of 24 awards that celebrate the work of Unity creators. We’ll announce the winners during our first Unity Awards Showcase, which will be broadcast later this fall.

Today, we’re kicking things off with a call for nominations in several categories, including Games, Asset Store, and Community. We’ll follow this up by announcing the final list of nominees at Unite Barcelona, so stay tuned and be sure to check out our keynote presentation on September 19.

To submit your nominations, please visit our official Unity Awards page and fill out our simple form. Don’t forget to share the love on social media and encourage others to nominate their favorites as well. This is your chance to make your voice heard and shine a spotlight on the experiences and developers that have wowed you this year. Whether it’s a mesmerizing indie game, a revolutionary new tool, or a heartwarming educational project, we want to ensure that every corner of our vibrant community is represented. Don’t hesitate to nominate those hidden gems from all around the world.

Let’s come together to celebrate the magic of Unity and make this the most spectacular Unity Awards yet.

>access_file_
753|blog.unity.com

How to author Scenes and Prefabs with a focus on version control

The goal of this guide is to provide best practices for authoring content in Unity that works well with version control. For more information about using version control, check out the Unity’s Take on version control for stronger collaboration blog post or the more in-depth Best practices for version control e-book. Though both of these resources contain good general information for working with version control, the focus of this blog is on how content integrates with version control and how to avoid merge conflicts when many creators are working on the same or adjacent content at the same time.

There is a lot of confusion online about .meta files and version control. .meta files should always be checked into version control. They contain important information, such as the file GUID that connects all references between assets. They should be kept in sync with their source files (both the name and location should always match the associated source file). Never move or rename an asset file outside of the Unity Editor unless specific tools have been built for this purpose or the functionality of .meta files is completely understood. The default version control .meta file mode is Visible Files, which shows the .meta files on disk in the operating system rather than hiding them. If you’re using Perforce, select the Perforce mode.

The first thing any team should do when working with Unity and version control is set up Smart Merge. By default, Unity stores YAML files as text, which makes them mergeable in version control. Changing the Asset Serialization Mode to binary removes the ability for version control to merge these files. Because Unity’s YAML files are text-based, a lot of version control merging software will try to merge them using text or coding rules. Smart Merge is built by Unity to merge with the YAML structure in mind.
We recommend enforcing the usage of Smart Merge as the default merging tool for all YAML files.

Smart Merge will greatly reduce the amount of work lost to merge conflicts, but if your team has zero tolerance for potential lost work, we recommend also using file locking and enforcing manual merge conflict resolution for any YAML files. File locking is a common practice for large studios where multiple content creators might work on the same binary file (or YAML file) at the same time. Unity Version Control prevents anyone from checking out a locked file; Perforce prevents anyone from submitting a locked file. Git requires Git LFS to be initialized in order to support file locking. We recommend using Git LFS for any project that has large content files and uses Git for version control.

Successful teams often have strict guidelines regarding naming conventions, folder structures, where assets should be located, and how assets should be edited. Because every team is different, there is no one set of universal guidelines. Picking the right system for your team, and sticking with it for as long as it works, is the most important thing. Specific, strict guidelines make developing tools to verify content simpler, which prevents bugs before they even get into the game. Validating content with code and tools becomes much easier with clarity on asset location and naming. You can then more easily enforce Unity import standards via Asset Postprocessing and Presets.

When building content, it’s important to apply the separation of concerns principle. Thinking about how content should be divided and where it should live will keep the project clean and prevent most content creators from running into merge conflicts. It can also help with discoverability and onboarding feature experts, since individual features live in specific subscenes or Prefabs.
The main building blocks that can be used to separate content into files are Scenes, Prefabs, and Subgraphs.

Scenes are the macro building blocks for any Unity application. Each scene is serialized (saved) to a file, which means scenes can be used to organize content in a way that is friendly to source control and simultaneous editing. Additive Scene Loading is often used effectively to create very large scenes, streaming content, or even very simple scenes that have multiple components with separate concerns.

In general, we recommend storing scene-dependent data in scenes. For example, in the case of baked lighting, the lights, lightmaps, and environment settings all depend on which scene they are in. Almost everything else can be stored in Prefabs. If a scene is small and has specific content in it that will only be edited within that scene, it might make sense to store all of that data in a single scene. However, it is important to note that if there are two GameObjects in a scene, any change to either of those objects will result in changes to the scene file. While Smart Merge should handle the scenario where two content creators change two different objects in the scene at the same time, more elaborate changes can result in unresolvable conflicts. We recommend utilizing Prefabs to help mitigate this issue.

In terms of explicit scene structure, the Unity Netcode Demo contains a useful flow chart of a standard scene layout for a simple project, similar to those we’ve seen in many scenarios.

Prefabs can be used to build modular, separate content. For more information, check out this tutorial on Prefabs and Nested Prefabs. The most important section in that tutorial is the Best Practices section. There aren’t any explicit rules about building content with Prefabs; each team will have to decide what works best for them based on their project. There are, however, a few good guidelines to follow:

Think of Prefabs as the building blocks of your house, or project.
Generally, there is a root Prefab that represents the foundation of the house, and within it are the Prefabs for each reusable component that makes up the rest of the house. They could be as granular as a windowsill or as broad as a wall with windows in it. The level of granularity needed will depend on how content creators want to edit their Prefabs.

Prefabs can be nested. This means that in the example above, there might be a house Prefab, with wall, roof, window, and door Prefabs that compose the house. One important thing to note with nesting Prefabs is that the deeper the hierarchy, the more likely it becomes that the project will encounter performance issues. We generally recommend keeping Prefab hierarchies below 5-7 levels of depth.

When nesting Prefabs, it is generally a good idea to edit the Prefabs in Prefab Mode. This guarantees that Prefab properties and overrides are set in the proper location. Editing Prefab properties in the Scene view can result in overrides residing in the wrong Prefab or in the scene itself, which can have unintended consequences and cause merge conflicts. Sometimes it is necessary to override a child Prefab property in a parent Prefab (variations can be achieved this way without affecting every reference to a specific Prefab). This is a standard workflow, but it is important to be careful to make changes and apply overrides in the proper Prefab or scene.

Everything in a Prefab will be loaded and instantiated into memory at runtime when the Prefab is loaded and instantiated. This means that if visual effects, or attached objects that aren’t always present, are inside of a Prefab, those objects will still be instantiated into memory. This can result in memory bloat, as every Prefab instantiated will instantiate everything inside of it. If objects have occasional visual effects or models attached to them, it is better to have a pooling system and an attachment manager that adds those components at runtime.
Anything in a pooling system should generally not be placed inside a Prefab.

Not everything needs to be a Prefab. Smaller building blocks can be GameObjects inside a Prefab or even a scene. If an object is unique to a scene or Prefab, there is no need to create a Prefab for it.

Prefab variants should be used with care. Generally, the best use of a Prefab variant is when the core building blocks of an object are identical, with only simple differences. For example, it might be helpful to use a Prefab variant for a game component that has identical functionality but different visuals. In this scenario, changing the core functionality will affect both the Prefab and its variants, while the visuals remain overridden. Be careful about making overly complex Prefab variants, as changes to the root Prefab could have unintended consequences on the overridden Prefab. As a general rule of thumb, something like a character variation system, or any other complex visual skinning system, should not be based on Prefab variants unless the system is very simple. We recommend creating a consistent strategy for determining what should be Prefabs and what should be GameObjects within Prefabs or scenes.

If a Prefab is created directly from an FBX file, a special kind of Prefab called a model Prefab is created. The result is a Prefab variant of the FBX file. Any additions or changes will be stored as overrides in the Prefab file. However, because most of the data is stored in the FBX file, changes and additions to the Prefab cannot be applied back to model Prefabs. If you’re using the model Prefab variant workflow, it’s important to keep structural changes to a minimum.

There are two alternative workflows that allow a less tightly coupled structure between the FBX model and the Prefab:

Adding the FBX file directly into a scene and then adding components or structural modifications onto GameObjects in the scene.
In this scenario, any changes are now stored in the scene file. This can result in merge conflicts if many people use this workflow in the same scene. We only recommend this workflow if the changes have to live in the scene and conflicts are not an issue.

Creating a standard Unity Prefab out of the exploded model. In this method, the FBX model is dragged into the scene and then unpacked completely, and the result is used to create a Prefab. This methodology completely decouples the FBX file from the Prefab, which is useful if very loose coupling is desired between the two. The Prefab will no longer inherit any structural changes made to the original FBX file; the coupling will only be through the names of the meshes, materials, and animations. Everything else will reside in the Prefab file itself. This can be useful for creating entirely unique variants of an FBX model. For example, if two characters use the same model but need completely different meshes, materials, or even different hierarchies, this methodology might be better than creating multiple FBX files that take up extra memory and disk space.

One thing to note with this method is that when a mesh or material name changes in the original FBX file, the object does not disappear but instead references a missing mesh or material. This can be handy when extremely complex component setups are required: rather than having the GameObject disappear and lose all its components, the object sticks around, and components can be transferred to the new object, or the renamed mesh or material can be slotted back into the GameObject that is now referencing a missing mesh or material.

In the two images below, changes are made to an FBX file which is then reimported. The circle around the Unity logo on the ball is deleted, the main support stand is renamed, the material for the main ball is changed from black to green, and a new parent is introduced above the stand logo with transforms that raise it above the model.
This is all closely matched in both the FBX file and the model Prefab. In the non-model Prefab, the original hierarchy, names, and materials are kept. The deleted mesh is now a missing mesh, but the GameObject is still present. The renamed mesh is also not visible, because it references a mesh name that no longer exists in the model. The changed material is not updated, because the GameObject still references the original material. Additionally, the hierarchy change is not respected, and the mesh stays in the same place because its parent has not changed.

In the before and after images below, the results of the changes made above are displayed in the scene hierarchy. The FBX file that is referenced directly in the scene and the standard model Prefab respond to any changes made to the original FBX file. The unpacked Prefab retains its original hierarchy and does not respond to deletions or name changes.

It is important to make careful decisions about which methodology is used in a given scenario. If tight coupling is desired between the Editor and the FBX files, then the standard model Prefab is probably the best choice. If very loose coupling is desired (for example, a very flexible character system where meshes or materials may be swapped out frequently), then creating non-model Prefabs with soft references to the components of the FBX file will work better.

In the case of graph tools like Shader Graph or Visual Effect Graph, Shader Graph Subgraphs and Visual Effect Graph Subgraphs can be used to create reusable functional nodes that live in a separate file and can be edited without causing conflicts in every Shader Graph or Visual Effect Graph. This allows users to separate concerns in a similar way to Prefabs and scenes.
We recommend creating a strategy for reusability by taking advantage of Subgraphs where it makes the most sense.

Avoiding deep hierarchies is a universal recommendation for content, whether that content relates to Scenes, Prefabs, GameObjects, Animation, UI, or anything else. In general, deep hierarchies of any kind tend to result in performance problems. Deep animation hierarchies in characters will result in fewer characters being able to be drawn on screen due to CPU performance concerns; because of this, we recommend placing all animated hierarchies at the root node of the scene to help with performance. Opting for flat hierarchies when using UGUI and Canvases will result in better performance due to fewer cascading layout updates. Deeply nested Prefabs can cause performance problems and also lead to confusion when overrides are not carefully managed.

We often see teams store source content in a variety of locations, from network drives to local machines. We recommend putting all important source content into version control in one way or another. The most common method we see is to put source content into a folder in version control that is outside of the Assets folder. This ensures that Unity does not try to import any content from the source files directly. Maya, 3ds Max, Blender, and Photoshop files will import automatically into models and textures if they are placed anywhere in the Assets folder; while Unity does support this, we don’t recommend the practice. Additionally, we do recommend mirroring the source directory structure to the content in the Assets directory so that tracking assets is relatively easy.

Source content should be maskable for users, as most users will not need all of it, and source content can be extremely large on disk (think terabytes). In some version control software, creating content masks is fairly simple. In Unity Version Control, this is achieved with cloaked files. In Perforce, Views are used to mask content from the client.
Git, however, isn’t designed to work this way. Because of this, we recommend creating a separate Git repository for source content, or a separate repository for each type of content (for example, 3D artists may never need to sync the full audio source, and vice versa).

Creating content that works with version control and supports multiple users working in the same areas is a difficult task. However, Unity provides building blocks that, with careful thought and planning, can be used to create large-scale content without unresolvable merge conflicts or lost work.
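As a concrete illustration of the Smart Merge and Git LFS locking setup discussed above, the configuration might look roughly like the following. This is a hedged sketch: the file patterns are illustrative, and the UnityYAMLMerge path must be adjusted to your Editor installation.

```
# .gitattributes – route Unity YAML files through Smart Merge and
# mark them lockable so two people don't edit the same scene at once
# (lockable requires Git LFS to be initialized in the repository):
*.unity  merge=unityyamlmerge eol=lf lockable
*.prefab merge=unityyamlmerge eol=lf lockable
*.asset  merge=unityyamlmerge eol=lf lockable

# .git/config (or ~/.gitconfig) – register UnityYAMLMerge as the
# merge driver; adjust the path to your Unity Editor installation:
[merge "unityyamlmerge"]
    name = Unity SmartMerge
    driver = '/path/to/Unity/Editor/Data/Tools/UnityYAMLMerge' merge -p %O %B %A %A
```

With this in place, `git lfs lock Assets/Scenes/Main.unity` reserves a scene for one editor, and merges of YAML assets go through Smart Merge instead of plain text merging.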

>access_file_
754|blog.unity.com

Multiplayer Summer: Learn, build, and celebrate

It’s a summer of multiplayer games

Multiplayer games and development have never been hotter, with 62% of the creators we polled stating they’re working on a multiplayer game. We can’t think of any better way to spend July and August. Read on for updates on all-new learning materials, webinars, product news, and live Twitch streams. Let’s dive in.

Tune in for weekly upskilling and inspiration

Subscribe to our webinars: Register today to secure your spot in our free webinars and learn how to implement LiveOps to increase your player retention and how to use Unity 6 multiplayer features to speed up prototyping.

Build a retention layer for your multiplayer game (July 25, 9 am PST)
Accelerate your multiplayer prototyping with Unity 6 (August 1, 9 am PST)

Join us in the community and on Twitch: If you’re looking for something a bit more casual, join us on our Twitch channel. We’ll be hosting a LetsDev Multiplayer session on July 17, then kicking off a new series called NetCheck on July 30, where you can follow our teams converting last year’s Scope Check game into multiplayer.
This series will be running through August. Finally, on August 14, we’ll host a day-long Blitz Day of questions and answers on all our community channels with the multiplayer product teams who work across Unity Editor tools and supporting services.

Get started with all-new multiplayer tools

Multiplayer creation in Unity 6: If you haven’t yet checked out the Unity 6 Preview, there’s no better time: you’ll get access to the latest features to help with multiplayer game creation. This includes packages like Multiplayer Play Mode, which simulates multiple players within the Editor for testing; Multiplayer Tools, which now includes network scene visualization; and Dedicated Server, which helps you switch projects between server and client without needing to create a new project.

On top of that, there are a few packages in an earlier state. These include the Multiplayer Services package, a one-stop solution for adding multiplayer elements to your game that combines capabilities from multiple services into a single Sessions management system; Multiplayer Center, which provides you with recommendations on tools and services relevant to your project; and Distributed Authority for Netcode for GameObjects, which helps you achieve greater simulation complexity and run more players in a session without the complexity (or cost) of a dedicated server.

To help you onboard to Distributed Authority networking and understand how to use Netcode for GameObjects, download the all-new Asteroids Bitesize Sample. Additionally, we have a new tutorial that’ll help you create a simple Hello World multiplayer project that includes the basic features of Netcode for GameObjects. Be on the lookout for these packages moving to prerelease soon, and download the Unity 6 Preview today to get started.

Make your own multiplayer waves

We hope you’re as excited for Multiplayer Summer as we are. But if you just can’t wait, check out our roundup of multiplayer resources to get started.
Throughout Multiplayer Summer, we’ll also be releasing new best practices guides and how-tos. Today, you can check out a brand-new VR Multiplayer Template for the Meta Quest and other OpenXR devices. This template lets you create immersive multiplayer games without the hassle of building and maintaining foundational systems from scratch. It covers everything from networked interactions to integration of voice communications to creating networked avatars, and more. Download the VR Multiplayer Template for Unity 2022 LTS and Unity 6 Preview from the Unity Hub, and check out the quickstart guide for more information.

Finally, we invite you to read up on other creators to get inspiration, including Fika Productions, who launched their peer-to-peer game Ship of Fools, and StickyLock’s Histera, which leveraged Netcode for Entities as well as game server hosting, LiveOps services, and more.

Join us at Unite in Barcelona from September 18-20 to get the latest multiplayer news, level up your skills, and celebrate other multiplayer game creators.

>access_file_
755|blog.unity.com

New ways of applying global illumination to your worlds in Unity 6

We’re thrilled to share more details about the new lighting features coming to Unity 6 later this year. With the new, robust light baking architecture and an innovative approach to authoring light-probe-lit environments using Adaptive Probe Volumes (APV), you’ll enjoy a more streamlined light creation process. This will significantly enhance your visuals while ensuring high performance at runtime.

If you’ve worked with precomputed lighting data before, you’ll know how tedious the process can be. Precomputing lightmaps can take a long time: lightmap UVs need to be authored, probes need to be placed for dynamic objects to be lit correctly, and you’ll need to deal with large textures that can place a heavy burden on your application’s runtime memory.

In Unity 6 we’ve added a new way for you to author higher-quality, light-probe-lit environments through Adaptive Probe Volumes (APV), and delivered foundational improvements to the light baking backend for greater stability.

An Adaptive Probe Volume is a group of Light Probes that Unity places automatically, based on the geometry density in your Scene, to build baked indirect lighting. Due to this adaptive behavior, APV generates more densely placed probes in areas with more geometry, and fewer probes in areas with less densely placed objects, like the background of a Scene.

Adaptive Probe Volumes also provide a complete suite of powerful features for authoring beautifully lit environments:

- Simpler workflows for probe placement and faster iteration on light-probe-based indirect diffuse lighting.
- Per-pixel lighting that offers significantly higher quality than Light Probe Groups and better directionality than Lightmaps, resulting in excellent overall lighting quality.
- Seamless integration with atmospherics, making effects like Volumetric Fog in HDRP and VFX Graph particles in URP and HDRP beautifully lit by indirect lighting.
- Visually stunning lighting transitions through Sky Occlusion and Lighting Scenarios, suitable for time-of-day and lights-on/off situations.
- More control over runtime performance optimizations, based on your render pipeline and target hardware.
- A suite of streaming features, enabling light probe data to be streamed from disk to CPU, and from CPU to GPU.
- A powerful toolset for reducing light leaking.

The URP 3D Sample project currently uses the latest 2022 LTS features. For demonstration purposes in this blog post, we’ve upgraded the URP 3D Sample scenes from 2022 LTS to Unity 6 Preview and the Adaptive Probe Volumes feature.

APV is a volume-based system that automates probe placement rather than requiring you to place probes by hand. The general settings tab for APV lets you control parameters like Min and Max Probe Spacing to drive the creation of multiple subdivision levels based on the surrounding geometry. By default, dense areas use the highest resolution, while areas with less geometry use lower density levels. This automatic, adaptive behavior ensures efficient resource allocation, focusing probes where they are most needed.

To generate probes automatically, you can create an Adaptive Probe Volume. While you’re working, live updates let you preview probe placement without baking; these updates are based on bricks and the subdivision levels you previously defined, which adjust according to the proximity of nearby geometry. Generate Lighting then precomputes all lighting data, including light probes, which you can visualize in your scene. As previewed using bricks, you can see the various subdivision levels that were applied when placing probes.

If you have worked with light probe data, you may be aware of the common challenges with light leaking.
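To build intuition for why leaking happens, here is a toy Python sketch (illustrative only – not Unity code, and the probe values are invented): a point shaded inside a thin-walled structure naively blends bright exterior probes with dark interior ones, and discarding the invalid probes while renormalizing the remaining weights removes the leak.

```python
def sample_probes(weights, colors, valid=None):
    """Blend probe colors by interpolation weight.

    If a validity mask is given, invalid probes are dropped and the
    remaining weights are renormalized (a toy version of leak reduction).
    """
    if valid is not None:
        weights = [w if ok else 0.0 for w, ok in zip(weights, valid)]
        total = sum(weights)
        if total == 0.0:
            return 0.0  # no valid probe to sample from
        weights = [w / total for w in weights]
    return sum(w * c for w, c in zip(weights, colors))

# Two interior probes (dark, 0.1) and two exterior probes (bright, 1.0),
# all with equal interpolation weights for the shaded point:
colors  = [0.1, 0.1, 1.0, 1.0]
weights = [0.25, 0.25, 0.25, 0.25]

naive  = sample_probes(weights, colors)   # exterior light leaks inside
masked = sample_probes(weights, colors,   # only interior probes contribute
                       valid=[True, True, False, False])
```

Here `naive` blends all four probes into a mid-bright result, while `masked` returns the dark interior value only, which is the effect the validity and layer masking features aim for.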
When developing APV, we added a whole toolbox to help address light leaking issues: Virtual Offset, Dilation, Probe Adjustment Volumes, Rendering Layers, and the Light Leak Reduction Modes “Performance” and “Quality”.

Here’s an example. Using lighting debug views, we can observe a problematic case of light leaking: the bright light from the exterior is visible through the walls and the ground of the building, while the outside shows the opposite problem, with dark lighting leaking from the interior. This is likely due to the low resolution (1 meter between probes) and the thin walls. Let’s explore how we can address this.

To investigate the issue, the Debug Probe Sampling option displays each of the sampled probes along with their relative weights. In our case, we can see that the result is interpolated between the bright probes from the outside and the dark ones from the inside. Ideally, the interior of the tent should only sample the interior probes.

Rendering Layers for APV (landed in 6000.1f.1) allow you to create up to four different masks and restrict sampling to those specific masks for certain objects. This can be incredibly useful to prevent interior objects from sampling exterior probes, or vice versa. When generating lighting, the system automatically assigns layers to the probes during the bake, based on nearby objects, eliminating the need to assign layers per probe by hand. Once this is done, you can Generate Lighting and observe that leaking is reduced for the tent, thanks to the manually created separate interior and exterior masks.

For even more control over light leak prevention, you can leverage Unity’s Leak Reduction Modes “Performance” and “Quality”.

Performance Mode addresses leak reduction by shifting the sampling location away from invalid probes. This generally works well in straightforward scenarios, where a sampling location covering only valid probes can be identified while sidestepping any invalid ones. Depending on the probe configuration, however, such an optimal sampling location may not be available, resulting in sampling of invalid probes and potential leaks.

Quality Mode (landed in 6000.0.3f1), now enabled by default, employs up to three sampling attempts to help ensure that only valid probes are used. This mode may introduce a slight overhead on runtime performance, which can be especially noticeable on lower-end platforms.

You can combine Leak Reduction and Rendering Layers to prevent light leaking even further. This ensures that invalid probes – whether due to Validity issues or to being on a different Layer – are not sampled.

Additionally, we’ve improved the multiple subdivision levels by reducing potentially visible seams between different levels (landed in 6000.0.4f1). This is achieved automatically by replacing the values of frontier probes located between two levels with pre-interpolated values. Since this process occurs at bake time, there is no runtime performance cost.

With APV you can achieve visually stunning lighting transitions through Sky Occlusion and Lighting Scenarios, suitable for time-of-day and lights-on/off situations. Next you’ll find two examples of lighting transitions: first through Lighting Scenarios with APV in the URP 3D Sample project’s Oasis scene, then through Sky Occlusion with APV in the Garden scene.

APV facilitates various lighting scenarios by enabling switching or blending between sets of baked lighting data. This is particularly useful for simulating times of day or toggling lights on and off within the same scene or Baking Set. Note that Lighting Scenarios only manage baked APV light probe data; other elements need to be handled manually.
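Conceptually, blending between two Lighting Scenarios amounts to interpolating between two sets of baked probe data by a blend factor. A minimal Python sketch of that idea (illustrative values only – this is not the actual ProbeReferenceVolume API):

```python
def blend_scenarios(scenario_a, scenario_b, t):
    """Linearly blend two baked probe data sets.

    Toy model of blending between Lighting Scenarios:
    t = 0.0 gives scenario_a, t = 1.0 gives scenario_b.
    """
    return [a * (1.0 - t) + b * t for a, b in zip(scenario_a, scenario_b)]

day   = [1.0, 0.9, 0.8]   # bright baked probe values ("day" scenario)
night = [0.1, 0.1, 0.3]   # dark baked probe values ("night" scenario)

dusk = blend_scenarios(day, night, 0.5)  # halfway through the transition
```

Anything not covered by the baked probe data (sky, fog, reflection probes, direct lights) still has to be transitioned separately, as the Oasis scene example below does with a script.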
To provide an example, in the Oasis scene we created a script to update the sky, lights, fog parameters, and reflection probes. APV baked scenarios can be managed at runtime using the ProbeReferenceVolume API; an example can be found in the documentation.

Sky Occlusion offers an alternative to Lighting Scenarios for managing lighting transitions in a scene. It involves a simpler setup with just a single bake, with no need for multiple scenarios. Instead, Sky Occlusion exclusively handles sky lighting, and therefore does not extend to managing indirect lighting from directional or punctual lights.

Sky Occlusion uses additional baked data, separate from standard APV sky baking, to manage sky lighting differently. This data stores the amount of sky light each area of the scene should receive, allowing runtime adjustments to sky lighting color and intensity. By combining a dynamic ambient probe at runtime with this baked, static occlusion data, Unity provides a good approximation of sky lighting while enabling dynamic adjustments to scene lighting.

Sky Occlusion is supported in both URP and HDRP. In HDRP, the ambient probe is updated automatically from the HDRP Physical Sky. In URP, however, when using the Skybox mode, the ambient probe cannot update automatically in real time as the sky changes; instead, you need to manually animate the color using the Gradient or Color mode to match the animated sky visuals, as Unity won’t automatically adjust to the changing sky color.

Using the Garden scene as an example, the Gradient mode in the Environment settings enables manual animation of the ambient probe color. When paired with occlusion data, this setup can create a compelling approximation of animated sky diffuse lighting, suitable for depicting changing times of day. It uses a single bake without multiple lighting scenarios, and can provide a wide range of color variations.
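As a rough mental model of what Sky Occlusion stores, here is a Python sketch (illustrative, not Unity’s implementation): a static, baked per-area occlusion factor is multiplied at runtime by a dynamic ambient probe color, so a single bake yields different scene lighting as the sky changes.

```python
def sky_lighting(occlusion, ambient):
    """Toy model of Sky Occlusion: per-area baked occlusion (how much sky
    each area sees; static) times a dynamic ambient probe color (RGB)."""
    return [[o * c for c in ambient] for o in occlusion]

# Baked once: an open courtyard sees the full sky, a tent interior very little.
occlusion = [1.0, 0.2]

noon   = sky_lighting(occlusion, [1.0, 1.0, 0.9])  # bright, slightly warm sky
sunset = sky_lighting(occlusion, [0.9, 0.4, 0.2])  # same bake, redder ambient
```

The occlusion factors never change after the bake; only the ambient color is animated, which is why URP’s Gradient or Color mode must be driven manually to match an animated skybox.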
Find out more in our documentation on the APV implementation in the Universal Render Pipeline (URP) and in the High Definition Render Pipeline (HDRP).

With Unity’s new light baking architecture delivered in Unity 6, the GPU Light Baker is now out of preview. The new light baker is built with Editor responsiveness and baking speed in mind. When using on-demand baking, Unity now takes a “snapshot” of the Scene state when the Generate button is clicked; Unity no longer checks the Scene state every frame, which previously undermined Editor performance. This redesigned baking backend has significantly simplified our code base, making it easier and faster to fix bugs and lowering the risk of introducing new ones.

We’re also providing a new baking profile that lets you choose your workflow intent. It ranges from “lowest memory usage” – ideal if you want to keep working in the Editor with the best overall responsiveness – to “highest performance” – useful if you want the bake done as soon as possible and don’t need to work on anything else in the Editor during baking.

Iteratively authoring and troubleshooting baked lighting data is an important use case for creators using baked global illumination (GI). For this reason, we have added new interactive preview functionality to various GI-related Scene view draw modes, replacing Unity’s Auto Generate with a dedicated Interactive GI Debug Preview mode for previewing lighting data. This allows debug views to update interactively as the Scene is modified. The preview is non-destructive, since it does not replace baked lighting data. Moving away from Unity’s old Auto Generate architecture means that we can optimize the baking pipeline for greater stability.

Note that Unity 6 is the last supported release for Enlighten Realtime GI. You can find more details on our previously communicated deprecation path in the Update on Global Illumination 2021 forum post. Here are links to previous requests for feedback, for the 2023.1, 2023.2, and Unity 6 (2023.3) Beta releases respectively:

- Global Illumination changes with the 2023.1 beta release
- Global Illumination changes with the 2023.2 beta release
- Global Illumination changes with the Unity 6 (2023.3) Beta release

We look forward to seeing your creations leveraging the new lighting features delivered in Unity 6. Please send us feedback in the Global Illumination forum or Unity’s new Discussions space!

>access_file_
757|blog.unity.com

New Shader Graph Production Ready Shaders in Unity 6

The Shader Graph team is excited to announce the release of our newest set of samples, available to import now in 2022 LTS and the upcoming Unity 6 release. This set contains more than 25 Shader Graph assets and dozens of subgraphs that are ready to be used directly in your projects. The sample shaders work in both HDRP and URP.

We have two main objectives with this sample set:

- Give users a jump start in shader creation by providing a set of shaders that are ready to use.
- Provide examples that users can build on or modify to suit their needs.

This sample set will help you achieve the shader results you want more quickly, without starting from scratch. We also include a step-by-step tutorial that shows how to combine the assets to create realistic environments, so you can see how the shaders work together in context. Here’s a breakdown of the content available in the Production Ready Shaders pack.

Both URP and HDRP come with code-based shaders, and the most commonly used shader for each of the SRPs is called Lit. For projects that use it, it’s often applied to just about every mesh in the game. Both the HDRP and URP versions of the Lit shader are full-featured. However, sometimes users want to add features to achieve a specific look, or remove unused features to optimize performance; for users who aren’t familiar with shader code, this can be very difficult. For that reason, we’ve included Shader Graph versions of the Lit shader for both URP and HDRP in this sample pack. Users can make a copy of the appropriate Shader Graph Lit shader, then change any material that currently references the code version of the Lit shader to the Shader Graph version. All material settings will be correctly applied and continue to work, and users can then modify the Shader Graph version as needed.

Decals allow you to apply local material modifications to specific locations in your scene. Think of things like applying graffiti tags to a wall or scattering fallen leaves below a tree. But decals can be used for a lot more: in these examples, we see decals making things look wet, making surfaces appear to have flowing water across them, projecting water caustics, and blending specific materials onto other objects.

This set of shaders is made for meshes applied to terrain, such as grass, weeds, undergrowth, and pebbles. (To learn more, read the terrain documentation on details.) Detail meshes have some specific requirements for shaders. First, because of the high number of these meshes used on the terrain, their shaders must be as fast and efficient as possible; that mainly means keeping the number of texture samples low and doing more work in the vertex shader instead of the pixel shader. Second, because these meshes stop rendering and pop out at a specific distance, we use a method to dissolve them, preventing a harsh pop and making their removal less obvious. In each shader, you’ll see a Distance Mask used to dissolve the mesh at a distance before it stops rendering.

This is a full-featured, modular rock shader that can be used for everything from small pebbles, to boulders, to large cliff faces. It has features that can be turned on and off depending on the application. Each is encapsulated in a subgraph, so it’s easy to remove unneeded features, and you can also add new features in the chain of modules.

The sample set comes with four different water shaders: lake water, animated pond water, stream water, and stream waterfall. Each one uses reflection, refraction, surface ripples using scrolling normal maps, and depth fog, and each provides additional features unique to its water type.

This sample comes with a full set of weather-related subgraphs (rain and snow) that can be mixed and matched depending on the requirements of the object type.
Rain effects include rain drops on top of objects, rain drips trickling down the sides, and puddles that can dynamically accumulate on flat surfaces, including rain and wind ripples.

The sample set also includes a step-by-step tutorial showing how to combine the water shaders, decals, rocks, and terrain detail meshes, along with several other Unity features, to create a forest stream environment. The tutorial shows how everything is put together and how the sample pack assets can be used together to create an environment.

Install the new sample assets using the Package Manager:

1. In the Editor, open the Package Manager.
2. In the Package Manager window, select the Shader Graph package.
3. Select the Samples tab.
4. Finally, select the Production Ready Import button to bring the new Production Ready sample set into your project.

With these steps completed, the sample assets will show up in your project under Assets/Samples/Shader Graph//Production Ready Shaders.

After importing the samples, get started by opening the scene that corresponds to the render pipeline you’re using – High Definition Render Pipeline (HDRP) or Universal Render Pipeline (URP):

Assets/Samples/Shader Graph//Production Ready Shaders/Scenes/URPProductionReadyShaders
Assets/Samples/Shader Graph//Production Ready Shaders/Scenes/HDRPProductionReadyShaders

Once the scene is open, select the Shader Graph Feature Samples Showcase asset at the top of the Hierarchy panel, then follow the guided tour in the Inspector. You can use the Samples dropdown box to select a sample and jump to that location in the scene.

Unity continues to add more samples to Shader Graph, with several more sample packs coming in the months ahead. These will help you learn Shader Graph more quickly, understand how to set up specific functionality, and create new shaders faster with premade subgraphs and templates.

We hope you’ll enjoy using them.

More resources:

- Shader Graph basics
- Shader Graph documentation
- Unity Learn tutorials

This is a deep and rich sample set. We hope you have fun exploring it and use it to speed up your own shader creation process. We’d love to hear your thoughts and impressions on these samples – tell us what you think in the Shader Graph forum.

>access_file_
759|blog.unity.com

Games Made with Unity: June 2024 in review

Well, that was a wild month. From announcements to demos to new releases, there’s much to dive into for games made with Unity in June.

Non-E3

June is always one of the biggest times in gaming, with plenty of non-E3/Summer Game Fest showcases, #pitchYaGame, and other shows like Games to Get Excited About Fest or Best Indie Games’ Future Game Show. Sony’s State of Play showcased a new trailer for Projekt Z: Beyond Order to kick off a slew of announcements we’re excited about, with some of the other shows hyping up games like Abyss X Zero, Screenbound, Phoenix Springs, Project DOSA, Building Relationships, and a release date reveal for Anger Foot in Devolver Direct. Toward the end of the month, Morbid Metal and FEROCIOUS debuted new trailers in the IGN “Games Baked In Germany” show.

Steam Next Fest

Steam Next Fest ran from June 10–17, with many of your games featured in the top and trending lists. Some of our favorites included Blue Prince, The Explorator, ASKA, REKA, SWORN, Metal Slug Tactics, Tactical Breach Wizards, Mirthwood, Aloft, and Fata Deum. A personal highlight was watching Xalavier Nelson Jr. and Strange Scaffold’s new speedrun, secret agent, gun-fu(?) FPS game – I Am Your Beast – pop off during Next Fest as people on Twitter competed for the lowest possible time.

However, this list is about games released this month – and even with a stacked wishlist, there are plenty of great ones you’ve put out into the world that anyone can play right now. Speaking of wishlists, we’re pleased to announce a new official Unity Steam Curator page put out by our Made with Unity team. Follow this list for a running catalog of great titles you’ve made with the Unity engine, including some from Next Fest or the various showcases.

Working on a game in Unity? We’d love to help you spread the word. Be sure to submit your project.

Without further ado, to the best of our abilities, here’s a non-exhaustive list of games made with Unity and launched in June 2024, either into early access or full release. Add to the list by sharing any that you think we missed.

Action and Comedy
- #BLUD, Exit 73 Studios (June 18)
- Fireside, Emergo Entertainment (June 4)
- Unlanded, Eki-Eki-Eki (June 7)
- Perfect World, Michael Overton Brown (June 13)
- They Can Fart, Les Crafteurs (June 18)
- Astrodle, Robin Nicolet (June 19)
- Frogun Encore, Molegato (June 25)

City Builder
- Dystopika, Voids Within (June 21)
- El Dorado: The Golden City Builder, Hobo Bunch, Gameparic (June 17)
- Go-Go Town!, Prideful Sloth (June 18 – early access)
- Nekokami - The Human Restoration Project, Rocket-in-Bottle (June 25 – early access)

FPS
- Fallen Aces, Trey Powell, Jason Bond (June 14 – early access)
- Histera, StickyLock Games (June 20 – early access)

Narrative and mystery
- Tavern Talk, Gentle Troll Entertainment (June 20)
- Ghost Boy, Two Blackbirds (June 25)

Roguelike/lite
- GUNCHO, Arnold Rauers, Terri Vellmann, Sam Webster (June 25)
- Rune Gate, Devwind (June 6)
- Dragon is Dead, TeamSuneat (June 7 – early access)
- Into the Emberlands, Tiny Roar (June 19 – early access)
- Dice & Fold, Tinymice Entertainment (June 24)
- Dragon Eclipse, Fardust (June 24 – early access)

Simulation
- Sandwalkers, Goblinz Studio (June 19 – early access)
- Rolling Hills: Make Sushi, Make Friends, Catch & Release, LLC (June 4)
- Everafter Falls, SquareHusky (June 20)

Strategy
- Crab God, Chaos Theory Games (June 20)
- clickyland, Sokpop Collective (June 3)
- Songs of Silence, Chimera Entertainment (June 4 – early access)
- Emberward, ReficGames (June 25 – early access)

Survival
- ASKA, Sand Sailor Studio (June 20 – early access)

That’s a wrap for June 2024. Want more Made with Unity and community news as it happens? Don’t forget to follow us on social media: Twitter, Facebook, LinkedIn, Instagram, YouTube, or Twitch.

>access_file_
760|blog.unity.com

New technical e-book for worldbuilding in XR with Unity available for free

Embark on journeys through immersive virtual realms, teleport between dimensions, or merge digital marvels with the real world – the possibilities of virtual reality (VR) and mixed reality (MR) invite creators to bring their imagination to life.

Our latest comprehensive guide helps both aspiring creators and seasoned developers understand the intricacies of building VR and MR experiences (collectively referred to as “XR”) using Unity. Written by seasoned 3D artists and Unity developers, this e-book covers the tools, methodologies, and techniques essential for crafting immersive and interactive realities. From constructing environments to implementing intuitive interactions, you’ll get the tips and guidance you need to bring your VR and MR applications to life.

Let’s have a look at some of the topics that you’ll find in this guide.

1. Worldbuilding in XR

Worldbuilding is not just about creating a visually stunning environment. It’s about crafting an immersive world that tells a story, conveys emotion, and captivates users. This section looks at the building blocks for worldbuilding in XR, such as:

- Designing with purpose: Every element of your story should serve a purpose. Avoid elements that could clutter your environment.
- Implementing visual and audio cues: Use these cues to guide users through the experience.
- Adding interactive elements: Incorporate elements that reveal more about the world your users are in.
- Immersing through detail: Pay attention to detail in the environment to create a cohesive and consistent experience.
- Encouraging exploration: Reward users for exploring the world.
- Adaptive difficulty: Adjust challenges based on user performance.
- Social interaction: If possible, integrate multiplayer elements where users can collaborate to solve, for example, a puzzle together.
- Iterative design: Use feedback to iteratively refine your world.

2. Asset creation for XR

Whether you’re designing small props or main characters, follow the tips in this chapter about creating assets for XR to save yourself time later in the development cycle. These tips cover:

- Concept and design
- 3D modeling
- Creating a pivot point
- Texturing and trim sheets
- Rigging and animation
- Model optimization and exporting

3. Creating a new VR project in Unity

This section presents some of the main steps in developing a VR project in Unity. It covers the decision-making driving each development phase, as well as practical steps and workflows in Unity. We look at the rendering pipelines and the XR tech stack, with the XR Interaction Toolkit, AR Foundation, OpenXR, platform features, and sample projects. If you also want to follow the process of starting a new project step by step and running it on Meta Quest hardware, you can watch this video.

4. Mixed reality and spatial computing applications

Mixed reality (MR) applications integrate the real world and a virtual one in a hybrid landscape where the blending of these elements should feel seamless to the user. MR development combines aspects of VR and AR worldbuilding, like:

- User interaction and interface design
- Spatial awareness and physics
- Cross-platform development strategies
- Environmental design and immersion

This section also includes a breakdown of setting up the MR template, which is designed to serve as a starting point for MR development in Unity.

5. Spatial computing with Apple Vision Pro

In this section of the guide, we cover how to get started with visionOS in Unity, how interactivity works in Unity projects for Apple Vision Pro, and tips and techniques for porting immersive VR projects and MR apps to visionOS. In the e-book you’ll also learn about the different XR experiences possible on Apple Vision Pro.

More resources

Download the XR guide today, and find all of our e-books for artists, technical artists, and designers in the Unity best practices hub or the Unity Manual. You can connect with the Unity community to get insights and tips on XR development in Unity’s Discussions space for visionOS development. For information on upcoming features, see our roadmap.

>access_file_