// transmission.log

Data Feed

> Intercepted signals from across the network — tech, engineering, and dispatches from the void.

1695 transmissions indexed — page 54 of 85

[ 2023 ]

4 entries
1061|blog.unity.com

On the road: My year of conferences on Unity’s Diversity Recruiting team

My name is Kaylynn, and I joined Unity as a member of the Diversity Recruiting team in 2022 as a project coordinator. My first priority was to head our conference engagements for the year. While I was initially nervous to lead such an important project, I am very happy to have done so alongside an amazing team, and am excited to share a behind-the-scenes recap of our recruiting travels.

To kick off our first in-person conference of the year, members of the University Recruiting team and I headed to Washington, D.C., in September for the Center for Minorities and People with Disabilities in Information Technology’s Richard Tapia Celebration of Diversity in Computing Conference. The Tapia Conference was a great experience and held true to its 2022 theme, “A Time to Celebrate!” Attendees included undergraduate and graduate students, faculty, and professionals in computing from all backgrounds and ethnicities. This was my first in-person recruiting event coming out of the pandemic and my first with Unity overall. I can still remember experiencing jitters before we opened our booth on the first day, but the University team brought me up to speed fast, which boosted my confidence in speaking to students about our internship programs.

Looking back at my time at Tapia, my favorite memories come from the engagement opportunities – connecting with others over common backgrounds, ethnicities, disabilities, and genders. Our team did a great job of fostering relationships with candidates that extended beyond the conference itself.

At the end of September, I joined several other Unity representatives in Orlando, Florida, for the Grace Hopper Celebration (GHC). While I have been to many recruiting events before, I had never been to a conference as big as this one.
In 2021, GHC virtually hosted 29,000 people, and it holds the record for the world’s largest gathering of women in computing.

During our time at the conference, our recruiters had a lot of early mornings and late nights, but it was always fun. We usually started our day at 8:00 am sharp, when we could be spotted running to the shuttle stop to make it to the convention center on time.

During the day, attendees could find the smiling and welcoming faces of our team at the booth, talking to potential candidates about the business and open opportunities. The career fair closed each evening at 5:00 pm; however, because we had so many candidates who wanted to speak with us, we would usually stay for an extra 30–45 minutes. Other conference events included onsite internship interviews led by our University team or volunteers from across Unity. Having volunteers available allowed us to carry out all interviews and, later, make some internship offers.

Although GHC was full of long days and nights, it was one of my favorite conference engagements because of the talent and drive evident in the room. It was rewarding to connect with so many candidates who were genuinely excited to talk about Unity and eager to show us the projects they have built or are working on with our software.

A second reason I enjoyed GHC was that it was my first chance to meet all my wonderful teammates in person. Since we primarily meet and chat virtually these days, it was nice to see each other face-to-face. The conference gave us the chance to spend five days working together and getting to know one another.

In November, my team, our inclusion partners, and members of ComUnidad – the Latinx Employee Resource Group (ERG) – packed our bags and headed to Charlotte, North Carolina, for the annual get-together of the Society of Hispanic Professional Engineers (SHPE).
At the SHPE 2022 National Convention, we met hundreds of candidates from the Latinx community and built relationships with these prospective candidates for upcoming programs. With more than 13,000 students and professionals attending the event, we spoke to many aspiring engineers from different specialties. At the time, I was unaware that there were so many different types of engineering. Beyond meeting so many talented engineers, networking, and giving out cool swag at the booth, our team also participated in onsite interviews, which – much like at the other events – allowed us to speak to many candidates and, ultimately, make some internship offers.

During my time at the booth, candidates shared their interests, projects, and career goals, and it was exciting to hear stories about how their persistence and hard work got them to where they are today. Seeing undergraduate and graduate students juggle finding internships at conferences on top of their busy school and home lives made me feel very proud of all those attending.

A week after we wrapped up at SHPE, we headed to Austin, Texas, for Blavity Inc.’s AFROTECH Conference. AFROTECH is known for being the place for all things Black in tech and Web3, and it was quite different from our other in-person engagements in 2022: we were not only there to recruit, but also participated in two additional conference activations – hosting a learning lab and an event in our Austin office.

On Monday, we started with a career fair, with several team members from Recruiting and B-United – the Black ERG – helping out at the booth. We spent three days speaking to internship candidates as well as advanced professionals and tech entrepreneurs.

On the second day, Unity hosted a learning lab that showcased our real-time 3D capabilities. As part of the lab, an expert panel featured Unity’s Raymond Graham, Krystal Cooper, and Nick Straughn.
Over 300 attendees packed the workshop to engage with our experts and learn more about the panelists’ respective career journeys.

On the final day, we wrapped up our AFROTECH activations with an in-office Unity @ AFROTECH event. Unity for Humanity grantee Black Terminus AR joined us, and its founder Damien McDuffie wowed the crowd with an interactive VR exhibit. The in-office event was one of my favorite parts of this conference because it gave such a personal touch to the week. As a company, we were able to connect with candidates and creators on a more intimate level in our cool workspace.

Throughout 2022, our team attended and represented Unity at seven virtual or in-person diversity recruiting engagements. In addition to the conferences I shared above, we participated in the National Society of Black Engineers’ annual convention, Latinx in Gaming’s CONEXION, and three QueerTech activations.

During my time as a coordinator on some of these projects, I traveled to places I’d never been before, worked with amazing team members, and met candidates with great passion for technology and Unity. While our diversity recruiting approach at Unity includes many additional strategies to build relationships with candidates traditionally underrepresented in tech, I really enjoyed being able to create lasting relationships with candidates face-to-face during my first year on the team.

As we know, it is hard to be what you can’t see. At Unity, this means that showing up to events like these, with representation as a focus, helps us connect with candidates we might never have had the chance to get to know.
It was an honor to be a part of these efforts in 2022, and, from a diversity, equity, and inclusion (DEI) perspective, it is great to know that, with each conference we attend and each person we meet, we inch one step closer to connecting underrepresented communities to the world of tech and to Unity.

To learn more about Unity’s DEI strategies and initiatives, visit our Inclusion and Diversity page. To explore open roles, check out the Unity Careers site.

>access_file_
1063|blog.unity.com

The what, why, and how of using nano influencers for performance marketing

There’s a common misconception that influencer marketing works best for brand awareness campaigns - but according to Ryan Silberman, CEO at Webfluential, it’s increasingly being used to drive performance. At Appfest 2022, Ryan explained how you can leverage influencer marketing to increase installs. Read the summary or watch the video below.

The original ad recipe

According to Ryan, creating engaging ads has historically been built on two basic ingredients: context and personalization. Advertisers use what they learn in different environments (context clues) to offer relevant information to consumers (personalized content). For example, let’s say a woman comes into a store soaked from the rain. When the storekeeper offers her an umbrella, he’s using context clues (she is soaked from the rain) to offer her a relevant item.

Leveraging trust: Nano influencers

However, even when you build context and personalization into your ads, there’s a key ingredient still missing from the mix: trust. By incorporating trust into your ads, you can generate higher consumer confidence in your product - because when we trust someone, we’re more likely to believe what they say.

Consumers are more likely to trust people who feel accessible and relatable. For example, someone touring a new city likely trusts a local’s recommendation over that of a celebrity who is getting paid to make those recommendations. Even though the celebrity is better known and has a higher reach, the local seems more credible. Another word for a person with a smaller reach but a higher engagement rate? Introducing nano influencers.

Nano influencers are influencers who have fewer than 3,000 followers.
The graph below illustrates why leveraging nano influencers shows huge promise: the smaller someone’s audience size, the higher their engagement rate.

Improving performance at scale

Engagement rate is one thing, but can it translate to actual transactions like sales and installs? Looking to garner more sales, Elysium Health, an American health company that sells vitamins, partnered with nano influencers who were passionate about cycling to drive traffic that resulted in sales. They used that content at scale by running ads on Facebook and TikTok, and managed to improve their CPA by 2x.

Beyond sales, nano influencers can also improve sentiment towards your app. That’s because you’re creating an authentic narrative with your audience while also showing your product in a positive light. In fact, Chef, a restaurant tycoon game, teamed up with nano influencers to create sponsored content that showed a 10x uplift in sentiment.

Referrals with results

In addition to traditional social channels like Instagram and TikTok, here’s another way nano influencers can share your ads - dark social. Dark social includes more intimate channels like WhatsApp or email, in which nano influencers can simply copy and paste a link and share it directly with friends - producing extremely high conversion and engagement rates. In fact, Ryan says 84% of sharing happens in dark social. So while it’s not tracked or talked about much, it’s a huge opportunity to leverage with nano influencers.

Tips for making influencers a performance channel

Ryan gives us the 4 key pillars that make a successful influencer program.

1. Technology platform

Having the right platform saves time and money when recruiting and contracting the right influencers. An added benefit is that it also manages your workflow.

2. Inventory

Some people may think you need a lot of influencers on standby, but according to Ryan, that’s actually not the case.
What’s important is that your influencers are opt-in, meaning that they’ve linked their social channels, have agreed on contractual terms, and are fully ready to work - getting you a faster campaign turnaround.

3. Strategy (on demand)

You’ve got your technology and the right influencers, and now it’s time to decide what the influencers should do. Ryan’s suggestion? Bring your influencers into your world and ask them what the strategy should be, because they know their audience best.

4. Execution + paid

Even if you hit the other 3 pillars perfectly, Ryan says this is the most important pillar for ensuring success. If you can’t execute at scale, the rest means nothing - organic reach alone isn’t big enough. You need to integrate with ad networks and social platforms to scale up.

Wrapping up

If there’s one thing to remember about influencer marketing, Ryan encourages you to remember these key points:

- Apply trust, context, and personalization to engage users
- Use low-cost nano influencers as opposed to high-cost celebrities or branded content
- Figure out how to leverage dark social and use your ad networks through paid
- Retention is built into the community - use this to your advantage

Watch the recording here: https://youtu.be/-lN_RX4gj8c

>access_file_

[ 2022 ]

16 entries
1065|blog.unity.com

2022 on-device mobile advertising overview and 2023 trends

With new app sectors, device models, and marketing techniques shaping 2022, it’s crucial to stay up to date on the next big opportunities in on-device advertising. As 2023 begins, let’s explore what worked this past year and what will continue to be important.

The smartphone industry in numbers

Let’s first look at the state of the mobile ecosystem in 2022:

- Worldwide smartphone shipments in 2022 were around 1.2 billion (1)
- Samsung held 22% of the market share in 2022, maintaining the No. 1 position (1)
- Total shipments of 5G-enabled smartphones were expected to reach 650 million units by the end of 2022 (2)
- Android maintained its position as the leading mobile operating system worldwide in the fourth quarter of 2022 with close to a 71.8% share (3)

Trends in the market

The number of devices sold and the technology they’re equipped with have a huge impact on what is possible with on-device advertising.

Incorporating CTV into your on-device advertising strategy

CTV (connected TV) advertising is expected to receive huge advertising investments in 2023, growing at a rate of 27% according to MarketingDive, making it a great time to consider how you can incorporate CTV advertising into your on-device advertising strategy.

First, with a seemingly unlimited number of streaming services available today, it’s becoming harder for users to keep up with paid subscriptions. To create a more sustainable revenue stream, many platforms are working on offering less expensive plans with an ad-supported option.

Second, as it stands, 95% of TV ad spend comes from Fortune 500 companies focused on branding, but these companies only represent 33% of US business revenue. With TV’s reach bigger than that of any social network, the channel needs to be accessible to all kinds of advertisers.
CTV advertising, a performance marketing channel (a household’s WiFi network presents a single IP address, which allows advertisers to link every device in that household), makes that possible, all while delivering 900K impressions per second in the US alone.

On top of that, CTV can work in conjunction with your existing on-device advertising strategy, once the app is already on the phone, to increase engagement and retention. For example, CTV advertising can be used like a push notification to encourage users to open your app for the first time or to recommend a new way to engage with the app’s features. As the technology continues to evolve, more opportunities will unfold, such as opening the app directly from the CTV ad to start interacting immediately. With CTV advertising, your on-device advertising efforts could have an even greater impact.

Listen to our podcast with Vibe, an all-in-one TV ad platform.

Using AI to create more efficient on-device advertising experiences

In the current macroeconomic climate, it’s even more critical to find ways to stand out on channels you’re already using, like on-device advertising. How? By utilizing innovative technology such as AI to make your efficient channels even more efficient.

AI can create better advertisements by writing headlines and generating creative - you can use generative AI to create text and images for your on-device advertising notifications and full-screen offer campaigns. AI can also enable predictive analytics - discovering insights, making predictions, and unifying your data. When it comes to on-device advertising, these tools can help you determine which audiences you should be reaching and how those audiences are engaging on their devices.

Most interesting, however, is the opportunity to improve interactivity with users through creatives. For instance, a full-screen on-device ad for an eCommerce app could allow users to try on various items to see which looks best.
A full-screen ad for a food delivery app could allow you to take a virtual tour from a nearby restaurant to your home to get a feel for how fast delivery can be. Today, AI is well equipped to help you drive productivity and creativity across your media mix.

Growing smartphone brands

Samsung

As of January 2023, Samsung has a market cap of $335.28 billion according to Companies Market Cap. On top of that, Samsung’s brand value was estimated at $87.7 billion in 2022, 17% growth compared to $74.6 billion in 2021, ranking the company 5th for three years in a row on Interbrand’s Best Global Brands of 2022.

Samsung’s Galaxy Z series, home to its foldable phones, leads the foldable market. In fact, from January to October 2022, the number of foldable smartphones Samsung contracted to enterprise customers increased by 105% compared to the same period in 2021, according to Samsung. At CES 2023, Samsung announced a new technology for foldable phones, Flex Hybrid.

Samsung also recently unveiled the Galaxy S23 series. The main upgrade on these devices is the 200-megapixel main camera, along with the use of the Snapdragon 8 Gen 2 chip, which will allow for better AI performance and power efficiency.

On top of that, while many large chip manufacturers have begun scaling back their chip production due to external concerns, Samsung will increase chip production in 2023, especially at its largest semiconductor plant in Pyeongtaek. This prepares the company to take a big portion of market share as demand for chips returns to normal levels.

Learn more about Samsung in 2023.

Google

With the release of Google’s revolutionary Pixel 6, housing Google’s first Tensor chip, at the latter end of 2021, Google saw massive sales growth in the first half of 2022. According to Canalys, Google sales in North America were up 380% in Q1 2022 year over year and 280% in Q2 2022 year over year.
For reference, leaders in market share only saw 1-4% annual growth.

In 2022, Google iterated on the Pixel 6 with the Pixel 7, which is powered by the next-generation Google Tensor G2 processor. The Pixel 7 is “sleek, sophisticated and durable,” and the regular size is more compact than the Pixel 6. The Pixel 7 also ships with Android 13, the best Android experience yet.

So, what’s expected in 2023? According to rumors, it might be the best year for Pixels yet, marked by the company carving its own way rather than following others. One new innovation in 2023 is the Pixel Fold, otherwise known as the Pixel Notepad. With the release of this device, Google will have beaten Apple to the foldable space. Google also plans to release the Pixel Tablet, which means it is beginning to challenge the Apple-dominated tablet market. Along with new innovations that push the envelope, Google will also improve on its already high-performing devices. We can expect Google to release the Pixel 7a, Pixel 8, and Pixel 8 Pro throughout 2023 - phones that will continue to appeal to the masses and keep Google in the spotlight.

Top 5 app categories using on-device advertising in 2023

It’s also important to have an idea of which app categories are on the rise, so you can learn how and why advertisers are finding success through this channel.

1. Brands with apps are investing in on-device advertising

As marketing teams face budget cuts, brands are rethinking their spend and focusing on channels where consumers are most active - mobile. In fact, according to Oberlo, mobile advertising spend in the US in 2023 is expected to reach $355.1 billion, surpassing $300 billion for the first time.
Ultimately, now that mobile has a larger audience, higher usage levels, and a more reputable standing, big brands are beginning to look beyond traditional channels, to on-device advertising, to get in front of new, quality users through their phone screens.

McDonald’s Spain achieves 64% app launch rate and drives over 34K installs with Aura

2. Food and rapid delivery apps are continuing to grow

COVID-19 accelerated the shift to ordering in from delivery apps rather than eating out. Since then, many other factors have continued to drive the food delivery app industry forward - convenience, increased demand, safety, and more. In fact, the entire food delivery app industry is expected to reach a $320 billion market size by 2029 according to Business of Apps. Ultimately, with more delivery riders and route optimization technologies, delivery is faster and cheaper than ever, driving incremental growth despite diminishing COVID-19 protocols.

Glovo increased Aura installs by 5x and decreased CAC by 60%

Rapid delivery apps, or apps that deliver groceries within 10-30 minutes of an order, are also attracting millions of users and big-name investors. Rapid delivery apps largely operate out of major cities across the UK, US, and Europe, with a few first movers leading the market in these areas - Getir, Flink, Gorillas, Gopuff, Weezy, Dija, Jiffy, Fancy, and Snappy are all notable names.

Read more about rapid delivery apps

3. Classified apps are allowing users to buy and sell wherever they are

As the economy faces uncertainty, people are spending time sifting through their belongings and making decisions about what to keep and sell. Classified platforms, especially on mobile, now represent a major opportunity for growth.

eBay Kleinanzeigen (Classifieds) boosts installs with Aura
Subito looked to Aura to boost engagement 4x, grow scale, and diversify their user acquisition strategy
Milanuncios reduces cost per lead 47% with Aura

4. News apps are users’ go-to for staying updated on current events

The share of US consumers reading online news on a smartphone more than doubled between 2013 and 2022 according to Statista. “News on the go” through mobile devices has become the norm. This means that news apps are no longer an add-on to people’s typical news source, but the main character - encouraging news apps to focus on their app discovery and user acquisition strategies. We’ve seen a similar trend in Europe as well over the last year.

SmartNews drives 35 million installs and meets retention goals with Aura
How Le Figaro reduced costs 70% and reached retention goals in 3 months with Aura

5. eCommerce apps are making it easier to shop from your phone

Similarly to news apps, eCommerce apps have also seen immense growth, creating a more competitive environment - more than half of all internet traffic comes from a mobile device. This is most likely due to the effects of COVID-19, which turned people away from brick and mortar. In 2023, smartphone retail eCommerce sales are expected to pass $432 billion, up from $148 billion in 2018 according to Statista.

OTTO sees over 100% MoM growth and significantly increased ROI for their ironSource Aura volume
AliExpress Russia drives over 400K installs and exceeds their KPI goals with ironSource Aura
Ozon drives over 900K installs and exceeds KPI goals with ironSource Aura

Overall, the state of the smartphone industry, key trends in the Android market, and the growing app categories have all had an impact on the success of ironSource Aura. With on-device advertising becoming more prevalent, Aura saw a record year in 2022.

The growth of Aura

In 2022, the team at ironSource Aura focused on expanding the on-device channel’s available touchpoints so advertisers like you can continue to drive value and meet valuable users at scale.
New placements - such as the Discovery Widget, which offers new apps to users directly from their device’s +1 screen on a daily basis - use contextual information to reach users in the right ways and at the best times for maximum, long-term engagement.

ironSource Aura also expanded to new carrier and telco partners in 2022 beyond our current partnerships with Samsung, Vodafone, Orange, and Boost - announcing our Samsung partnership in MENA.

It’s been an incredibly successful year thanks to both our advertiser and telco partners. Looking forward to continued success in 2023!

Full list of Aura’s on-device advertising content from 2022

Case Studies
- Fugo
- Very/Dentsu
- Inspired Square
- Dish Retail Wireless engages its 8M customers with Aura

Blogs
- Launching a new report "5G: The consumer perspective"
- 4 ways advertisers can optimize their 5G on-device advertising strategy
- The mobile advertising ecosystem in Spain
- How agencies can drive incremental growth through mobile on-device campaigns
- How to leverage on-device advertising to create a frictionless install experience
- 7 questions to ask yourself when evaluating a new user acquisition channel
- The first 30 days of an on-device campaign
- Looking to boost your game’s LTV? Try an on-device advertising campaign
- No guts, no glory: MAD//Fest 2022 with ironSource Aura
- The 3 most important KPIs running an on-device acquisition campaign
- An inside look at the Japanese mobile economy
- The what, why and how of web-to-app acquisition campaigns

Articles
- Hospitality Tech | Four Ways Hospitality Marketers Can Innovate in the Post-Pandemic Future via a Mobile App
- RCR Wireless News | The new device experience: How mobile carriers successfully ‘get in’ on the app economy
- TotalRetail | How to Leverage Your App to Get More Customers

Podcasts
- Adam Hadi, Current | How to deeply integrate influencer marketing into your product
- Jean-François Grang, Purchasely | How to succeed with a subscription monetization model for your app
- Dave Edwards, Audiomack | Bridging the knowledge gap between music creators and the app economy
- Andre Kempe, Admiral Media | Back to the basics: How will performance marketing survive?
- Tobias Boerner, Fastic | The real people behind your app: community-based apps and the growth loop
- Peter Fodor, AppAgent | Storytelling and mobile apps: a tale as old as time
- Karan Bhavnani, Tripledot | Growth trends: team structure and experimentation in a high growth environment
- Greg Turtle, What3Words | Gaining a growth edge with media for equity deals
- Adrienne Rice & Sarah Chafer, M&C Saatchi Performance | What consumers want from brands this holiday season
- Thomas Petit | How to optimize your subscription app today to grow tomorrow

Research
- Back to school shopping trends for your app or brand in 2022
- Travel is back in style: 6 trends advertisers should keep in mind in 2022
- 8 key findings to help you master your holiday advertising strategy

Webinar
- Building your 3-pronged holiday UA strategy: Creatives, offerwall, and on-device advertising

Reports
- 5G: The Consumer Perspective
- On-device advertising 101: A report by Singular and ironSource Aura
- The 2022 modern mobile consumer: app discovery and monetization
- The mobile marketers’ guide to mastering the holiday season

Resources
- Canalys
- Qualcomm
- Statista

>access_file_
1066|blog.unity.com

Improvements to shader build times and memory usage in 2021 LTS

As Unity’s Scriptable Render Pipeline (SRP) feature set continues to grow, so does the number of shader variants being processed and compiled at build time, alongside ongoing support for additional graphics APIs and an ever-growing selection of target platforms.

Shaders are compiled and cached after an initial (“clean”) build, which accelerates subsequent incremental (“warm”) builds. While clean builds usually take the longest, lengthy warm build times can be a common pain point during project development and iteration.

To address this problem, Unity’s Shader Management team has been hard at work providing meaningful and scalable solutions. The result is significantly reduced shader build times and runtime memory usage for projects created using Unity 2021 LTS and later versions.

To read more about these new optimizations, including affected versions, backports, and figures from our internal testing, skip directly to the sections covering shader variant prefiltering and dynamic shader loading. At the end of this blog post, we also discuss our future plans to further refine shader variant management as a whole – across project authoring, build, and runtime.

Before delving into the improvements made to Unity’s shader system, let’s quickly review the concepts of conditional shader compilation, shader variants, and shader variant stripping.

Conditional shader features enable developers and artists to conveniently control and alter a shader’s functionality using scripts, material settings, as well as project and graphics settings.
Such conditional features simplify project authoring, allowing projects to scale efficiently by minimizing the number of shaders you have to author and maintain.

Conditional shader features can be implemented in different ways:

- Static (compile-time) branching
- Shader variant compilation
- Dynamic (runtime) branching

While static branching avoids branching-related shader execution overhead at runtime, it’s evaluated and locked at compilation time and does not provide runtime control. Shader variant compilation, meanwhile, is a form of static branching that provides additional runtime control. It works by compiling a unique shader program (variant) for every possible combination of static branches, in order to maintain optimal GPU performance at runtime. Such variants are created by conditionally declaring and evaluating shader functionality through shader_feature and multi_compile shader keywords. The correct shader variants are loaded at runtime based on active keywords and runtime settings. Declaring and evaluating additional shader keywords can, however, lead to an increase in build time, file size, and runtime memory usage.

Dynamic (uniform-based) branching, on the other hand, entirely avoids the overhead of shader variant compilation, resulting in faster builds, smaller files, and reduced memory usage. This can mean smoother and faster iteration during development. The trade-off is that dynamic branching can have a strong impact on shader execution performance, depending on the shader’s complexity and the target device. Asymmetric branches, where one side of the branch is much more complex than the other, can negatively impact performance, because the execution of the simpler path can still incur the performance penalties of the more complex path.

Keep these approaches and trade-offs in mind when introducing conditional shader features in your own shaders.
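As a rough illustration, the three approaches look like this inside a shader’s HLSLPROGRAM block. This is a sketch, not code from this post: the keyword and uniform names below are hypothetical placeholders.

```hlsl
// Conditional shader features, sketched as shader pragmas.
// All keyword names here are illustrative only.

// shader_feature: variants whose keywords no included material enables
// are stripped at build time; stripped combinations can't be toggled at runtime.
#pragma shader_feature _CLEARCOAT

// multi_compile: every listed combination is compiled, so it can be
// freely enabled or disabled at runtime (e.g., from scripts or Player settings).
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS

// Dynamic branching alternative: one compiled variant branches on a
// uniform at runtime, trading variant count for potential GPU branch cost.
float _UseDetailPath; // set from C# via material.SetFloat("_UseDetailPath", ...)
```

Each `shader_feature`/`multi_compile` line multiplies the variant count, which is why the stripping techniques below matter.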
For more detailed information, see the shader conditionals, shader branching, and shader variants documentation.

To mitigate the increase in shader processing and compilation time, shader variant stripping is used. It aims to exclude unnecessary shader variants from compilation based on factors such as:

- Materials included and keywords enabled
- Project and Render Pipeline settings
- Scriptable stripping

When enumerating shader variants, the Editor automatically filters out any keywords declared with shader_feature that are not enabled by materials referenced and included in the build. As a result, these keywords do not generate any additional variants. For example, if the Clear Coat material property is not enabled by any material using the Complex Lit URP shader, all shader variants that implement the Clear Coat functionality will safely be stripped at build time.

multi_compile keywords, in the meantime, allow developers and players to freely control the shader’s functionality at runtime based on available Player settings and scripts. The flip side is that such keywords cannot automatically be stripped by the Editor to the same degree as shader_feature keywords, which is why they generally produce a larger number of variants.

Scriptable stripping is a C# API that lets you exclude shader variants from compilation at build time, via keywords and combinations not required at runtime. The render pipelines utilize scriptable stripping in order to strip unnecessary variants according to the project’s Render Pipeline settings and Quality Assets included in the build.

| Setting | Low quality | High quality | Variant multiplier |
| --- | --- | --- | --- |
| Main Light/Cast Shadows | Off | On | 2x |
| Main Light/Cast Shadows | On | On | 1x |
| Main Light/Cast Shadows | Off | Off | 1x |

In order to maximize the effects of the Editor’s shader variant stripping, we recommend disabling all graphics-related features and Render Pipeline settings not utilized at runtime.
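For reference, a minimal scriptable stripping pass can be written as an Editor script implementing `IPreprocessShaders`. The keyword being stripped below is a hypothetical example; substitute one your project never enables at runtime.

```csharp
using System.Collections.Generic;
using UnityEditor.Build;
using UnityEditor.Rendering;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of scriptable stripping: remove all variants that use a keyword
// the project never enables at runtime. Place in an Editor folder.
class StripUnusedKeywordVariants : IPreprocessShaders
{
    // Hypothetical keyword; replace with one your project never uses.
    static readonly ShaderKeyword k_Unused = new ShaderKeyword("FOG_EXP2");

    public int callbackOrder => 0;

    public void OnProcessShader(
        Shader shader, ShaderSnippetData snippet, IList<ShaderCompilerData> data)
    {
        // Iterate backwards so RemoveAt doesn't skip entries.
        for (int i = data.Count - 1; i >= 0; --i)
        {
            if (data[i].shaderKeywordSet.IsEnabled(k_Unused))
                data.RemoveAt(i); // variant excluded from compilation
        }
    }
}
```

The callback runs once per shader snippet during the build, so removing entries here directly reduces compile time and player size.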
Please refer to the official documentation for more on shader variant stripping.

Shader variant stripping greatly reduces the number of compiled shader variants, based on factors like the Render Pipeline Quality Assets in the build. However, stripping is currently performed at the end of the shader processing stage; simply enumerating all the possible variants can still take a long time, regardless of compilation.

To reduce shader variant processing (and project build) times, we are now introducing a significant optimization to the engine’s built-in shader variant stripping. With shader variant prefiltering, both clean and warm build times are significantly reduced. The optimization works by introducing the early exclusion of multi_compile keywords, according to Prefiltering Attributes driven by Render Pipeline settings. This decreases the number of variants being enumerated for potential stripping and compilation, which in turn reduces shader processing time – with warm build times reduced by up to 90% in the most drastic examples. Shader variant prefiltering first landed in 2023.1.0a14, and has been backported to 2022.2.0b15 and 2021.3.15f1. Variant prefiltering also helps cut down initial/clean build times by applying the same principle.

Historically, the Unity runtime would front-load all shader objects from disk to CPU memory during scene and resource load. In most cases, a built project and scene include many more shader variants than are needed at any given moment during the application’s runtime. For projects using a large number of shaders, this often results in high shader memory usage at runtime. Dynamic shader loading addresses the issue by providing refined user control over shader loading behavior and memory usage. This optimization facilitates the streaming of shader data chunks into memory, as well as the eviction of shader data that is no longer needed at runtime, based on a user-controlled memory budget.
This allows you to significantly reduce shader memory usage on platforms with limited memory budgets.

New Shader Variant Loading Settings are now accessible from the Editor’s Player Settings. Use them to override the maximum number of shader chunks loaded and the per-shader chunk size (MB). With the following C# API now available, you can override the Shader Variant Loading Settings using Editor scripts:

- PlayerSettings.SetDefaultShaderChunkCount and PlayerSettings.SetDefaultShaderChunkSizeInMB to override the project’s default shader loading settings
- PlayerSettings.SetShaderChunkCountForPlatform and PlayerSettings.SetShaderChunkSizeInMBForPlatform to override these settings on a per-platform basis

You can also override the maximum number of loaded shader chunks at runtime using the C# API via Shader.maximumChunksOverride. This enables you to override the shader memory budget based on factors such as the total available system and graphics memory queried at runtime.

Dynamic shader loading landed in 2023.1.0a11 and has been backported to 2022.2.0b10, 2022.1.21f1, and 2021.3.12f. In the case of the Universal Render Pipeline (URP)’s Boat Attack, we observed a 78.8% reduction in runtime memory usage for shaders, from 315 MiB (default) to 66.8 MiB (dynamic loading). You can read more about this optimization in the official announcement.

Beyond the critical changes mentioned above, we are working to enhance the Universal Render Pipeline’s shader variant generation and stripping. We’re also investigating additional improvements to Unity’s shader variant management at large. The ultimate goal is to support the engine’s increasing feature set while ensuring minimal shader build and runtime overhead. Some of our ongoing investigations involve the deduplication of shader resources across similar variants, as well as overall improvements to the shader keywords and Shader Variant Collection APIs.
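Using the APIs named above, an Editor script that overrides the loading settings might look like this sketch (the chunk size and count values are arbitrary examples, and the exact per-platform signature is an assumption based on the API names):

```csharp
using UnityEditor;
using UnityEngine;

public static class ShaderLoadingTweaks
{
    [MenuItem("Tools/Apply Shader Loading Settings")]
    static void Apply()
    {
        // Project-wide defaults: stream shader data in 4 MB chunks,
        // keeping at most 2 decompressed chunks per shader in memory.
        PlayerSettings.SetDefaultShaderChunkSizeInMB(4);
        PlayerSettings.SetDefaultShaderChunkCount(2);

        // Tighter budget for a memory-constrained platform.
        PlayerSettings.SetShaderChunkCountForPlatform(BuildTarget.Android, 1);
    }
}
```

At runtime, `Shader.maximumChunksOverride` can then raise or lower this budget dynamically, for example after querying how much system and graphics memory the device actually has.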
The aim is to provide more flexibility and control over shader variant processing and runtime performance.

Looking ahead, we are also exploring the possibility of in-Editor tooling for shader variant tracing and analysis, to provide the following details on shader variant usage:

- Which shaders and keywords produce the most variants?
- Which variants are compiled but unused at runtime?
- Which variants are stripped but requested at runtime?

Your feedback has been instrumental so far, as it helps us prioritize the most meaningful solutions. Please check out our public roadmap to vote on the features that best suit your needs. If there are additional changes you’d like to see, feel free to submit a feature request, or contact the team directly in the shader forum.

>access_file_
1067|blog.unity.com

Made with Unity: 2022 in review

As we reflect on the past year, we can’t help but be proud of all that the Unity community accomplished – from award-winning masterpieces and cult hits to the pure, unadulterated joy that Trombone Champ has brought to all of our lives. Case in point…

Before we move into a new year, full of new possibilities, let’s take a moment to celebrate some of your biggest achievements from the past 12 months. Thank you for being part of our story; here’s to an even better 2023!

To the best of our abilities, here’s a non-exhaustive list of Made with Unity games that you released in 2022, either into early access or full release. There have been so many great titles this year, so we’ve categorized them by genre to make this list a bit more digestible and hopefully inspire some of your future projects – though, of course, some titles defy easy genre categorization (*cough* Cult of the Lamb *cough*).

See any on the list that have already become favorites or know of any that we missed? Tell us about it in the forums.

Rollerdrome, Roll7 (August 16)

Other action titles we love include:
- Sanabi, WONDER POTION (June 20) [Early Access]
- Cuphead: The Delicious Last Course, Studio MDHR Entertainment Inc. (June 30)
- Midnight Fight Express, Jacob Dzwinel (August 23)
- Warhammer 40,000: Shootas, Blood & Teef, Rogueside (October 20)
- Outshine, Fishing Cactus (November 3)

SIGNALIS, rose-engine (October 27)

Other horror titles we love include:
- The Mortuary Assistant, DarkStone Digital (August 2)
- Hell is Others, Strelka Games, Yonder (October 20)
- Bendy and the Dark Revival, Joey Drew Studios (November 15)

TUNIC, TUNIC team (March 16)

Other puzzle adventure titles we love include:
- FAR: Changing Tides, Okomotive (March 1)
- Syberia: The World Before, Microids Studio Paris (March 18)
- Lost in Play, Happy Juice Games (August 10)
- The Spirit and the Mouse, Albune Games (September 26)
- LEGO® Bricktales, ClockStone (October 12)
- The Past Within, Rusty Lake (November 2)
- How to Say Goodbye, Florian Veltman, Baptiste Portefaix, and ARTE France (November 3)
- Somerville, Jumpship (November 14)

Ghost Song, Old Moon (November 3)

Other metroidvania titles we love include:
- Infernax, Berzerk Studio (February 14)
- Haiku, the Robot, Mister Morris Games (April 28)
- HAAK, Blingame (August 24)
- Moonscars, Black Mermaid (September 27)

Neon White, Angel Matrix (June 16)

Other FPS titles we love include:
- Gloomwood, Dillon Rogers and David Szymanski (September 5) [Early Access]
- Isonzo, M2H and Blackmill Games (September 13)
- Metal: Hellsinger, The Outsiders (September 15)
- Prodeus, Bounding Box Software Inc. (September 23)
- BONELAB, Stress Level Zero (September 29)
- CULTIC, Jasozz Games (October 13)

Cult of the Lamb, Massive Monster (August 11)

Other roguelike titles we love include:
- Have a Nice Death, Magic Design Studios (March 8) [Early Access]
- Across the Obelisk, Dreamsite Games (April 8) [Early Access]
- Rogue Legacy 2, Cellar Door Games (April 28)
- 20 Minutes Till Dawn, flanne (June 8) [Early Access]
- Necrosmith, Alawar Premium (July 13)
- ORX, johnbell (August 30)
- Alina of the Arena, PINIX (October 13)
- I See Red, Whiteboard Games (October 24)
- Ship of Fools, Fika Productions (November 22)

IMMORTALITY, Sam Barlow, Half Mermaid (August 30)

Other narrative-focused titles we love include:
- NORCO, Geography of Robots (March 24)
- A Memoir Blue, Cloisters Interactive (March 24)
- As Dusk Falls, INTERIOR/NIGHT (July 19)
- Hindsight, Team Hindsight (August 4)
- Gerda: A Flame in Winter, PortaPlay (September 1)
- Beacon Pines, Hiding Spot (September 22)
- Pentiment, Obsidian Entertainment (November 15)

Dorfromantik, Toukana Interactive (April 28)

Other city builder and strategy titles we love include:
- Diplomacy is Not an Option, Door 407 (February 9) [Early Access]
- Farthest Frontier, Crate Entertainment (August 9) [Early Access]
- The Wandering Village, Stray Fawn Studio (September 14) [Early Access]
- Terra Invicta, Pavonis Interactive (September 26) [Early Access]
- Moonbreaker, Unknown Worlds Entertainment (September 29) [Early Access]
- Stardeus, Kodo Linija (October 12) [Early Access]
- Against the Storm, Eremite Games (November 1) [Early Access]
- IXION, Bulwark Studios (December 7)

Citizen Sleeper, Jump Over The Age (May 2)

Other RPG titles we love include:
- Unexplored 2: The Wayfarer’s Legacy, Ludomotion (May 27)
- Dungeon Munchies, maJAja (July 27)
- Backpack Hero, Jaspel (August 22) [Early Access]
- I Was a Teenage Exocolonist, Northway Games (August 25)
- Temtem, Crema (September 6)
- Lost Eidolons, Ocean Drive Studio (September 13)
- Gedonia, Kazakov Oleg (October 14)
- Chained Echoes, Matthias Linda (December 8)

V Rising, Stunlock Studios (May 17) [Early Access]

Other survival titles we love include:
- Arctico, Claudio Norori, Antonio Vargas (February 15)
- Core Keeper, Pugstorm (March 8) [Early Access]
- The Planet Crafter, Miju Games (May 24)
- Raft: The Final Chapter, Redbeet Interactive (June 20)

Flat Eye, Monkey Moon (November 14)

Other management titles we love include:
- Bear and Breakfast, Gummy Cat (July 28)
- PlateUp!, It’s happening (August 4)
- Two Point Campus, Two Point Studios (August 9)
- Arcade Paradise, Nosebleed Interactive (August 11)
- Dave the Diver, MINTROCKET (October 27) [Early Access]

Aka, Cosmo Gatto (December 14)

Other simulation titles we love include:
- Hardspace: Shipbreaker, Blackbird Interactive (May 24)
- Dinkum, James Bendon (July 14) [Early Access]
- PowerWash Simulator, FuturLab (July 14)
- CTRL ALT EGO, MindThunk (July 22)
- Disney Dreamlight Valley, Gameloft (September 6)
- Construction Simulator, weltenbauer. Software Entwicklung GmbH (September 20)
- Slime Rancher 2, Monomi Park (September 22) [Early Access]

Turbo Golf Racing, Hugecalf Studios (August 4) [Early Access]

Other sports or driving titles we love include:
- OlliOlli World, Roll7 (February 7)
- Shredders, FoamPunch (March 16)
- Blacktop Hoops, Vinci Games (April 19) [Early Access]
- Hot Lap League: Deluxe Edition, Ultimate Studio (August 23)
- You Suck at Parking, Happy Volcano (September 14)

Trombone Champ, Holy Wow (September 15)

Other funny titles we love include:
- Tentacular, Firepunchd Games UG (March 24)
- The Stanley Parable: Ultra Deluxe, Crows Crows Crows (April 27)
- Cosmonious High, Owlchemy Labs (May 31)
- The Looker, Subcreation Studio (July 17)
- The Last Hero of Nostalgaia, Over The Moon (October 19)

MARVEL SNAP, Second Dinner (October 18)

Other card titles we love include:
- Stacklands, Sokpop Collective (April 8)
- Card Shark, Nerial (June 2)
- Card Crawl Adventure, Tinytouchtales (August 3)

That’s a wrap for 2022! Want more community news as it happens? Don’t forget to follow us on social media: Twitter, Facebook, LinkedIn, Instagram, YouTube, or Twitch.

>access_file_
1068|blog.unity.com

Get to know Unity’s Employee Resource Groups and the steps they’re taking to foster inclusion

At Unity, Employee Resource Groups (ERGs) are employee-led networks that foster an inclusive workplace by providing support around shared identities or life experiences that have historically been marginalized or underrepresented in the workforce. The purpose of ERGs is to champion an inclusive culture by modeling our company values of empathy, respect, and opportunity, and to build a sense of belonging. Membership in ERGs is open to all employees, regardless of whether they self-identify with the target demographic; allies are welcomed and encouraged to join in creating a culture of inclusion. Today we have a total of nine ERGs, nearly half of which were launched in 2022. Continue reading for a look back at all that our ERGs accomplished in the past year and to learn more about what each represents.

The Access ERG was launched earlier this year and is a group that resonates with many people at Unity. The group hosted multiple meetings with guest speakers, covering various topics such as augmentative and alternative communication (AAC) literacy, guide dogs, and American Sign Language (ASL) interpretation. Leaders have also raised awareness and donations for nonprofits outside of Unity that are doing important work for communities with disabilities and neurodiversity.

Employee feedback: “This is the most wonderful [update]. Thank you so, so much for researching, advocating, and sharing [recording transcripts].”

In 2023, the Access ERG aims to normalize accessibility so that all team members with disabilities or neurological divergences have equal opportunities and rewarding careers. Seemingly small things, like having transcripts automatically enabled for Zoom meetings or introductions with visual descriptions, can make a big difference for someone with an auditory or visual impairment, respectively.
They also plan on aligning with our accessibility council, focused on Unity’s product accessibility, to ensure that employees with disabilities are core stakeholders.

The Asian ERG spent the past year recruiting new members through its monthly community meetings and also built an excellent event lineup. The group kicked off the year with a bang by putting together a video that featured Unity employees from around the world who each shared how Lunar New Year is celebrated in their respective cultures. Asian American and Pacific Islander (AAPI) Heritage Month is observed in North America during the month of May, so the Asian ERG organized four virtual events to raise awareness, including a Stop AAPI Hate presentation by Dr. Russell Jeung. In October, members gathered locally in offices with the Hindu and Sikh community to paint Diya lamps, and to taste locally sourced traditional food and sweets in celebration of Diwali, the Festival of Lights.

In 2023, the Asian ERG leadership team aims to drive internal engagement and continue to grow community membership, as well as to continue collaborations with other ERGs such as the LGBTQ+ and Muslim groups by co-presenting discussions and events that tackle intersectional issues. The group also wishes to build partnerships with other companies and nonprofit organizations, so as to create volunteer and networking opportunities for Unity’s Asian community. Finally, they are passionate about helping members develop and grow internally, and are looking into ways to provide resources that can help guide the Asian diaspora in navigating their careers, and to implement a mentorship program to promote a culture of learning.

B-United, Unity’s Black ERG, has grown extensively over the past year. The group celebrated several amazing events for Black History Month in February (U.S. and Canada) and October (United Kingdom), and continued to honor Black culture throughout the year.
An event that stands out was a fireside chat featuring one of Unity’s Black board members, whose story was impactful to hear for all who attended. The company was also able to send several B-United members to AFROTECH, which offered a unique chance to see and network alongside Black professionals across many areas of the tech industry. The conference was also the first time many of the group’s members met in person, since the ERG was established during the pandemic in 2020.

In 2023, B-United plans to continue the momentum they’ve built and provide even more employees with opportunities to get involved. Creating psychological safety for our Black employees is essential for them to be able to thrive at Unity. This ERG can be another space, outside of someone’s immediate team, for community building and career support.

The mission of the Caregivers ERG is to partner with Unity to advance understanding and inclusion for employees who find themselves frequently in a caregiving role, be it for children, parents, or other meaningful relationships in their lives. Founded this year, the group hosted a virtual volunteer event in partnership with our Social Impact team to send care packages to children in hospitals this holiday season. Balancing the care of others with impactful, meaningful work can be taxing for employees, and this ERG is dedicated to bringing resources, policies, and appreciation to its community so that, as a company, Unity is better able to retain, support, and provide equal opportunities for caregiving employees.

Employee feedback: “I want to thank everyone for joining the kick off meetings this week. It’s really nice to meet new people at Unity and feel less alone.”

Unity currently offers a variety of benefits for caregivers, including family planning support, a flexible return-to-work program in the US, and parental leave.
In 2023, the Caregivers ERG intends to provide resources to make the return to work easier for employees who take leave for caregiving, and to make flexible schedules a top priority.

ComUnidad, Unity’s Latinx ERG, saw great improvements in its membership numbers, overall engagement, and group structure in 2022. The community grew to over 200 total members and attributes this growth to the time spent building a solid leadership structure and foundation to support the functions of the group. Growing its leadership team from four to nine this year allowed them to engage more with Unity’s Latinx community through events and community gatherings, and to avoid leader burnout. Most notably, the group had a strong showing during Latinx Heritage Month (observed in the U.S. from September 15–October 15), when it hosted five events and increased its overall event engagement by more than 10%.

In 2023, ComUnidad plans to continue its efforts to grow membership by offering more incentives for new Latinx employees and allies to join the ERG. This includes plans to provide more professional development opportunities and in-person events.

What a busy year it has been for the LGBTQ+ ERG: The group organized more than a dozen events, many of which were in collaboration with other ERGs, such as a short film screening with the Asian ERG and Queer Icons trivia with the Latinx ERG. The team also hosted workshops on being an ally, a talk about the history of Drag Kings, and a fireside chat with its executive sponsors, and launched a book club. In June, the group created a collective video to show what Pride really means to members of the community within Unity.

Employee feedback: “I want to say how pleased and grateful I was to see [a] company-wide post specifically call out the LGBTQ+ community.
A lot of companies tend to [take a much more vague] ‘Pride for everyone!’ approach – the acknowledgement of Pride’s history is much appreciated, so thank you.”

One of the LGBTQ+ ERG’s biggest goals for 2023 is to improve the sense of community for its members, as well as to offer opportunities for personal and professional growth. Group leaders are proud to have increased membership by over 12% in 2022, especially after a focus on gaining more global representation, but recognize there is still work to be done.

The Muslim ERG is the first of its kind at Unity, and a unique one in the tech industry, too. Through spreading awareness about issues that impact the community and Islamic culture, the group aims to set up frameworks that ensure that Muslim employees can practice their beliefs with respect and comfort. Leadership has collaborated with other ERGs, such as the Asian and Women ERGs, on issues that affect their intersectional communities. They’ve also hosted celebrations for Ramadan and Eid, including three successful fundraisers (tied to the internal employee match program) and informational sessions about Ramadan and how to be “in it together” for all. The fundraisers centered around food drives for war-torn countries and rebuilding education infrastructure in nations stricken by disasters linked to global warming. As part of its end-of-year festivities, the Muslim ERG hosted baklava cooking and Arabic calligraphy workshops to bring a bit of Arabic culture to the diverse workspaces of Unity, and to make members feel more connected during the holidays.

In 2023, there is great opportunity for this ERG to make an impact on Muslim employees and to create more allies and intersectional collaborations. The group’s leaders aim to work with internal teams to support the onboarding of new employees who have been hired in or are relocating from Muslim-majority countries by introducing buddy systems, as well as to create inclusive workplaces globally for current employees.
Together, the Muslim ERG looks forward to another good year for inclusion and diversity at Unity.

The Service Members ERG is designed for veterans and military families, and chose a name and logo that would appeal to Unity’s global employee base. Members of the ERG believe service members and their families have unique skill sets and experiences to offer. Formed over the summer of 2022, the group hosted its first open workshop in September, led by a U.S. Army combat veteran, on how to cultivate a resilient mindset in an ever-changing world. To close the year, they hosted a children’s toy drive through Toys for Tots, which is run by the United States Marine Corps Reserve.

In 2023, the Service Members ERG would like to grow its member base and bring more allies into the fold. With the new acquisition of ironSource, and the majority of its employees having served in the Israel Defense Forces, we believe there are many peers who would be interested in connecting with and learning from this community.

What a year it has been. The Women ERG started off 2022 by introducing its first executive sponsors, who aim to build a continuous partnership in order to realize desired objectives and key results for women at Unity. The highlight of the year was, of course, the group’s impressive lineup of events during Women’s History Month in March, which included an influential Unity Women Panel – a showstopper that inspired all who tuned in and was supported by women leaders within the company. Another focus area for the ERG this year was to provide support for professional development, which led to the launch of its Peer Mentor program. Finally, the Women ERG announced the rollout of its Ambassador program, which is designed to provide additional support for events for employees in both virtual and in-person settings, and even found time to volunteer.

As it looks to 2023, the Women ERG wants to support its members as best it can, and to iterate on and improve its processes.
The leadership team is excited to show members all that they have prepared in an effort to bring more focused events and to increase engagement among women at Unity.

We look forward to continuing to foster inclusion and diversity at Unity through these and more employee initiatives, and extend a huge thank you to all who participated in ERG programming – both internally and externally – throughout 2022. Cheers to all that we can continue to accomplish in 2023!

If you are interested in learning more about Unity’s Inclusion programs, check out our Inclusion & Diversity page. You can watch videos produced by our ERGs on our LinkedIn Life page or read through our Faces of Unity blogs, highlighting individual ERG leaders.

>access_file_
1070|blog.unity.com

Havok Physics for Unity is now supported for production

Announced back at the Game Developers Conference (GDC) 2019, Havok Physics for Unity was initially distributed as an experimental package on the Unity Asset Store. Now, with the availability of ECS for Unity (Entity Component System) in the Unity 2022.2 Tech Stream, Havok Physics for Unity is officially supported for production. In fact, we’ve made this package available to all Unity Pro, Enterprise, and Industrial Collection subscribers for free.

Havok Physics for Unity is built on the same foundation of technology that powers many of the world’s leading game franchises, like Destiny and Assassin’s Creed, among others. When we first set out to define what the future of physics could look like with our Data-Oriented Technology Stack (DOTS), we sought a partner that shared the same core concepts and values as us. Through our partnership with Havok, we were able to leverage DOTS to deliver the highly optimized, stateless, entirely C#, and performant Unity Physics we know today.

We also prepared for more complex simulation requirements for users who might need a stateful physics system. We knew that Havok would be the perfect solution to integrate into Unity for those high-end simulation needs.

The Havok Physics for Unity package is written using the same C# ECS framework as Unity Physics, and is backed by the closed-source, proprietary Havok Physics engine, written in native C++. Havok Physics for Unity is heavily optimized for many typical gaming use cases.
For example, core algorithms have been refined over many years with various automatic caching strategies (including the sleeping of inactive objects), meaning that CPU resources are spent only as needed.

Since the experimental package, Unity and Havok have been working together with early users of the plug-in to drive improvements and add new features. Here’s a breakdown of what’s new:

- Havok Physics for Unity is now available for free for all Unity Pro, Enterprise, and Unity Industrial Collection subscribers.
- Havok Physics for Unity is based on the 2021.2 version of the original Havok SDK, and brings more stability and performance to the Unity plug-in.
- As part of the full release, we’ve incorporated support for motorized joints (motors), such as linear position, as well as linear, rotational, and angular velocity.
- We’ve also added new methods to the HavokSimulation API, which enable granular stepping for your simulations, plus methods for accessing Havok simulations more efficiently with singletons.

Check out the complete changelog of updates to Havok Physics for Unity.

Havok Physics is a robust physics engine designed to handle the performance demands of the most dynamic games, often involving complex scenes with lots of physical interaction. By working with partners across the industry for over 20 years, Havok has encountered, solved, and continued to iterate on some of the toughest problems facing real-time physics simulation.
This investment has led to the stable stacking of physics bodies, minimal artifacts for fast-moving bodies, and generally more controlled behavior, especially when it comes to non-optimized collision geometry.

Of course, physics means action, so let’s see how these two creators are currently taking action with Havok Physics for Unity.

Title: Hostile Mars
Studio: Big Rook Games
Studio size: Individual
Platforms: Windows PC, console
Genre: Open-world, base-building automation tower defense
Players: Single player

First presented at PAX East in 2022, Hostile Mars drew in quite an audience with its dedicated use of physics and unique blend of genres. The vast Martian landscape is brought to life in an open-world factory base-building and automation game. Players confront each other at the ground level in close combat to defend their factories through both third-person shooting and programmable defenses.

Jake Jameson, founder of Big Rook Games, began by blending genres across multiplayer and single-player games – adding elements from puzzlers along with first-person shooters (FPS) and wave-based shooters. The more he evolved this idea, the more he narrowed his focus on creating a single-player base builder. He soon discovered the ideal balance between building, strategy, and combat, finally leading up to the one and only Hostile Mars.

When completing that final iteration of Hostile Mars, it was clear the game needed to use some sort of data-oriented programming model. Jake wanted to have large enemy waves while still achieving high-end visuals. In order to provide players with the best possible experience, the game had to run performantly on a massive scale, supporting thousands of enemies simultaneously onscreen.

To meet these demands, Jake turned to Unity’s Data-Oriented Technology Stack.
In leveraging ECS for Unity, every enemy in Hostile Mars could run real-time Mesh Physics/Collisions, A* Pathfinding, and Local Avoidance, in addition to robust state systems, animations, weapon and projectile systems, high-quality VFX, particle systems, and more.

“Without DOTS, I wouldn’t have been able to provide the experience that I imagined in my original design. It just wouldn’t have been possible without implementing my own ECS framework, and as a solo dev, this isn’t viable given my timeline and budget.” – Jake Jameson, founder of Big Rook Games

Although Jake is not a game developer by trade, he is an avid gamer and was already familiar with the original Havok Physics engine. Knowing how trusted Havok’s technology has been among AAA studios, Jake felt confident implementing it as soon as it became available in Unity via the experimental package.

Hostile Mars is a physics-intensive experience. The player uses physics-based traps and turrets to manipulate the physics properties of enemies. By applying different physical properties to enemies, the goal is to stop them in their tracks or drive them toward more dangerous traps. Not to mention that Hostile Mars involves an incredible number of enemies: up to 5,000 individual physics-based enemies can flood the player’s Martian factories, which leads to hundreds of collisions and projectiles onscreen at once.

All of the physics in Hostile Mars utilizes Havok Physics for Unity: the collisions between projectiles and enemies, players and enemies, enemies and other enemies, and enemies and the landscape itself, as well as the hovering enemies who require a constant state of force to stay afloat. These distinct physics interactions take place in real time, with physical simulation that allows for believable hovering, gravity, and mesh point collisions.
So when players strike their enemies at a particular point, they will see them spin away just as Havok Physics for Unity intended them to.

Not only are the enemies entirely physics-based, but the traps that are necessary for gameplay are physics-based too. There are gravity traps that push and pull enemies, which in turn slow down or speed up traps that smash enemies. There are even traps that spring spikes up into an enemy’s path, not just pushing them aside, but realistically simulating the force and velocity of the spikes so that the enemy reaction appears authentic.

Once players advance, the enemy waves become varied and increasingly complex, which requires players to build traps more strategically, combining their different physics-based interactions to herd enemies with different weaknesses toward stronger traps and turrets that can impact them most. For example, while a frost trap might slow some enemies down, an explosive trap can deal the most damage possible against a highly concentrated wave.

Plans for releasing Hostile Mars on consoles are still in the works, but in the meantime, you can add Hostile Mars for Windows PC to your Wishlist on Steam today.

Title: Robocraft 2
Studio: Freejam Games
Studio size: 25
Platforms: Windows PC, console
Genre: Online vehicular combat
Players: 5v5 online multiplayer

Robocraft 2 is the free-to-play sequel to 2017’s award-winning Robocraft, where players build customizable robot battle vehicles that drive, hover, walk, and fly in an open-world multiplayer environment. Since this initial success, the team at Freejam Games has refined a fully customizable experience for Robocraft 2, so that players can bring their own creations to competitive multiplayer gameplay.

As Freejam Games experimented with projects following the success of Robocraft, they focused on providing exciting new building tools.
This way, players could enjoy more freedom to design complex, physics-simulated creations.

The team evaluated how moving the physics from the client side to dedicated multiplayer game servers could enhance the physical interactions between the vehicles and robots created by players. They discovered that relegating the physics to the server created a range of fun gameplay moments, wherein weight, inertia, momentum, friction, mass, and bounciness were all accurately simulated. In other words, heavy vehicles could easily push lighter ones or be combined with joints like pistons, servos, and rotating platforms. Even weapons and explosions applied realistic kickback and force when they hit. All of these experiments, alongside community testing and feedback, culminated in Robocraft 2.

In Robocraft 2, players now get to create complex vehicles, take them into battle, and destroy them in 5v5 team battles online. From their experience with the first Robocraft game, the Freejam team knew how competitive and creative their players could be, finding new ways to optimize the provided building tools in order to win battles. This meant that the team had to rely on three key features from their physics engine in order to provide a fair experience for all:

- Fast performance
- Robust, non-glitchy physics simulation
- Access to low-level areas of the physics engine to make modifications in the game

Prior to the 2017 announcement of DOTS and ECS for Unity, Freejam Games explored the possibility of building their own ECS framework in-house. They then quickly adopted ECS for Unity in its experimental release, starting with Unity Physics. For experiments with server-side tech, they relied on determinism to keep the player clients and the simulation on the server in sync, while Unity Physics (which is deterministic) provided the performance. As the game evolved, they moved away from a stateless approach and became early adopters of Havok Physics for Unity. 
As a stateful system, Havok Physics for Unity ultimately powered the performance of the simulation within the gameplay requirements for Robocraft 2.

“The high performance of Havok [Physics for Unity] allows us to have accurate server-side physics in our online game. In turn, that provides several significant benefits including giving all clients an equal representation of the physics for a better quality experience. The fact that the server is authoritative over the simulation also has the added benefit of reducing the opportunity for cheaters.” – Ed Fowler, principal programmer and cofounder of Freejam

Havok Physics for Unity helped Freejam Games solve complicated issues. For example, as players defeat opponents, the robots and vehicles fall apart block by block, which can create hundreds of Rigidbodies in the environment. To free up CPU and maintain high frame rates, those inactive Rigidbodies can be put to sleep via Deactivation, a feature that temporarily removes the physics from broken pieces.

Player creations in Robocraft 2 consist of many Rigidbodies with Compound Colliders constrained together by joints, which can be smashed or stacked on top of one another. Further improvements to the overall performance of the physics simulation can be gained with Collision Caching, which additionally allows for refined simulation of joints and constraints, such as in those stacking situations.

Lastly, the Havok Visual Debugger was used to visualize collisions in the game world in real-time. It enabled Freejam Games to identify glitches and snags, and to efficiently spot instances where rogue contacts arise. This accelerated their workflow and prompted fast fixes.

Want to see Robocraft 2 in action? 
Add it to your Wishlist on Steam.

To help you get started, check out the ECS Physics Samples on GitHub. If you need more guidance, we created a tutorial to help you learn more about Unity’s physics options, including Havok Physics for Unity. By the end of the tutorial, you will be able to do the following:

- Describe the key benefits of Havok Physics for Unity and Unity Physics
- Explain the relationship between Unity Physics and Havok Physics for Unity
- Identify situations where physics solutions for ECS are the right fit for a project

This tutorial is an introduction to physics solutions in ECS for Unity, tailored to users with an intermediate or advanced level of experience with the Unity Editor. As mentioned earlier, DOTS is Unity’s Data-Oriented Technology Stack, a suite of data-oriented technologies for users looking to make complex projects with highly optimized performance. If you want to learn more about DOTS, we recommend the newly released DOTS Guide on GitHub.

We are also actively engaging with many of you on the DOTS channel, part of the Unity Discord, and in the forums. We look forward to learning more about the projects you’re building with Havok Physics for Unity.
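Freejam’s approach described above – a deterministic simulation checked against an authoritative server – can be illustrated with a small, engine-agnostic sketch. All names here are hypothetical illustrations, not part of any Unity or Havok API: the idea is simply that an all-integer, fixed-timestep update produces bit-identical results everywhere, so a state checksum exposes any client that diverges.

```python
# Engine-agnostic sketch of deterministic, server-authoritative simulation.
# All function names here are illustrative, not Unity or Havok API.
import hashlib

def step(state, inputs):
    """One fixed-timestep update using only integer math, so every
    machine that runs it produces bit-identical results."""
    x, vx = state
    vx += inputs            # apply player input as acceleration
    vx -= vx // 10          # crude integer drag
    x += vx
    return (x, vx)

def checksum(state):
    """Hash the full state so peers can cheaply compare simulations."""
    return hashlib.sha256(repr(state).encode()).hexdigest()

def simulate(inputs, tamper_at=None):
    state, sums = (0, 0), []
    for tick, i in enumerate(inputs):
        state = step(state, i)
        if tamper_at == tick:       # a cheating client edits its state
            state = (state[0] + 999, state[1])
        sums.append(checksum(state))
    return sums

inputs = [3, 1, 4, 1, 5, 9, 2, 6]
server = simulate(inputs)
honest = simulate(inputs)
cheater = simulate(inputs, tamper_at=4)

assert server == honest   # determinism keeps honest peers in sync
divergence = next(t for t, (a, b) in enumerate(zip(server, cheater)) if a != b)
print("cheater diverges at tick", divergence)   # prints: cheater diverges at tick 4
```

Because the server's copy of the simulation is authoritative, a mismatched checksum is all it needs to reject the tampered client state.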

>access_file_
1072|blog.unity.com

Going off autopilot in ad monetization: 4 innovative strategies to start implementing

It's easy to stick with strategies that work - but incremental growth comes from balancing exactly that with constant testing and experimentation. That's why at Appfest 2022, Samantha Benjamin, Director of Growth at Supersonic, explored four ways you can break old patterns and be more experimental with your monetization strategy - or, as she puts it, “get off of autopilot.”

Get inspired by successful creatives

First, Samantha recommended getting inspired by features in what she calls “booster creatives,” or creatives that give you 3-5x more installs for the same cost as other creatives (and can even change the marketability power of your game). For example, her team saw that creatives with realistic obstacles performed significantly better than creatives with cartoonish ones - so they decided to take those realistic obstacles and actually add them to their game Going Balls. As a result, LTV grew on both iOS and Android, and D7 ARPU jumped 5-7%.

As your creative team finds these “boosters,” make sure they pass them directly to your monetization team. This way, with every new idea, you can consider and optimize any potential monetization opportunities.

Building a sophisticated interstitial player experience

Though interstitials are a major source of revenue, the potential impact on retention sometimes deters developers from monetizing with them. So to ensure players have the best possible interstitial experience, it’s critical to adjust it to your players’ engagement behavior.

No-touch interstitials

For example, when a player hasn’t touched their screen for at least 20 seconds, Supersonic displays what they call “no-touch interstitials.” This player is likely taking a break - but they’re going to return to their phone eventually, so the ad will be the first thing they see. 
CPMs are high with this placement, and it’s a win-win - LTV is high, advertisers get installs, and players enjoy a more sophisticated interstitial experience.

Before or after the end-level screen

Additionally, Supersonic tested adapting interstitials for users who reject rewarded video offers. Usually, developers just show interstitials when this happens, but clearly these users don’t want to engage with ads. Supersonic tried a different approach: showing interstitials to this segment during natural pauses in the game, like commercials. They tested this by putting interstitial ads for these users right before or after the end-level screen - and engagement rose as a result.

Looking at other genres

Next, try broadening your sources of inspiration - beyond just games competing in your genre. Other kinds of games might seem drastically different, but if they have similar motivations, they can be an ideal learning opportunity.

Highlighting progress with a leaderboard

Inspired by PvP games, Supersonic decided to add an automated leaderboard that pops up at the end of their hyper-casual games. By creating a competitive atmosphere and displaying players’ progress, they boosted LTV, and ARPU lifted 12% on average.

Celebrating wins with confetti

The Supersonic team noticed other genres creating excitement within their games, so they decided to add a burst of confetti in their games whenever players achieved something (like shooting a basketball through the hoop). Simply by emphasizing their players’ success and making them feel like winners, Supersonic saw their ARPU jump by 15%.

The power of music

As Samantha puts it: “never underestimate the power of music,” especially in ad-oriented games. When Supersonic tested incorporating more music into their games, they saw a 10% ARPU uplift - simply by tweaking the music and testing different volumes and sound effects.

Staying open to ideas

Finally, Samantha explains the value of having dedicated time to think of new ideas. 
In fact, when one growth operations manager at Supersonic pitched an idea, it inspired a real change in their games: timed treasure chests. To increase session length, as a user approached the average session length, they would see a pop-up chest with a timer - encouraging them to keep playing and wait for their prize to become available. This proved so successful at increasing session length that Supersonic implemented it in three of their biggest games.

Ultimately, new monetization ideas can come from anywhere and everywhere - so it’s crucial to stay on the lookout, trust your data - and, when the opportunity strikes, don’t be afraid to try going off autopilot.

Watch the session here: https://www.youtube.com/watch?v=RMqiWKAFENY
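The “no-touch interstitial” rule described above boils down to a simple idle timer. Here is a minimal, hypothetical sketch of that logic - none of these names come from Supersonic's or Unity's actual SDKs, and a real game would hook `on_touch` and `update` into its input and frame loop:

```python
# Hypothetical sketch of the "no-touch interstitial" rule: if the player
# hasn't touched the screen for 20 seconds, queue an interstitial so it's
# the first thing they see when they return.
IDLE_THRESHOLD = 20.0  # seconds without input before an ad may show

class InterstitialScheduler:
    def __init__(self):
        self.last_touch = 0.0
        self.ad_pending = False

    def on_touch(self, now):
        """Call on every touch event; any input resets the idle timer."""
        self.last_touch = now
        self.ad_pending = False

    def update(self, now):
        """Call once per frame; returns True when an ad should be queued."""
        if not self.ad_pending and now - self.last_touch >= IDLE_THRESHOLD:
            self.ad_pending = True
            return True
        return False

sched = InterstitialScheduler()
sched.on_touch(0.0)
assert sched.update(10.0) is False   # player still active
assert sched.update(25.0) is True    # 25s idle: queue the ad
assert sched.update(26.0) is False   # don't queue a second ad
sched.on_touch(27.0)                 # player returns; timer resets
assert sched.update(30.0) is False
```

The `ad_pending` flag is the detail that matters: without it, the scheduler would fire an ad every frame once the threshold is crossed.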

>access_file_
1073|blog.unity.com

It’s all in here: The ultimate guide to creating UI interfaces in Unity

Thousands of people have preregistered, and now it’s finally here: Our biggest e-book yet, User interface design and implementation in Unity, is available to download. Get ready to dive into over 130 pages of advanced instruction in UI design.

Your game’s user interface is perhaps the most direct way you can communicate with and guide your players – like a folded map you hand to them that reveals clues, key details, and directions as they progress. Whether you’re using more traditional elements like health bars and pop-up messages, or elements completely embedded in the game world, such as showing stats on the back of a player’s survival suit, the UI is integral to immersing players in your game’s story, realm, and artistic style.

We’re thrilled to announce that our latest technical e-book, User interface design and implementation in Unity, is available to download for free. Thousands of people have already signed up for it, and just as many have downloaded its companion piece, the demo project UI Toolkit sample – Dragon Crashers, to date. Now it’s your turn.

The interest in this e-book is understandable. As it says in the introduction, “User interface is a critical part of any game… a solid graphical user interface (GUI) is an extension of a game’s visual identity… [and] modern audiences crave refined, intuitive GUIs that seamlessly integrate with your application.”

The guide begins by covering UI design and art creation fundamentals, and then moves on to in-depth instructional sections on UI development in Unity. Written and reviewed by technical and UI artists – external and Unity professionals alike – the e-book unpacks both Unity UI, the default solution, and the newer UI Toolkit. The emphasis, however, is on the latter toolset, as UI Toolkit now provides many benefits for projects with complex, fullscreen interfaces. Think of projects that require a scalable and performant system for runtime UI. 
To help you choose the right solution for your project, please refer to this section of the Unity Manual. The e-book is a treasure trove of information for professional UI designers, artists, and other Unity creators who want to deepen their knowledge of UI development. Here’s a snapshot of what’s inside.

The first section aims to inspire with foundational tips for making effective UI. It looks at examples of diegetic UIs, where UI elements can be found right in the story, making parts of the game world function as a user interface. It explains how elements can either contribute to or break the immersion that a player experiences. We even turned this section into a blog post that you can read here.

The guide then turns to the roles and responsibilities of a UI designer, and the tools and methods they employ, such as UI wireframing, art creation through mockups, fonts, and grey-boxing. There’s also a chapter on asset preparation and exporting graphics from Digital Content Creation (DCC) tools. These earlier sections of the guide are helpful no matter what game engine and UI solution you’re using.

An extensive chapter is devoted to Unity UI, our longtime system for creating in-game UIs and currently the go-to solution for positioning UI in a 3D world or using GameObject-based Unity systems. This section outlines Unity UI fundamentals for prototyping and integrating assets in-Editor: the Canvas, prebuilt UI elements, TextMesh Pro, and Prefabs, among others. We recently updated an article on advanced optimization techniques for Unity UI, where you can find tips on related topics.

UI Toolkit is made for maximum performance and reusability, with workflows and authoring tools informed by standard web technologies. UI designers and artists will likely find it familiar, especially if they have prior experience designing web pages. Three major sections of the guide highlight instructions for developing runtime UI with UI Toolkit. 
There’s a thorough explanation of the parts that comprise UIs made with UI Toolkit, including the Unity Extensible Markup Language (UXML) and Unity Style Sheets (USS), authored with UI Builder. You’ll explore how UI Toolkit positions visual elements based on Yoga, an HTML/CSS layout engine that implements a subset of Flexbox. The Flexbox architecture provides advantages such as responsive UI, enabling you to adapt your UI to different screen resolutions and sizes. Through both UXML and USS, you can decouple the styles applied to UI layouts (and switch those styles up as needed), while logic and functionality continue to live in code. Workflows for visual elements, the fundamental building blocks of each interface, are also discussed in great detail – from positioning, size, and alignment settings, to margins and padding.

The chapter on styling shows you how to define reusable styling for visual elements with Selectors, override styles and define unique attributes with inline styles, and create animations and effects with USS animation and a Camera Render Texture. It also demonstrates how you can thematize UI elements for holidays and other special events.

From there, the e-book gets into UI Toolkit sample – Dragon Crashers, with different sections that depict how the UI was made – from the menus and custom controls, like radial counters or tabbed views, to embedded UXML templates and more.

Finally, the guide concludes with a mini profile of the studio Mechanistry’s UI migration to UI Toolkit for their new game, Timberborn. This brief study showcases how their lean team managed to scale and keep their game consistent across various menus and screens.

At 137 pages, the UI e-book is not a light read. 
As with the other technical e-books released this past year, use it as a reference on an ongoing basis.

Along with the e-book, check out a couple of recently released resources filled with useful tips for leveraging Unity UI and UI Toolkit:

- The Unite 2022 session, Extending the Unity Editor with custom tools using UI Toolkit, shows programmers how to use UI Builder to create a custom Inspector for real-time Play Mode debug data visualization.
- The webinar, Best practices for mobile UI design, gathers experts from Outfit7, Samsung, and Unity to share strategies for maximizing the creativity and flow of your mobile games.

Bookmark one or both of these pages, which compile all of our technical e-books and advanced content:

- Unity best practices
- Advanced best practices – Unity Manual

We hope that you enjoy this latest e-book and look forward to your feedback in this forum.
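To give a flavor of the UXML/USS split the e-book teaches, here is a minimal, hypothetical layout fragment. The element structure follows UXML's standard form, but the file name, element names, and classes are invented for illustration; see the e-book and UI Builder for real workflows:

```xml
<!-- MainMenu.uxml: structure only; visuals live in a separate USS file -->
<ui:UXML xmlns:ui="UnityEngine.UIElements">
  <ui:VisualElement name="menu-root" class="menu">
    <ui:Label text="Dragon Crashers" class="menu__title" />
    <ui:Button name="play-button" text="Play" class="menu__button" />
  </ui:VisualElement>
</ui:UXML>
```

A matching USS file would then style it with CSS-like selectors – for example, a `.menu { flex-direction: column; align-items: center; }` rule to lay the elements out with Flexbox, or a `#play-button:hover` rule to restyle the button on hover – keeping presentation out of your C# code entirely.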

>access_file_
1074|blog.unity.com

Building better paths while maintaining creative flow with Splines in 2022.2

Unity 2022.2 includes updates to the Splines package, accessible through the Package Manager, which offers you the ability to draw and use spline paths in your game or other creation. For developers, this means you can easily build out rivers, roads, camera tracks, and other path-related features and tools. If you’re an artist, you get a consistent, Unity-supported experience across all these toolsets using our Splines solution. Several default components are also included with the Splines package, so you can use this new artist tooling right away.

If you’d like to jump right into learning and discussing the new Splines package, head over to the Unity Splines forum post.

A “spline” is a type of path that is often used in both 3D and 2D creative tools. Essentially, you set a few points as if you are mapping out a road, then optionally tweak how the path curves around those points, and, finally, connect more branching points if you need them… that’s your spline!

Splines are often used to:

- Create rivers and roads
- Set camera tracks
- Define areas or shapes

The Splines package enables you to create and use splines as easily as you would any other object. Open the GameObject menu to create a spline, then add whatever components you want to use that spline’s path. As an artist, this means you just need to learn one set of tools to draw roads onto your terrain, define camera paths, or extrude mesh shapes for level design. Even better, the spline you draw for your camera can be reused to place a path on the ground, to navigate characters, or anything else. Just add or swap out components as needed.

For developers, the Splines package provides a robust and standardized framework to build on. Create your own custom components or Unity Asset Store packages. For more information, see the Splines API documentation.

Create or open a project on Unity 2022.2 or later, then install the Splines package using the Package Manager. To create a spline:

1. 
From the top menu, select Create > Spline > Draw Spline.
2. Click in the Scene view to place points for your spline. If you want to add a curve to the path, click and drag when you place a point.
3. When you are done drawing, press Escape, or select a tool in the Tools overlay.
4. Use the Editor’s standard select and transformation tools to edit the shape of the spline.

For more information, see the Splines documentation.

Splines is the first major feature to use our new tooling system, contextual workflows. Contextual workflows use overlays to get you the right tools at the right time. You can see them in action in these cases:

- Simplify editing with Tool context: Select a spline to see an icon appear at the beginning of the Tools overlay. This icon indicates the Tools overlay’s Tool context. Click the Tool context icon to change your Tool context from GameObject to Spline. Now you can dive right into editing the spline’s finer details using the standard Editor tools and controls.
- Customize with Tool Settings: Activate a spline tool to see new options in the Tool Settings overlay. This shows you which options are available so you can pick what you need on the fly.
- Discover new tools with component tools: Select a spline, and if that spline has any component tools, they appear at the bottom of the Tools overlay. If a package or asset uses components to add new tools, you can find them there – no need to search through the Editor.

The best part here is that contextual tooling can work with any toolset in Unity, including the Unity Asset Store or other custom creations. If you’re a tool developer and need help setting this up, start with the tools documentation, or reach out on the Unity Forums.

Speaking of components, we’ve included three to meet the most common use cases:

Instantiate: Generate copies of an item along a spline. 
Use the Instantiate component to create objects like fences, trees, stone walkways, and so on.

Animate: Move a GameObject along a spline. Use the Animate component with cameras, characters, or in situations where you need to define movement in Unity.

Extrude: Build a tube mesh along a spline. Use the Extrude component to create and easily edit shapes like wires, pipes, ropes, noodles, and more.

New in Splines 2.1, you can build splines with multiple, branching paths. Activate the Draw Splines tool and begin drawing new parts onto the spline. This also enables you to create disconnected spline sections.

You can directly manipulate splines quickly without having to hunt through menus for the right transform tool or gizmo. When you are editing spline points, click-drag a point to move it. No tool activation needed! This is designed to bring 2D-like simplicity of editing to splines.

When editing spline points, two new options are available in the handle orientation dropdown. Parent enables you to move, rotate, or scale items relative to their parent element. Element gives you precise editing using the selected item’s directionality.

Splines was built to be a foundation for other tooling, especially tools from the Unity Asset Store and custom creations. The package includes a robust API and samples for developers to learn from or customize. Check out the Splines API documentation to get started.

The Splines package has been publicly available for almost a year now. Your continuous input has been fantastic, and we’re excited to hear even more, especially from artists, with this major update. Comments here are great, and for deeper discussions we hope to see you on the Splines forum thread.
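Under the hood, components like Animate and Extrude boil down to evaluating a position along a curve for a parameter t between 0 and 1. Here is a minimal, engine-agnostic sketch of that idea using one cubic Bézier segment and De Casteljau's algorithm – this is illustrative math, not the Splines package API:

```python
# Evaluate a cubic Bezier segment with De Casteljau's algorithm.
# p0..p3 are 2D control points; t runs from 0 (start) to 1 (end).
def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def bezier(p0, p1, p2, p3, t):
    """Repeated linear interpolation converges on a point on the curve."""
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# A road-like segment: sample evenly spaced parameters, the way a tool
# might place fence posts (Instantiate) or ring cross-sections (Extrude).
knots = ((0, 0), (2, 4), (6, 4), (8, 0))
samples = [bezier(*knots, t / 10) for t in range(11)]

assert samples[0] == (0.0, 0.0)    # t=0 lands on the first control point
assert samples[-1] == (8.0, 0.0)   # t=1 lands on the last control point
```

A full spline is just a chain of such segments sharing knots, with the curve shape between knots controlled by the tangent points – exactly what you adjust when you click-drag while placing points.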

>access_file_
1076|blog.unity.com

Unity 2022.2 Tech Stream is now available

I’m delighted to share that the 2022.2 Tech Stream, our final release of the year, is available for download.

Tech Stream releases allow you to go hands-on with early access to the latest features. They’re also an opportunity to share your feedback on how we can build even better tools to power your creativity.

Most recently at Unite, we gathered with our community of game developers to share some of these updates on topics like DOTS, rendering, multiplayer development, and XR, and we celebrated Made with Unity games like V Rising, Pentiment, Breachers, and many more. The dialogue online from over 9,000 Discord messages and countless in-person conversations was invaluable to shaping the future of Unity. Coupled with the 1,470 new forum threads where we discussed product feedback with you since the 2022.1 Tech Stream arrived, and the 3,080 new notes on the Unity Platform roadmap, this feedback helped us get to today’s release. We couldn’t have done it without you and are excited to get that work in your hands. To learn more about how your feedback drives product development, check out this blog post.

Together with the first Tech Stream, today’s 2022.2 completes this year’s cycle. Join us and explore what’s in store ahead of the LTS release in 2023. For even more on where Unity is heading, I encourage you to read our Games Focus blog series. In this post, I’ll be sharing a few highlights from this release, but you can always get more details in the official release notes.

A frequent request we receive is to give you the ability to create more engaging gaming experiences and deeply immersive worlds, and to do so with more objects and characters than ever before. Unity 2022.2 includes ECS for Unity (Entity Component System), a data-oriented framework that empowers you to build more ambitious games with an unprecedented level of control and determinism. ECS and a data-oriented approach to development put complex gameplay mechanics and rich, dynamic environments at your fingertips. 
Starting with Unity 2022.2, ECS for Unity is fully supported for production, so you can get even more out of ECS through support channels and success plans. ECS for Unity includes the Entities package, along with ECS-compatible packages for Netcode, Graphics, and Physics. If you’re already familiar with Unity’s GameObject architecture and scripting standards, you’ll find a familiar authoring experience and streamlined workflows, since ECS for Unity is fully compatible with GameObjects. This means you can leverage your existing skill set and apply ECS only where it will best benefit your game experience.

We’re already seeing some great games running on ECS for Unity, such as Stunlock Studios’ V Rising. Because they turned to ECS, they were able to vastly increase the number of in-game interactable assets to more than 160,000 across a 5 km² map, with more than 350,000 server-side entities powering the experience.

If you’re looking for help, want to provide feedback, discuss best practices, or show off your projects, you can join a thriving community on our forums and Discord. Our teams regularly engage in these channels and keep a close eye on your feedback. Join us on December 8, 2022, for our Dev Blitz Day dedicated to DOTS, when we’ll be spending an entire day trying to answer all your ECS questions.

The last 18 months have seen an explosion of multiplayer experiences being built with Unity, and we hear that many of you want to add multiplayer access to your games but aren’t sure where to start. Alongside Unity 2022.2, we’re highlighting Netcode for GameObjects, a package that simplifies the implementation of multiplayer capability in your project in a number of scenarios, such as couch cooperative play. 
The package works with familiar GameObject-based programming techniques and abstracts away low-level functionality, so you can write less code while creating the multiplayer experience you envision. For more demanding, large-scale games, you can harness the power of ECS with Netcode for Entities, which can enable you to increase your game world size, player counts, and complex network interactions without the performance sacrifices developers have traditionally had to deal with.

We also recently announced the launch of self-serve capabilities in our Multiplayer Solutions suite within Unity Gaming Services (UGS), which helps you operate your multiplayer games with hosting, communications, and more. Learn more about the latest developments for this tech in this Games Focus blog, or take a deeper look at the UGS Multiplayer suite in this UGS video, produced in collaboration with Tarodev.

Multiplatform scalability and high-fidelity graphics continue to be our focus for rendering. In our Games Focus blog “Rendering that scales with your needs,” we covered our dedication to delivering features that allow you to scale with confidence while tapping into an even broader range of tools that provide the best possible visual quality and performance.

We continue to bring the Universal Render Pipeline (URP) closer to feature parity with the Built-in Render Pipeline through more streamlined and scalable workflows. We worked on key tools such as Forward+, which provides functional parity with the Forward path in the Built-in Render Pipeline, eliminating the light count limit so you can scale with quality across platforms. Another key feature is Decal Layers, which allow you to filter and configure how different objects are affected by Decal Projectors in a scene. 
Decals are useful for adding extra texture details to a scene, especially to break the repetitiveness of materials and their detail patterns. Other special URP enhancements include LOD crossfade for smoother transitions and Built-in Converter improvements that provide you with tools to upgrade your existing projects from the Built-in Render Pipeline to URP. You can also personalize your rendering experience with the Shader Graph Full Screen Master Node and Custom Post Processing across both renderers.

Diving into the High Definition Render Pipeline (HDRP), we’ve made enhancements that help you create even more beautiful physically based environments and detailed characters. You can scale high-fidelity environments with the new HDRP Water System to render oceans, rivers, and underwater effects, and use Volumetric Material to create procedural local fog using Shader Graph. Create even more realistic skies with improved Cloud Layers dynamic lighting – you can even blend between different Volumetric Cloud conditions. You can also take your cinematic renders further and create realistic characters using Eye Cinematic with Caustics and PCSS shadows. HDRP Path Tracing Denoising provides you with the choice between the NVIDIA OptiX™ AI-accelerated denoiser and Intel® Open Image Denoise. Watch our latest Unite 2022 session on Lighting Environments in Unity to discover some key tips to get you started with our latest HDRP environment tools.

Creative endeavors are never linear, and we understand that rapid iteration is part of the journey. This release includes new authoring features and workflow improvements to help speed up your productivity. For example, the Prefab system sees a number of upgrades, including the ability to quickly replace a Prefab Asset for a Prefab instance in a scene or nested inside other Prefabs. 
Read our latest blog on this topic for more information.

For faster environment work, the Paint Detail brush in the Terrain Tools package now allows you to simultaneously scatter multiple types of details, with per-detail-type density settings available. Additionally, detail density and a few other Terrain settings are now overridable in the Quality settings to help you achieve platform performance targets.

You can also use improved tooling and API features for Splines to help draw paths in your environments with greater precision. This means you can build out rivers, roads, camera tracks, and other path-related features and tools more efficiently. Thank you to all who engaged with us in the worldbuilding forums over the last couple of months to help us finalize this delivery. For more on the API features, check out the documentation.

Finally, the AI Navigation package is now available, letting you quickly add intelligence to 3D characters so they can move through game worlds without you needing to code rules manually. It also ships with samples to help you get started. See the forum for more details, and check out what’s next on the roadmap.

In 2022.2, UI Toolkit is reaching parity with IMGUI for customizing the Editor and is the recommended solution for Editor tools. This means better separation of concerns, more flexible layouts, and advanced styling. With updates like default Inspectors generated with UI Toolkit, ported common built-in Property Drawers, TreeView controls with multicolumn support, and a new vector-drawing API, this release not only helps us reach parity with IMGUI but also supports runtime use cases. If you want to learn more about the current state of runtime UI, we recently released a new project demonstrating a full-featured interface built with UI Toolkit, based on your feedback requesting more samples. Check that out here. To help you get started, watch the recent Unite session illustrating a step-by-step example of how to build custom tools with UI Toolkit. 
Plus, visit the recently released Editor Design system for guidance on how to build intuitive experiences.

After extensive work, testing, and listening to a lot of community feedback, DirectX 12 is out of its experimental state with the release of 2022.2. Depending on the project, you can now expect performance on par with or greater than DX11, especially in draw call-heavy scenes. This is a result of significant investment in performance and stability, making DX12 the recommended graphics API for Windows and Xbox development. Additionally, DX12 lays the foundation for more advanced graphics features, such as real-time ray tracing, which is now available for Xbox game development. We couldn’t be more excited and thankful to you all for helping us get DX12 across the finish line, and we look forward to the great games you’ll be creating.

We continue to hear that you not only want us to support new platforms, but also to simplify and improve development when targeting devices. If you haven’t already, check out the Games Focus blog “Reach more players over multiple platforms and form factors,” where we dive into both what is here now and what will be available in the near future.

We’re making cross-device XR creation simpler with the Unity XR Interaction Toolkit (XRI). XRI provides a framework for common interactions that work across various controllers, such as grab, hover, select, visual feedback to indicate possible interactions on objects, and more. XRI is now in version 2.2, which adds multi-grab support, new locomotion methods, and a collection of ready-to-go Prefabs in our Starter Assets sample package. We recently invited the creators of Blacktop Hoops, a VR basketball game, to talk about how they used XRI as the base for their input controls during the Unite 2022 Keynote. Check out the XR segment for more information.

We’ve also updated AR Foundation to version 5.0. This update brings two key features that reduce development time. 
The first is simulation, allowing you to test your AR app in the Editor using Play mode, an update that addresses a long-standing AR developer frustration. We’ve also added the AR Debug Menu as a new Prefab that you can use to view available configurations on your device and visualize AR subsystem data such as planes and point cloud positions.

Finally, we’re continuing to add key platform support to the Editor with Meta Quest Pro, PlayStation®VR2, and Magic Leap 2.

To read more about the 2022.2 Tech Stream, check out the release notes for a comprehensive list of features and the Unity Manual for documentation. As you dive in, keep in mind that while each Tech Stream release is supported with weekly updates until the next one ships, there is no guarantee of long-term support for new features, so remember to always back up your work prior to upgrading to a new version. The upgrade guide can also assist with this. For projects in production, we recommend using a Unity Long Term Support (LTS) release for stability and support.

Each Tech Stream is an opportunity not only to get early access to new features, but also to shape the development of future tech through your feedback. We want to hear how we can best serve you and your projects. Let us know how we’re doing on the forums, or share feedback directly with our product team through the Unity Platform Roadmap. You can also follow us on Twitter and catch our latest Unity Twitch Roundtable, covering 2022.2, on demand.

This release completes our 2022 development cycle. We have ambitious goals for next year, which you can read about in our Games Focus series or watch in the recent Unite Roadmap session. Thank you for all your support, and we look forward to partnering with you every step of the way.
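The XRI interactions mentioned above (grab, hover, select, and their visual feedback) are exposed as components and events. The sketch below is a hypothetical setup, not official sample code; `XRGrabInteractable` and its `hoverEntered`/`selectEntered` events are the toolkit's real building blocks.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical illustration: make an object grabbable and react
// to XRI's hover/select events from code.
public class GrabbableSetup : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable supplies grab, hover, and select behavior
        // for any controller the toolkit supports. It requires a
        // Rigidbody, which Unity adds automatically here.
        var grab = gameObject.AddComponent<XRGrabInteractable>();

        // XRI raises events when a controller hovers over or selects
        // this object; hook them for custom visual feedback.
        grab.hoverEntered.AddListener(_ => Debug.Log("Hovered"));
        grab.selectEntered.AddListener(_ => Debug.Log("Grabbed"));
    }
}
```

In practice you would configure these components in the Inspector via the Starter Assets Prefabs rather than at runtime; the code form just makes the event flow explicit.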

>access_file_
1077|blog.unity.com

VR for everyone: Accessible game design tips from Owlchemy Labs

Over a billion people experience disability globally, and many are gamers. 30% of gamers in the U.S. identify as disabled, yet 66% say they face barriers or issues related to gaming. Fortunately, this situation is starting to change.

From Tribe Games and Owlchemy Labs to Insomniac and Naughty Dog, studios of all sizes are creating more accessible gaming experiences. Today, 70% of all players use accessibility features built into games, whether they have a disability or not. Players want flexibility, and accessible game design can provide that.

Owlchemy Labs, the studio behind titles like Job Simulator, Rick and Morty: Virtual Rick-ality, and, most recently, Cosmonious High, champions accessibility in VR. In June, they introduced Cosmonious High’s first accessibility update, with a range of updated gameplay options, including a one-handed control mode, features to accommodate seated players, colorblindness enhancements, an immersive subtitling system, and more.

Andrew Eiche (chief operating owl and cable slinger), Jazmin Cano (accessibility product manager), and Peter Galbraith (accessibility engineer) joined Unity’s Hasan Al Salman on Twitch to discuss the update. Read on to learn how this innovative studio built a culture of accessibility, get tips you can apply to your own games, or watch the full stream below.

Their accessibility statement explains: “At Owlchemy Labs, we believe deeply in making VR for everyone! Improving our accessibility helps us achieve that goal.” The studio has built a strong, accessibility-first culture that every Owl experiences from their first day of onboarding.

“There’s a huge developer documentation page, which is fantastic. It has a fabulous guide on accessibility,” says Jazmin. “There are tools, learning resources, and examples of how Owlchemy approaches games with this thinking.
From day one, it shows everyone at Owlchemy how important this is.”

Conversations about accessibility at the studio aren’t relegated to specific Slack threads and channels, but are held openly everywhere. “It’s really important for everyone to see what’s going on in the industry and even just learn about it as we develop,” says Jazmin.

Owlchemy Labs considers every gameplay element through the lens of universal design. Where possible, each feature is built to be used easily by anyone, without having to enable specific accessibility settings from a menu. “There’s a great saying that goes: ‘Design for one, extend to many,’” says Jazmin. “When you create something that’s accessible for one person, it’s likely going to benefit more people than you had in mind.”

The team considers accessibility from the start and draws on learnings from previous projects, which makes it easier to implement or iterate on new gameplay features. “We do a lot to think about these things from the beginning as much as we can,” says Andrew. “We’re always improving and getting better, which is why we created the accessibility update. But having that thought process from the beginning makes the whole process significantly easier.”

Accessibility options in Cosmonious High generally aren’t hidden in menus. To play in one-handed mode, you can just turn off your second controller and start playing. Peter Galbraith, the team’s accessibility engineer, shares how Owlchemy Labs adapted features like the Powers Menu, the way you select various VR superpowers, for one-handed mode. “Previously, you would have to tap the back of your hand and it would pull up a radial menu of your powers. With the new accessibility update, you can just double tap, and it opens up the menu so you’re good to go.”

Players can grab objects telekinetically by gesturing toward them and pulling them with a flick of the wrist. “You don’t have to reopen your hand and get the exact timing when it hits right.
It’s a really nice way to make you feel powerful, while making it easy to identify and grab what you want,” Peter says.

One-handed mode obviously helps players who don’t have use of both hands, but it has subtle benefits for players who do. “When you design for one use case, you can actually end up solving for a lot of different situations,” says Jazmin. “You can play Cosmonious High while holding a drink, or a snack, or a pet. Maybe only one of your controller’s batteries is charged, so you only have one to play with. If we didn’t have this one-handed mode, in these situations, you just wouldn’t be able to play at all!”

During the stream, one viewer asked what makes VR games inaccessible to players who use wheelchairs. “Imagine looking around your own room. All the things that are more than an arm’s length above your head – all of those are inaccessible,” says Andrew. “Imagine if you’re a person who isn’t capable of leaning or moving in your chair, so literally all you can do is stick your arms out in front of you and move them. Those are the kinds of things that we have to consider for wheelchair accessibility.”

One-handed mode is one way to remove barriers for seated players, but Owlchemy Labs has also implemented other features to ensure players of all heights and abilities can explore the halls of Cosmonious High. For example, every surface in the game functions like a standing desk, with a handle you can adjust to change the height. Players can also dynamically change their own height in-game using Small student mode, allowing them to reach areas they otherwise couldn’t, without digging through height sliders and other toggles.

Cosmonious High has been praised for its distinctive, colorful visuals. However, Owlchemy Labs was careful to ensure the game remains completely accessible to players with different types of colorblindness. “We have these puzzles where players have to match up different crystals,” explains Peter. “Each has patterns and shapes in addition to colors.
Blue triangles connect to blue triangles, yellow squares connect to yellow squares – that way no puzzle or feature is entirely reliant on color alone.”

Owlchemy Labs uses Colorblind Effect from the Unity Asset Store to simulate what a scene would look like for players with the three most common types of color blindness. See the tool in action below.

Owlchemy Labs has put a huge amount of work and research into the subtitling system for their games, which they believe is among the best in the industry for XR. Cosmonious High’s subtitles are embedded into the HUD, and feature the name, image, and pronouns of the speaker, as well as an arrow pointing in their direction that adjusts based on the player’s position.

“The big thing is, unlike television or a 2D view where you can just pop things on the bottom of the screen, we don’t want players to be forced to look at a character when they’re playing,” says Andrew. “But we also want players to know where that character is, so that’s where that little arrow design comes from. We want everyone to have the same level of fidelity that hearing players get from the game’s spatialized audio.”

Owlchemy’s subtitling features ended up being useful for developers, too. “A lot of our developers play without the audio on because they want to listen to music – they just want to hit play, make sure that all the subtitle timings are lined up, and not hear the chaos,” says Andrew. “And now they can do that.” For more information on Owlchemy Labs’ subtitles, check out their talk, “Subtitles in XR: A Practical Framework.”

Owlchemy Labs regularly conducts interviews and feedback sessions and performs user testing with the disability community. VR is a physical medium by nature, so in-person testing is ideal. “A player’s body, their range of motion, and their physicality are all important considerations,” says Peter.

“In-person feedback is super valuable,” agrees Jazmin.
“Not only do you get direct feedback, but you also get feedback through body language. If someone’s scratching their head, maybe they’re a little confused. There’s a lot of nonverbal feedback that you get from meeting with someone.”

During the pandemic, Owlchemy Labs began conducting more player research remotely on video calls – now, they do a mix of both. Reaching out and building community online via channels like Discord means they’ve been able to reach even more players in the accessibility community.

“There’s a really important saying in the accessibility community: ‘Nothing about us, without us,’” says Jazmin. “It’s important to respect that statement. Listening to people with disabilities is a must. We have to have diverse voices for this work to actually work.”

Closing out the stream, Owlchemy Labs offered advice on implementing accessibility features in your projects – and why you should consider it part of your game design.

Jazmin: “Whether you’re making a game right now, or you’re about to launch, or you’ve already launched, it’s never too late to add accessibility. Accessibility is a journey: There’s a lot to learn, a lot to explore, and a lot to try out. It’s never too late! For example, Cosmonious High launched before our accessibility update. You can always do more.”

Peter: “I echo exactly what Jazmin was saying: It’s never too late to add accessibility. It benefits everyone, not just the people that you assume to have accessibility needs. When you improve accessibility, you are improving your game not just for players with disabilities, but for every one of your players.
And the sooner you start thinking about that, the more accessibility features you get to help everyone share virtual reality.”

Andrew: “If, for some reason, you need one last bottom-line way of convincing the folks in your organization that accessibility matters, let me remind you that the entertainment market is a very crowded market: Finding a good niche is always a benefit to you. When you cater to a community and you show that you care, that community is going to respond in kind. It’s going to create a market for you, and it’s going to increase the number of people who are capable of playing your games. But if you want to do it altruistically, which is where, hopefully, Owlchemy expresses itself – we think that it’s a really, really important goal.”

Cosmonious High is available now on Meta Quest 2 and SteamVR. To learn how to make your own games accessible, check out our Unity Learn course, Practical Game Accessibility.
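The directional subtitle indicator the Owlchemy team describes, an arrow that points toward the current speaker relative to the player's view, can be sketched roughly like this. This is not Owlchemy's actual implementation; all component and field names here are hypothetical.

```csharp
using UnityEngine;

// Sketch of a directional subtitle arrow: compute where the speaker
// is relative to the player's gaze and rotate a HUD arrow to match.
public class SubtitleArrow : MonoBehaviour
{
    public Transform playerHead;   // e.g., the XR camera transform
    public Transform speaker;      // the currently talking character
    public RectTransform arrow;    // HUD arrow element

    void Update()
    {
        // Direction to the speaker, flattened to the horizontal plane.
        Vector3 toSpeaker = speaker.position - playerHead.position;
        toSpeaker.y = 0f;

        Vector3 forward = playerHead.forward;
        forward.y = 0f;

        // Signed angle is negative when the speaker is to the left.
        float angle = Vector3.SignedAngle(forward, toSpeaker, Vector3.up);

        // Rotate the 2D HUD arrow to point toward the speaker.
        arrow.localRotation = Quaternion.Euler(0f, 0f, -angle);
    }
}
```

A production system would also smooth the rotation and hide the arrow when the speaker is already in view, but the core of the feature is this one signed-angle calculation.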

>access_file_
1078|blog.unity.com

Create spellbinding visual effects with our advanced VFX guide

The sparks from a magic spell, plumes of smoke, ultraviolet or electric blue energy bolts, city lights seen through mist or rain, open fields of swaying grass... It’s hard to imagine a modern game without the evocative power of visual effects.

Visual effects are key to creating deeply immersive experiences for your players. And thanks to continuous hardware advancements, what used to be available only for Hollywood blockbusters can now be attained in real-time.

VFX Graph is one of several major toolsets in Unity that let artists and designers create with little or no coding. With its node-based visual logic, you can create any number of simple to complex effects for projects across genres.

Our new 120-page e-book, The definitive guide to creating advanced visual effects in Unity, guides artists, designers, and programmers using the Unity 2021 LTS version of VFX Graph. Use it as a reference for producing richly layered, real-time visual effects for your games.

VFX Graph creates GPU-accelerated particle systems, and therefore requires compute shader support on target devices. It works with the Universal Render Pipeline (URP, including the 2D Renderer) and the High Definition Render Pipeline (HDRP).

Compared to the Built-in Particle System, VFX Graph can drive more particles with faster simulation, customizable behaviors, extensibility, Camera Buffer access, and native Shader Graph integration. You can use any custom shader created in Shader Graph to target VFX Graph. These shaders can use new lighting models like HDRP hair or fabric, and can even modify particles at the vertex level to enable effects like birds with flapping wings, wobbling particles like soap bubbles, and much more.

The VFX Graph e-book is as beautiful to look at as it is inspiring and informative.
Created in collaboration with Wilmer Lin, a veteran VFX artist from the film and games industries, and internal experts on the Unity Graphics team, it’s generous in scope, level of detail, thoughtful instruction, images and videos, and numerous downloadable resources and references for VFX authoring in Unity. Let’s take a quick look at what’s in the guide.

Get a thorough understanding of each part of the VFX Graph, starting with the VFX Graph Asset and component, and the VFX Graph window. Learn how to create logic with Systems, Contexts, Blocks, Properties, Operators, Blackboards, Subgraphs, Events, Attributes, and more.

Visual effects often involve many moving pieces. Connecting them to the correct points in your application is essential to integrating them at runtime. You’ll learn about the available tools for playing back an effect and how to use them:

- Event Binders: These listen for several different things that happen in your scene and react to specific actions at runtime.
- Timeline: Sequence visual effects with Activation Tracks to send events to your graph at select moments. Gain precise control with pre-scripted timing (e.g., playing effects during a cutscene).
- Property Binders: These link scene or gameplay values to the Exposed properties on your Blackboard so that your effects react to changes in the scene, in real-time.

Colorful swarms of Particle Strips, explosive effects for a crashing Meteorite, and an extra slimy GooBall: These are just a few of the effects you’ll find in the Visual Effect Graph Samples (HDRP). Each sample highlights different scenarios involving the VFX Graph.
For a better understanding, this section of the e-book examines how some of these samples were created, namely through the use of:

- Shader Graph and VFX Graph together
- GPU Events to trigger other systems in the same graph
- Organic movement added to Particle Strips via the Noise Operator, plus the available Blocks for customizing each Particle Strip’s texture mapping, spawning, and orientation
- A single graph that drives other graphs in a visual effect
- A Spawn Context that triggers many other effects
- Experimental mesh sampling to fetch data from a mesh and include the result in the graph

See the e-book for more clips that show the different samples, including the following introduction to the GooBall scene.

Effects aren’t isolated in a vacuum. Often you’ll need to supply them with external data to achieve your intended look. What if you want a genie to emerge from a magic lamp? Or you’d like to integrate a hologram? While you can accomplish much of this with math functions and Operators, you might need the effect to interact with more complex shapes and forms.

This section explains how to use three Data types supported in Unity to enhance your visual effects: Point Caches, Signed Distance Fields, and Vector Fields. Other tools you’ll learn about are the VFXToolbox, which features additional tools for Unity VFX artists, and Flipbook Texture Sheets, which bake animated effects into a sprite sheet.

Other chapters in the guide cover optimization techniques for visual effects, future developments for VFX Graph, and finally, a long list of tutorials and videos. We’re thrilled to offer you this valuable resource, which is free to download (as all of our technical e-books are). Please don’t hesitate to share your feedback with us in this forum.

For a full list of available Unity e-books, check out the How-to hub or browse the documentation under Working in Unity > Best practices guide.
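The runtime hookups covered in the guide, Events sent to a Spawn Context and Exposed properties on the Blackboard, ultimately boil down to a couple of calls on the `VisualEffect` component. This is a minimal sketch: `SetFloat` and `SendEvent` are the real API, while the property name "Intensity" and event name "OnBurst" are hypothetical and would need to be defined in your graph.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Minimal sketch: driving a VFX Graph from gameplay code, which is
// what Property Binders and Timeline tracks do under the hood.
public class ExplosionDriver : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;

    public void TriggerBurst(float strength)
    {
        // Exposed Blackboard properties are set by name...
        vfx.SetFloat("Intensity", strength);

        // ...and Events wired to a Spawn Context are sent by name.
        vfx.SendEvent("OnBurst");
    }
}
```

Property Binder components achieve the same effect declaratively in the Inspector; the code form is useful when the value comes from gameplay logic.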

>access_file_
1079|blog.unity.com

The making of Enemies: The evolution of digital humans continues with Ziva

From The Heretic’s Gawain to Louise in Enemies, our Demo team continues to create real-time cinematics that push the boundaries of Unity’s capabilities for high-fidelity productions, with a special focus on digital humans.

The pursuit of ever more realistic digital characters is endless. Since the launch of Enemies at GDC 2022, we have continued our research and development into solutions for better and more believable digital humans, in collaboration with Unity’s Graphics Engineering team and commercially available service providers specializing in that area.

At SIGGRAPH 2022, we announced our next step: replacing the heavy 4D data playback of the protagonist’s performance with a lightweight Ziva puppet. This recent iteration integrates Ziva animation technology with the latest in Unity’s graphics advancements, including the High Definition Render Pipeline (HDRP) – all with the aim of further developing an end-to-end pipeline for character asset creation, animation, and authoring.

Along with the launch of a new strand-based Hair Solution and an updated Digital Human package, the Enemies real-time demo is now available to download. You can run it in real-time and experience it for yourself, just as it was shown at Unite 2022. While the cinematic may not appear too different from the original, its final rendered version shows how the integration of Ziva technology has brought a new dimension to our protagonist.

Ziva brings decades of experience and pioneering research from the VFX industry to enable greater animation quality for games, linear content production, and real-time projects. Its machine learning (ML)-based technology helps achieve extraordinary realism in facial animation, as well as in body and muscle deformations.

To achieve the level of realism in Enemies, Ziva used machine learning and 4D data capture, which goes beyond the traditional process of 3D scanning actors.
The static, uneditable 4D-captured facial performance has now been transformed into a real-time puppet with a facial rig that can be animated and adjusted at any time – all while maintaining high fidelity. Our team built on that 4D capture data and trained a machine-learned model that could be animated to create any performance. The end result is a 50 MB facial rig that has all the detail of the 4D-captured performance, without having to carry its original 3.7 GB of weight. This means you can replicate the results with a fraction of the animation data, creating real-time results in a way that 4D playback does not typically allow.

To achieve this, Unity’s Demo team focused on:

Creating the puppet

To create this new version of Louise, we worked with the Ziva team. They handled the machine learning workflow using a preexisting 4D data library. Additional 4D data was collected from a new performance by the original Enemies actor (we only needed to collect a few additional expressions). This is one of the unique advantages of our machine learning approach. With this combined dataset, we trained a Ziva puppet to accurately reproduce the original performance. We could then alter this performance in any way, from tweaking minute details to changing the entire expression.

Using the 4D data capture through machine learning, we could enable any future performance to run on any 3D head, demonstrated by applying a single performance to multiple faces of varying proportions. This makes it easier to expand the range of performances to multiple actors and real-time digital humans in future editions.

The puppet’s control scheme

Once the machine learning was completed, we had 200–300 parameters that, used in combination and at different weights, could recreate everything we had seen in the 4D data with incredible accuracy. We didn’t have to worry about a hand-animated performance looking different when used by a group of different animators.
The persona and idiosyncrasies of the original actor come through no matter how we choose to animate the face. Because Ziva is based on deformations rather than an underlying facial rig, we could manipulate even the smallest detail: the trained face uses a control scheme developed to take advantage of the fidelity of the machine-learned parameters. At this point, creating a rig is a relatively flexible process, as we can simply tap into those machine-learned parameters, which in turn deform the face. There are no joints in a Ziva puppet, besides the basic logical face and neck joints.

So what does this all mean?

There are many advantages to this new workflow. First and foremost, we now have the ability to dynamically interact with the performance of the digital human in Enemies. This allows us to change the character’s performance after it has already been delivered. Digital Louise can now say the same lines as before, but with very different facial expressions. For example, she can be friendlier or angrier, or convey any other emotion that the director envisions.

We are also able to manually author new performances with the puppet – facial expressions and reactions that the original actress never performed. If we wanted to develop the story into an interactive experience, it would be important to expand what the digital character can react to, such as a player’s chess moves, with nuances of approval or disapproval.

For the highest level of fidelity, the Ziva team can even create a new puppet with its own 4D dataset. Ziva also recently released a beta version of Face Trainer, a product built on a comprehensive library of 4D data and ML algorithms. It can be used to train any face mesh to perform the most complex expressions in real-time without any new 4D capture.

Additionally, it is possible to create new lines of dialogue at a fraction of the time and cost that the creation of the first line required.
We can do this either by having the original actress perform additional lines with a head-mounted camera (HMC) and using that data to drive the puppet, or by having another performer deliver the new lines and retargeting their HMC data to the existing puppet.

At SIGGRAPH Real-Time Live! we demonstrated how to apply the original performance from Enemies to the puppet of another actress – ultimately replacing the protagonist of the story with a different person, without changing anything else. This performance was then shown at Unite 2022 during the keynote (segment 01:03:00), where Enemies ran on an Xbox Series X, with DX12 and real-time ray tracing.

To further enhance the visual quality of Enemies, a number of HDRP systems were leveraged. These include Shader Graph motion vectors, Adaptive Probe Volumes (APV), and, of course, hair shading. Enemies also makes use of real-time ray tracing in HDRP and Unity’s native support for NVIDIA DLSS 2.0 (Deep Learning Super Sampling), which enable it to run at 4K image quality comparable to native resolution. All of these updated Unity features are now available in Unity 2022 LTS.

The brand-new strand-based Hair Solution, developed during the creation of the Enemies demo, can simulate individual hairs in real-time. This technology is now available as an experimental package via GitHub (requires Unity 2020.2.0f1 or newer), along with a tutorial to get started. By integrating a complete pipeline for authoring, simulation, shading, and rendering hair in Unity, this solution is applicable to digital humans and creatures, in both realistic and stylized projects. The development work continues with a more performant solution for hair rendering enabled by the upcoming Software Rasterizer in HDRP.
We are also diversifying the authoring options available by adopting and integrating the Wētā Wig tool for more complex grooms, as showcased in the Lion demo.

Expanding on the technological innovations from The Heretic, the updated Digital Human package provides a realistic shading model for the characters rendered in Unity. Such updates include:

- A better 4D pipeline
- A more performant Skin Attachment system on the GPU for high-density meshes
- More realistic eyes with caustics on the iris (available in HDRP as of Unity 2022.2)
- A new skin shader, built with the available Editor technology
- Tension tech for blood flow simulation and wrinkle maps, eliminating the need for a facial rig

And as always, there is more to come. Discover how Ziva can help bring your next project to life. Register your interest to receive updates or get early access to future Ziva beta programs. If you’d like to learn more, you can contact us here.

>access_file_