// transmission.log

Data Feed

> Intercepted signals from across the network — tech, engineering, and dispatches from the void.

1692 transmissions indexed — page 26 of 85

[ 2025 ]

20 entries
504|blog.unity.com

Scaling industrial AR: How Visometry uses Unity and model tracking on the factory floor

From precision tracking to quality inspection, Visometry and Unity are enabling next-gen augmented reality across manufacturing and engineering.

Visometry GmbH is a deep-tech company based in Germany, specializing in enterprise-grade augmented reality (AR) solutions for industrial use. A spin-off from the renowned Fraunhofer Institute, Visometry is best known for its powerful model tracking SDK, VisionLib, which enables precise, real-time object tracking in complex industrial environments. From automotive to mechanical engineering, Visometry’s tools help companies deploy scalable AR applications that drive efficiency, quality, and innovation.

In this article, Visometry explores how they are leveraging Unity’s real-time 3D engine to bring advanced tracking and 3D visualization to life, from inspection systems like Twyn to immersive interactive experiences. This approach lets Visometry streamline industrial AR development and make high-performance solutions accessible to developers, integrators, and manufacturers alike.

Industrial AR today: From pilot to production

AR has quietly transformed the industrial world. Over the past decade, industrial AR has progressed from experimental trials to mission-critical tools that are reshaping how companies inspect, maintain, and interact with complex machinery and infrastructure. As the sector confronts new challenges – rising demand, an aging workforce, and the constant push for precision – AR offers compelling answers: improved operational efficiency, fewer errors, and smarter use of skilled labor.

Visometry has been part of this evolution from the beginning. Today, the company focuses on a critical layer of industrial AR: enterprise-ready object tracking. This core technology enables the precise alignment of 3D content and data layers with real-world machines and components.
Enabling precise AR with VisionLib

At the heart of many AR solutions is VisionLib, a software development kit (SDK) built on Unity that brings industrial-strength object tracking to AR applications. Already in use by leading companies across the automotive and mechanical engineering sectors, VisionLib provides stable, real-time, markerless tracking of one or more physical objects, even under challenging lighting or motion conditions.

Using CAD and 3D data, VisionLib enables high-precision pose estimation that anchors digital content directly onto real-world components. This technique, known as model tracking, has become a foundational capability for industrial AR, enabling developers to overlay visual data accurately, without manual alignment or markers. Unlike consumer-grade AR, which typically relies on simultaneous localization and mapping (SLAM)-based camera tracking and can suffer from content drift, model tracking ensures precise placement of 3D elements even in dynamic or reflective environments. That level of precision is key for industrial use cases such as quality inspection, guided assembly, and training.

From SDK to solution: Twyn for AR-based quality inspection

While VisionLib is designed for developers and platform integrators, Visometry also brings AR directly to manufacturing teams through Twyn, a ready-to-use AR inspection solution. Twyn enables interactive, CAD-based quality inspection on the shop floor, allowing teams to detect deviations between digital models and physical parts quickly and flexibly. By combining VisionLib’s tracking with Unity’s visualization engine, Twyn delivers enterprise-ready performance in an intuitive interface, even for teams without deep expertise in 3D graphics or AR development.

This approach exemplifies Visometry’s mission: democratize access to industrial-grade AR by abstracting complexity and enabling other organizations to offer scalable solutions that work out of the box.
Why Unity: Developer speed and cross-platform reach

From the outset, VisionLib has been designed with developers in mind. Leveraging Unity’s plugin architecture, the VisionLib SDK integrates seamlessly into the Unity ecosystem, enabling developers to create industrial-grade AR applications with precise object tracking and robust anchors for digital content, all managed directly from the Unity tech stack.

Unity has long played a pivotal role in democratizing model tracking, particularly through its ARFoundation framework, which bridges platform-specific AR modules like ARKit and ARCore. Building on this foundation, the VisionLib SDK extends Unity’s capabilities to meet the demands of industrial applications. Designed for enterprise-grade augmented reality, VisionLib combines CAD data with advanced image processing to deliver true 3D object tracking for mixed and augmented reality scenarios.

Unity developers can harness this technology directly within the Unity Editor to craft high-quality XR experiences. The VisionLib SDK also integrates with ARFoundation, enhancing interoperability and expanding its reach. This streamlined workflow empowers developers to rapidly prototype, efficiently implement AR features, and deploy across a wide range of devices, including smartphones, tablets, AR glasses, and stationary industrial hardware.

Real-world AR: A closer look at Atelier Markgraph

One standout industrial use case is Atelier Markgraph, a design studio specializing in spatial communication. For years, they’ve been using AR to stage products and complex systems in dynamic, interactive ways, often at high-profile events. Working with Mercedes-Benz, Markgraph created AR-driven experiences to showcase vehicles and engines.
Rather than display a static engine block, they used VisionLib and Unity to deliver layered, explorable, and interactive 3D visualizations.

“It’s super complex to physically cut through an engine,” explains Christoph Diederichs, Head of Interactive Experiences at Markgraph. “And it’s difficult to convey engineering finesse just by presenting a block of steel – it’s not interactive, it’s not dynamic.”

According to Diederichs, AR makes it possible to add an interactive digital 3D layer to the real world, enabling spectators “to grasp” things that are otherwise hard to communicate in a short amount of time. Tracking and interacting with a real object delivers the “tangible feel” and appeal of a real engine, while AR makes the invisible visible.

“In our industry, time is always a critical factor,” says Diederichs. This is especially true during the final stages of a project, when everything needs to come together under pressure: 3D models need to be swapped and last-minute change requests incorporated. “At one event, there was virtually no time for testing – and it just had to work,” Diederichs recalls. The new engine, still a closely guarded secret until its public unveiling, could only be viewed and tested once in advance. The rest had to be developed under lab conditions only.

The future of industrial AR: From guidance to validation

According to Jens Keil, founder and product manager at Visometry, AR in industrial settings is entering a new phase. The early days were about enabling tracking. Then came platforms for scaling content. Now, there is a shift toward using AR not just for guidance, but for verification. That means going beyond simply showing what to do, and instead confirming whether a task was performed correctly.
VisionLib is already ahead of the curve with multi-model tracking, which can independently track and validate multiple objects or subcomponents within a larger assembly. In one example, Visometry highlights how its tracking tech automatically detects and flags misaligned components on steel structures. Combined with advances in AI and machine vision, this approach moves AR toward more intelligent, closed-loop validation tools.

“We’re exploring ways to enable users to combine AR, AI, and other techniques,” says Keil. “Our goal is to democratize access through simple developer tools and platform-independent solutions.”

Visometry is not just creating tools for today’s AR but building a foundation for the next generation of spatial computing, where AR, AI, and computer vision come together to empower industry-wide transformation.
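The closed-loop validation idea described above boils down to comparing where a tracked component actually sits against where the CAD model says it should sit, and flagging anything outside tolerance. The sketch below illustrates that comparison in plain Python; it is a toy, not VisionLib's API — the pose representation, tolerance values, and all names are assumptions for illustration only.

```python
import math

def mat_transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def pose_deviation(expected, tracked):
    """Return (translation error in mm, rotation error in degrees).

    A pose here is (position_xyz_mm, 3x3 rotation matrix), the kind of
    per-component result a model tracker might report for an assembly.
    """
    (pe, re_), (pt, rt) = expected, tracked
    t_err = math.dist(pe, pt)
    # Relative rotation: how far the tracked orientation is from expected.
    r_rel = mat_mul(mat_transpose(re_), rt)
    trace = r_rel[0][0] + r_rel[1][1] + r_rel[2][2]
    angle = math.degrees(math.acos(max(-1.0, min(1.0, (trace - 1.0) / 2.0))))
    return t_err, angle

def flag_misaligned(components, t_tol_mm=2.0, r_tol_deg=1.5):
    """Flag every component whose tracked pose deviates beyond tolerance."""
    flagged = []
    for name, expected, tracked in components:
        t_err, r_err = pose_deviation(expected, tracked)
        if t_err > t_tol_mm or r_err > r_tol_deg:
            flagged.append((name, round(t_err, 2), round(r_err, 2)))
    return flagged

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# A 10-degree yaw: clearly misaligned relative to the CAD pose.
YAW10 = [[math.cos(math.radians(10)), -math.sin(math.radians(10)), 0.0],
         [math.sin(math.radians(10)),  math.cos(math.radians(10)), 0.0],
         [0.0, 0.0, 1.0]]

parts = [
    ("girder_A", ((0, 0, 0), IDENTITY), ((0.5, 0, 0), IDENTITY)),  # within tolerance
    ("girder_B", ((0, 0, 0), IDENTITY), ((5.0, 0, 0), YAW10)),     # shifted and rotated
]
print(flag_misaligned(parts))
```

The tolerances would in practice come from the inspection plan, and a real system would derive the expected pose from the CAD model rather than hard-code it; the point is only that "validation" is a per-component pose comparison, not just an overlay.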

>access_file_
508|blog.unity.com

Games made with Unity: June 2025 in review

Plenty of successful games are made with Unity, and the list keeps growing. PEAK, from Landfall and Aggro Crab, sold over 2 million copies in just 9 days. Big releases like Broken Arrow and the 1.0 launch of Len’s Island show how teams of all sizes are shipping ambitious games with Unity. Here's a look at some of the most recent ones. Also, don't miss some of the great trailers from the summer showcases, or the demos that debuted at Next Fest this month.

IGF Awards

Huge congrats to all the IGF finalists, especially the games made with Unity that dominated the awards this year, including Consume Me, which took home three wins! Fresh off their Audience Award win at the IGF Awards, The WereCleaner team joined us on stream. Check it out.

Made with Unity Steam Curator Page

Once again we sent out a clarion call for Unity staff to share which of your games they've been playing this past month. Be sure to see them all on our Steam Curator Page. Working on a game in Unity? We’d love to help you spread the word. Be sure to submit your project.

Without further ado, to the best of our abilities, here’s a non-exhaustive list of games made with Unity and launched in June 2025, either into early access or full release.
Add to the list by sharing any that you think we missed.

Action
Shotgun Cop Man, DeadToast Entertainment (May 1)
Deliver At All Costs, Studio Far Out Games (May 22)
Pipistrello and the Cursed Yoyo, Pocket Trap (May 28)

Bullet Heaven
Bioprototype, Emprom Game (May 19)
Broventure: The Wild Co-op, Alice Games (May 15)
Tower of Babel: Survivors of Chaos, NANOO (May 19 – early access)

Cards, dice, and deckbuilders
Monster Train 2, Shiny Shoe (May 21)
Into the Restless Ruins, Ant Workshop Ltd (May 15)

Casual, rhythm, and party
Wordatro!, Le Poulet (June 23)
POPUCOM, Hypergryph (June 1)

City and colony builder
Kity Builder, Sambero, irx99, YerayToledano, Juan Hust (June 17)
MEMORIAPOLIS, 5PM Studio (April 30)

Comedy
Pick Me Pick Me, Optillusion (May 28 – early access)

Experimental or surrealist
ENA: Dream BBQ, ENA Team (March 27)

FPS
Bloodshed, com8com1 Software (May 22)
GRIMWAR, BookWyrm (May 16)
Noga, Ilan Manor (May 30)

Horror
LiDAR Exploration Program, KenForest (April 2)
White Knuckle, KenForest (April 17 – early access)
The Boba Teashop, Mike Ten (April 21)
Out of Hands, Game River (April 22)
Darkwater, Targon Studios (April 22 – early access)

Management and automation
Supermarket Simulator, Nokta Games (June 19)
Plant Nursery Simulator, Robot Assembly (June 16 – early access)

Metroidvania
Oirbo, ImaginationOverflow (February 11 – early access)
SteamDolls - Order Of Chaos, The Shady Gentlemen (February 11 – early access)

Narrative and mystery
Replicomica, enyevg (June 16)
Duck Detective: The Ghost of Glamping, Happy Broccoli Games (May 22)
Beholder: Conductor, Alawar (April 23)

Platformer
PEAK, Landfall and Aggro Crab (June 16)
Blessed Burden, Podoba Interactive (June 18)
Once Upon A Puppet, Flatter Than Earth (April 23)
PEPPERED: an existential platformer, Mostly Games (April 7)
Ninja Ming, 1 Poss Studio (April 10)
Seafrog, OhMyMe Games (April 15)

Puzzle adventure
Squeakross: Home Squeak Home, Alblune (June 7)
Strings Theory, Beautiful Bee (Console release)
Kathy Rain 2: Soothsayer, Clifftop Games (May 20)
Poco, Whalefall (May 20)
Axona, Onat Oke (May 28)
Projected Dreams, Flawberry Studio (May 29)
Elroy and the Aliens, Motiviti (April 2)
Leila, Ubik Studios (April 7)
Tempopo, Witch Beam (April 17)
BOKURA: planet, ところにょり (April 24)
Amerzone - The Explorer's Legacy, Microids Studio Paris (April 24)

Roguelike/lite
Lost in Random: The Eternal Die, Stormteller Games (June 17)
Nightmare Frontier, Ice Code Games (June 16)

RPG
RAIDOU Remastered: The Mystery of the Soulless Army, ATLUS (June 19)
Yaoling: Mythical Journey, RAYKA STUDIO (June 19)
BitCraft Online, Clockwork Laboratories, Inc. (June 21 – early access)

Sandbox
A Webbing Journey, Fire Totem Games (May 19 – early access)
Islands & Trains, Akos Makovics (May 29)

Simulation
Cast n Chill, Wombat Brawler (June 16)
Liquor Store Simulator, Tovarishch Games (May 2)
Doloc Town, RedSaw Games Studio (May 7)
Tales of Seikyu, ACE Entertainment (May 21 – early access)
Trash Goblin, Spilt Milk Studios Ltd (May 28)

Sports and driving
The Last Golfer, Pixel Perfect Dude (May 28)
Turbo Takedown, Hanging Draw (March 3)

Strategy
Broken Arrow, Steel Balalaika (June 19)
Tower Dominion, Parallel 45 Games (May 7)
9 Kings, Sad Socket (May 23 – early access)

Survival
Len’s Island, Flow Studio (June 19)
Survival Machine, Grapes Pickers (May 7 – early access)
Oppidum, EP Games® (April 25)

That’s a wrap for June 2025. Want more Made with Unity and community news as it happens? Don’t forget to follow us on social media: Bluesky, X, Facebook, LinkedIn, Instagram, YouTube, or Twitch.

>access_file_
510|blog.unity.com

Discover data-driven stability and enhancements in 6.2 beta

Our commitment for Unity 6 is to provide a faster, more reliable, and more stable engine. In our latest Unity 6.2 beta release, we’re introducing updates that will help us identify and resolve performance issues with greater speed and accuracy, as well as a new, built-in diagnostics experience for developers to improve game performance on a project-specific level. These updates are enabled by a new developer data framework we’ve introduced with this beta, which is designed to give developers more visibility and control over how their data is shared and used across the Unity ecosystem.

New Diagnostics: Enhanced Observability for Smoother Gameplay

Unity 6.2 introduces enhanced diagnostics features that significantly improve observability and device performance monitoring. With this update, developers gain access to more robust crash reporting capabilities, enabling them to diagnose and resolve performance issues more effectively than before. These reports provide a clearer picture of how games perform across a diverse range of devices, helping developers ensure smoother gameplay for more players.

In your Project Overview in the Unity Dashboard, you can now view diagnostics reports to help you monitor, investigate, and resolve crashes and performance issues that your players may be experiencing.
The experience includes new and more detailed data points for mobile and desktop projects, including ANR (Application Not Responding) monitoring with device and session details for Android projects, as well as new data visualization options to make it easier to view and understand performance trends. No additional package installation is required to get started, and these enhanced diagnostics are freely available to developers rather than offered as a paid add-on.

Addressing Runtime Issues at Scale

The availability of diagnostics data doesn’t just benefit individual developers; it also powers Unity’s internal efforts to identify and address critical engine issues at scale. Improving the performance and stability of the engine requires real, timely insight into how the engine behaves in production – on actual devices, in real-world gameplay. Powering that insight is real-time diagnostics data we use to identify the most impactful fixes. By making the runtime smart about its own performance across all the environments Unity operates in, we’re able to identify critical issues more quickly as they occur, and get solutions into the hands of our users much sooner.

This data is essential to ensuring we can continue investing in making the engine as stable and performant as possible. Starting in this 6.2 beta, all new projects will collect diagnostics data by default. Developers who want to opt out can do so at any time in Project Settings in the Editor.

Introducing the Developer Data Framework

As we introduce more data-driven improvements to our own engine and runtime – as well as the tools and services that allow developers to do more with their data – transparency and control become even more important.
That’s why we’re introducing a new framework that ensures developers have the final say on how and where their data is used in the Unity ecosystem.

The Developer Data framework is Unity’s new approach to data collection, management, and usage across our ecosystem. It empowers you with tools to control Developer Data collected on your behalf – from how it’s gathered, to where it goes, and how it’s used. Whether data comes from the Unity engine, Unity services, or customer-owned sources, it’s considered Developer Data: data you own and control. Unity uses it only as directed and never repurposes it without your explicit permission. You define the rules, so you can rest easy knowing exactly what is shared and how it is used every step of the way.

In the latest 6.2 release, you will now see Developer Data as a settings option in the Unity Dashboard, providing a scalable way to manage your data preferences across projects and services. Until you customize settings, the use of your data is limited to only what’s required to provide the products and services you already use. You can adjust these settings at any time, and they will automatically apply to both current projects and live games.

Ultimately, you control what data is collected, how it’s shared, and how it can be used to power capabilities like machine learning, benchmarking, support, personalized recommendations, and more. This includes governance around how your data can be used to power Unity AI. As a reminder, data related to your use of Unity AI, including prompts, responses, and interactions, will not be used to train AI models unless you enable the Unity AI Developer Data sharing setting for “Improve Unity AI models” in the Unity Dashboard.

The framework helps you align your data strategy with your goals, workflows, and privacy standards – all while unlocking better insights, smoother experiences, and stronger outcomes.
To read more about the Developer Data framework, you can see an in-depth overview here.

Discover new features in the Unity 6.2 beta today

Unity 6 beta releases allow us to share new features faster, so that our community has more opportunities to provide feedback that will help shape the future of our releases. These releases are open to everyone, and you can get started by downloading the latest release from the Unity Hub. Because there may be feature stability issues with early test versions, we do not recommend using beta releases for projects in production, and we highly recommend that you back up any project before opening it with an alpha or beta release.

Connect with our team and provide feedback

You can find a dedicated section in Unity Discussions for this beta update, where you'll find links to detailed documentation and, most importantly, the best place to leave your feedback for our team.

>access_file_
514|blog.unity.com

How dialog-driven video layering shapes the surreal world of ENA: Dream BBQ

ENA: Dream BBQ adapts Joel G’s cult-following animated web series into a surreal interactive adventure game. The game follows ENA as she searches for the mysterious Boss, playing missions and meeting oddball characters in strange worlds created with a mishmash of trippy textures and techniques.

We sat down with Evan Nave, the game’s producer, technical director, and tech artist, and Luke Mirman, who oversees programming, to talk about how a lean team brought this stunningly original (and overwhelmingly popular) game to life.

Let’s start by talking a bit about the game and what you were trying to do.

Evan Nave: It’s an interactive episode of Joel G’s ENA animation series. Since we already used the Unity Engine for animations, it felt natural to extend that out into a fully playable game. Playing as the titular character, ENA, you interact with the world, meet characters, and do things that you’re told to do. Nothing crazy gameplay-wise, but it falls into the vein of LSD Dream Emulator and similar kinds of surreal games.

Why did you decide to bring the web animation series to a video game? How did that come about?

Evan: It started small. Joel is primarily a 2D animator. Adobe Animate was his main medium for creating animations, with some 3D mixed in for a surreal style. Luke and I made a game in college for one of our finals, Somnium, which was inspired by Joel’s animation series. We showed that to him, and he brought us onto the team for ENA since it was clear that we shared a similar artistic vision. That Unity project in college led us into making a full game with Joel.

Let’s talk about interactions – there’s a lot of complex video layering and superimposition of transparencies in the game, and much of that gets synchronized through dialog systems, right?

Evan: Because the animations could take on many different forms, we had a lot of different systems.

Luke Mirman: There are several different ways to run a dialog interaction, but it’s not convoluted just for the sake of it.
They all serve different purposes. If you want to have a cut scene versus a player-directed dialog, they have to behave differently, so they are separate systems. But they all have the same underlying functionality.

How do they work?

Luke: There’s one that I would call “pure dialog,” which is lines of dialog where the player advances from one line to the next. That’s the most typical interaction.

The second type is what I’d call “director-driven dialog,” which is similar, but the player has no agency over it. Usually a video is playing: we add timings to dialog lines in the dialog script itself and sync with that video. So it would play the video, play the audio for it, and when it sees that the video has progressed a certain amount of time, it shows the next line.

The third I would call “timeline sequences,” or cut scenes, which we created with custom video tracks and custom dialog tracks. While the timeline controls the animations and camera motions of the sequence, under the hood the video track is driving the timeline. It’s not in the update cycle – it’s driven by the video player. That’s for a specific reason: if the player system is stuttering the video, we don’t want it to be constantly playing catch-up. So it’s really running the timeline in the timing of the video itself, which enabled us to do more sophisticated things in the timeline.

What were the challenges of working with so many different systems?

Evan: From a tech director perspective, it was very interesting because Joel would come to us with an idea saying, “I have this vision where this happens in the foreground, this happens in the background. The camera then pans, and then another 2D sequence happens on the screen.” So we’d need to plan around the in-game world by matching a pre-rendered or pre-animated video that has transparency.
And that’s why directing the timeline is important, because if the camera pans and that pan is matched in the 2D animation, we need the animation to direct the Unity timeline. A lot of it came down to Joel saying, “Hey, this is my idea. Here’s a storyboard or animatic of it. Is it possible?” And then I would go through it, talk to Luke and our artists, and detail out how we could do it in Unity.

Luke: I think one of my favorite things about working on this project is that Joel doesn’t have much technical experience with Unity or programming. And that actually was really exciting, because he would propose things that are absurd. The best example that’s visible to players is the inventory. Having it be a newspaper that you look through, seeing your current jobs as if they were advertisements, is counterintuitive to how you would typically approach a UI – usually you have a menu with a scroll box, just something functional. When Joel showed me the concept for swapping through the papers, I was so excited. But if you come from a typical background of what you would expect a user experience in a menu to be, you’d probably play it straight and narrow.

Evan: Joel would also propose different art assets or visual styles for systems where we could have reused assets. So there is a lot of extra art that you wouldn’t typically see in a video game. We didn’t want to reuse assets or copy-paste anything. We have a really awesome team who just wants to do neat things, and Joel is a great director who lets each person put in their own thing. That was a special aspect of the game.

It’s a really unique game that required a lot of unorthodox approaches. What are some of the workarounds or solves that you’re most proud of?

Evan: I got to make some non-Euclidean things for levels, and that was really fun. We also had a 3D skybox system. I made this back in college for our game Somnium.
The 3D skybox system is like the Source engine’s, where you have a little box somewhere else in the map, and that’s projected behind all the geometry so your render distance doesn’t have to be as far. We played around with that, so not only is it a functional skybox, but we could also project it onto geometry in the level. We could make a whole wall disappear and look like there’s stuff behind it when really we’re just hiding something – like a big curtain.

The non-Euclidean stuff was definitely interesting. ENA is very surreal and nonsensical, so it lends itself to playing with space. There are some skyboxes that are completely hand-painted, where we do nothing but have a textured sphere around the player that makes it look like there is geometry infinitely far away.

Luke: I’m really proud of how flexible the dialog system is. We use Yarn Spinner as our dialog system. The amount of our documentation for random niche purposes is frankly the part I’ve had the most fun working on. It’s enabling our artists to write dialog that implements the vision for sequences.

It goes back to Joel not necessarily having that much technical experience with game development. He would propose these sequences, and it was such a fun challenge making them work. Sometimes there would be compromises based on what’s reasonable, of course, but incorporating that vision with a unique perspective on implementing things was really fun.

Check out ENA: Dream BBQ on Steam.
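The “director-driven dialog” Mirman describes – dialog lines stamped with timings and gated by the video’s playback clock, so a stuttering video never races ahead of its captions – can be sketched outside Unity in a few lines. This is a toy illustration of that gating idea, not the team’s actual code (which uses Yarn Spinner and Unity’s video and timeline systems); every class and field name here is an assumption made up for the sketch.

```python
from dataclasses import dataclass

@dataclass
class TimedLine:
    at_seconds: float   # video timestamp at which this line should appear
    text: str

class DirectorDrivenDialog:
    """Advance dialog lines from video playback time, not wall-clock time.

    Polled each frame with the video player's current time; because the
    video clock is the single source of truth, a stuttering or paused
    video simply pauses the dialog instead of letting it race ahead.
    """
    def __init__(self, script):
        self.script = sorted(script, key=lambda line: line.at_seconds)
        self.next_index = 0
        self.shown = []

    def tick(self, video_time_seconds):
        # Show every line whose timestamp the video has now passed.
        while (self.next_index < len(self.script)
               and self.script[self.next_index].at_seconds <= video_time_seconds):
            self.shown.append(self.script[self.next_index].text)
            self.next_index += 1
        return self.shown

# Hypothetical script lines, purely for demonstration.
dialog = DirectorDrivenDialog([
    TimedLine(0.0, "ENA: Where am I?"),
    TimedLine(2.5, "???: You're late."),
    TimedLine(6.0, "ENA: ...for what?"),
])
print(dialog.tick(3.0))  # the video has reached 3.0s, so two lines are up
```

The same inversion drives their timeline sequences: instead of the engine’s update loop advancing the sequence and hoping the video keeps up, the video’s reported time drives everything that depends on it.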

>access_file_
516|blog.unity.com

Make user churn history with Aura Remarketing

User churn is one of the biggest challenges app developers face today: 90% of users stop engaging within the first 30 days of downloading an app, and even the most popular apps struggle with churn rates that exceed 80%.* That’s why re-engaging inactive users isn’t just important, it’s essential. It’s about connecting with people who’ve already shown interest in your app and reminding them why they downloaded it in the first place.

This is where Aura Remarketing comes in. Aura provides a direct and effective way to bring users back to your app through smart, targeted notifications that are delivered natively on your users’ devices. By focusing on high-value users, like those who make in-app purchases or play frequently, Aura drives meaningful reactivations and helps you grow your active user base without spending on new user acquisition campaigns.

Why winning back users makes sense

Inactive users aren’t just lost opportunities; they can be some of the most valuable audiences to connect with. They’ve already downloaded your app, indicated intent, and engaged at one point. By targeting them at the right moment with placements that are native and non-intrusive, Aura Remarketing enables you to:

Bring back users at the moments they’re most likely to re-engage
Diversify your UA strategy
Focus on high-value users who have already shown interest in your app

How does it work?

Aura Remarketing uses smarter targeting that helps you reconnect with users in a strategic and impactful way, ensuring the right users receive the right message at the right time. Our targeting includes:

User type: Target users who have downloaded through Aura or through other means.
Demographics: Customize targeting based on opt-in first-party data, such as age, gender, and location.
Device models: Target users who own specific devices (users with high-end devices tend to spend more on apps**).

The targeting gets even better if your app was originally downloaded with Aura.
In this case, you can also target via:

Download recency: Set a customizable minimum number of days since the app was downloaded to ensure recently installed users aren’t targeted.
App engagement: Focus on users based on whether they have opened and interacted with the app after downloading.
Post-install behavior: Establish targeting rules based on post-install activities like app launches, purchases, ad interactions, app opens, registrations, or first-time use, with the option to define specific timeframes for these actions.

What’s in it for you?

Aura Remarketing combines the native and non-intrusive nature of on-device campaigns with the effectiveness and cost efficiency of remarketing campaigns. It also works at scale across multiple OEM and telco partners, uses deep targeting and personalization techniques, and supports sending notifications directly to users through device-level channels. This provides you with the tools to:

Re-engage high-value users: Bring back users who make significant purchases or are valuable to the app’s ecosystem.
Leverage targeted insights: Use knowledge of user behavior within the app to create more precise and effective campaigns.
Maximize investment: Reactivate users you’ve already acquired to grow your active user base without additional acquisition costs.
Tap into high intent: Users who have already downloaded the app are more likely to return and engage.

Revive your relationship with users

Aura Remarketing is built to support re-engagement campaigns that deliver real, measurable results, helping app publishers reconnect with high-value users, improve retention, and maximize ROI. Ready to re-engage and reignite your user base? Contact us today to learn how Aura Remarketing can help you bridge the gap between disengaged users and long-term retention.

FAQs

1. What is Aura Remarketing?
Aura’s on-device remarketing solution reactivates users directly on their mobile devices, offering the tools to boost engagement and bring them back to your app effortlessly.

2. How do I get started with a remarketing campaign?
To launch a remarketing campaign, you’ll need to provide the following:
A campaign performance link for measuring results.
A redirect link to guide users back to the relevant page or feature in your app.
Criteria for the users you want to re-engage.
Your desired bid for the campaign.
Key performance indicators (KPIs) to measure campaign success.

3. What placements are available?
Currently, notifications are available. We are exploring additional placements, including Game Spotlight and In-life App Discovery Experiences, for possible inclusion in future updates.

4. What are Aura Remarketing’s capabilities?
Smart notifications: Re-engage inactive users directly through on-device notifications.
Advanced targeting: Segment users by downloads, demographics (age, gender, device), app engagement (launches, purchases, ads viewed), and post-install behavior.
Custom timeframes: Set time-based conditions, like minimum days since download or specific post-install actions.
Scalable campaigns: Drive high-performance, tailored campaigns at scale with minimal setup effort.

5. Are there any privacy requirements?
At this time, no additional privacy steps are needed on your end. Our platform is carrier-grade compliant and adheres to applicable data privacy and security requirements.

*Source: App Retention Rates (2025), Business of Apps, February 6, 2025.
**A “high-end device” refers to a smartphone or tablet that includes a high-performance processor, substantial RAM and ROM, an advanced display, long-lasting battery, high-quality camera, and functional sensors such as touchscreen, proximity, and accelerometer.
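The targeting rules described above amount to filtering an audience against per-user attributes and recency thresholds. Here is a minimal sketch of that idea in Python; every field name, threshold, and the rule shape are illustrative assumptions, not Aura’s actual API or data model.

```python
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    days_since_download: int
    days_since_last_open: int
    lifetime_purchases: int
    device_tier: str  # e.g. "high-end" or "budget" (assumed labels)

def eligible_for_remarketing(user, min_days_since_download=14,
                             min_days_inactive=7, min_purchases=1,
                             device_tiers=("high-end",)):
    """Apply simple targeting rules: skip recent installs, require
    inactivity, and focus on high-value users on selected devices."""
    return (user.days_since_download >= min_days_since_download
            and user.days_since_last_open >= min_days_inactive
            and user.lifetime_purchases >= min_purchases
            and user.device_tier in device_tiers)

audience = [
    User("a1", days_since_download=30, days_since_last_open=10,
         lifetime_purchases=3, device_tier="high-end"),  # lapsed spender
    User("a2", days_since_download=5, days_since_last_open=5,
         lifetime_purchases=0, device_tier="budget"),    # too recent
]
segment = [u.user_id for u in audience if eligible_for_remarketing(u)]
print(segment)
```

A real system would layer campaign timing and message selection on top of such a segment; the point is only that each rule in the list above translates into one predicate over user attributes.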

>access_file_
519|blog.unity.com

Making a killing: The playful 2D terror of Psycasso®

A serial killer is stalking the streets, and his murders are a work of art. That’s more or less the premise behind Psycasso®, a tongue-in-cheek 2D pixel art game from Omni Digital Technologies that’s debuting a demo at Steam Next Fest this week, with plans to head into Early Access later this year. Playing as the killer, you get a job and build a life by day, then hunt the streets by night to find and torture victims, paint masterpieces with their blood, and sell them to fund operations.

I sat down with lead developer Benjamin Lavender and designer-producer Omni to talk about this playfully gory game that gives a classic retro style a fresh (if gruesome) twist.

Let’s start with a bit of background about the game.

Omni: We wanted to make something that stands out. We know a lot of indie studios are releasing games and the market is ever growing, so we wanted to make something that’s not just fun to play, but catches people’s attention when others tell them about it. We’ve created an open-world pixel art game about an artist who spends his day getting a job, trying to fit into society. Then at nighttime, things take a more sinister turn and he goes around and makes artwork out of his victims’ blood.

We didn’t want to make it creepy and gory. We kind of wanted it to be cutesy and fun, just to make it ironic. Making it was a big challenge. We basically had to create an entire city with functioning shops and NPCs who have their own lives, their own hobbies. It was a huge challenge.

So what does the actual gameplay look like?

Omni: There’s a day cycle and a night cycle that breaks up the gameplay. During the day, you can get a job, level up skills, buy properties and furniture upgrades. At nighttime, the lighting completely changes, the vibe completely changes, there’s police on the street and the flow of the game shifts.
The idea is that you can kidnap NPCs using a whole bunch of different weapons – guns, throwable grenades, little traps and cool stuff that you can capture people with. Once captured on the street, you can either harvest their blood and body parts there, or buy a specialist room to keep them in a cage and put them in various equipment like hanging chains or torture chairs. The player gets better rewards for harvesting blood and body parts this way.

On the flip side, there’s a whole other element to the game where the player is given missions each week from galleries around the city. They come up on your phone menu, and you can accept them and do either portrait or landscape paintings, with all of the painting being done using only shades of red. We’ve got some nice drip effects and splat sounds to make it feel like you’re painting with blood. Then you can give your creation a name and submit it to a gallery, where it goes into a fake auction. People bid on the artwork and you get paid a large amount of in-game money, which you can spend on upgrades for the home and better painting tools – bigger paint brushes, more selection tools, stuff like that.

Ben: There’s definitely nothing like it. And that was the aim: when you’re telling people about it, they’re like, “Oh, okay. Right. We’re not going to forget about this.”

Let’s dig into the 2D tools you used to create this world.

Ben: It’s using the 2D Renderer. The Happy Harvest 2D sample project that you guys made was kind of a big starting point, from a lighting perspective – doing the normal maps in 2D and getting the lighting to look nice. Our night system is a very stripped-down, then added-on version of the thing that you guys made. I was particularly interested in its shadows. The buildings’ shadows aren’t actually shadows – it’s a black light.
We tried to recreate that with all of our buildings in the entire open world – so it does look beautiful for a 2D game, if I do say so myself.

Can you say a bit about how you’re using AI or procedural generation in NPCs?

Ben: I don’t know how many actually made it into the demo, to be fair, number-wise. Every single NPC has a unique identity, as in they all have a place of work that they go to on a regular schedule. They have hobbies, they have spots where they prefer to loiter – a park bench or whatever. So you can get to know everyone’s individual lifestyle. The old man that lives in the same building as me might love to go to the casino at nighttime, or go consistently on a Monday and a Friday, that kind of vibe.

It uses the A* Pathfinding Project, because we knew we wanted to have a lot of AIs. We’ve locked off most of the city for the demo, but the actual size of the city is huge. The police mechanics are currently turned off, but they’re about 80% in there as well. If you punch someone or hurt someone, that’s a crime, and if anyone sees it, they can report it to the police and then things happen. That’s a feature that’s there but not demo-ready yet.

How close would you say you are to a full release?

Omni: We’re scheduled for October for Early Access. By that point we’ll have the stealth mechanics and the policing systems polished and in, and some of the other upcoming features buttoned up. We’re fairly close.

Ben: Lots of it’s already done, it’s just turned off for the demo. We don’t want to overwhelm people, because there’s just so much for the player to do.

Tell me a bit about the paint mechanics – how did you build that?

Ben: It is custom – we built it ourselves completely from scratch. But I can’t take responsibility for that one; someone else did the whole thing – that was their baby. It is really, really cool though.

Omni: It’s got a variety of masking tools, the ability to change opacity and spacing, and you can undo and redo.
It’s a really fantastic feature that gives people the opportunity to express themselves and make some great art.

Ben: And it’s gamified, so it doesn’t feel like you’ve just opened up Paint in Windows.

Omni: Best of all, when you make a painting, it gets turned into an inventory item, so you physically carry it around with you and can sell it or treasure it.

What’s the most exciting part of Psycasso for you?

Omni: Stunning graphics. I think graphically, it looks really pretty.

Ben: Visually, you could look at it and go, “Oh, that’s Psycasso.”

Omni: What we’ve done is taken a cozy retro-style game and brought modern design, logic, and technology into it. So you’re playing what feels like a nostalgic game, but you’re getting the experience of a much newer project.

Check out the Psycasso demo on Steam, and stay tuned for more Next Fest coverage.

>access_file_