// transmission.log

Data Feed

> Intercepted signals from across the network — tech, engineering, and dispatches from the void.

1689 transmissions indexed — page 75 of 85

[ 2019 ]

7 entries
1482|blog.unity.com

What is app-ads.txt?

Recently, we saw ads.txt adoption spread quickly across the industry’s desktop and web inventory, with most savvy brand marketers and demand-side platforms now refusing to buy inventory that doesn’t have ads.txt implemented. Today, as more brands make the move towards in-app advertising, the time has come for ads.txt to make its way to mobile in-app, and this is exactly what’s happening with app-ads.txt. This is an important step towards eliminating certain types of fraud and improving the transparency and efficiency of the overall ecosystem. However, before delving into the intricacies of app-ads.txt, it’s important to understand where exactly it comes from.

Ads.txt

Ads.txt is a text file publishers add to their websites, which lists the ad sources authorized to sell their inventory. In 2017, the Interactive Advertising Bureau (IAB) Tech Lab introduced the ads.txt tool with the aim of preventing unauthorized sales of web inventory. It was released after widespread domain-spoofing issues in which sellers pretended to sell premium inventory (like the Financial Times) that they didn’t actually have access to.

In a fragmented advertising ecosystem, ads.txt serves as a method of improving transparency for demand-side platforms. In fact, DSPs aren’t buying web supply that isn’t authorized via ads.txt. Two million publishers have adopted the spec since its initial launch over one year ago. It has become so widespread that AdExchanger says publishers who have yet to implement ads.txt are actually losing money. On web, ads.txt helped the industry distinguish real supply sources from fake ones, and after its immediate success and adoption, the next logical step was to extend the reach of ads.txt into the mobile app ecosystem.

What is app-ads.txt?

App-ads.txt is a text file app developers upload to their developer website, which lists the ad sources authorized to sell that developer’s inventory.
Just like on web, the IAB created a system which allows buyers to know who is authorized to buy and sell specific in-app ad inventory, and who isn’t.

How does app-ads.txt work for mobile apps?

A DSP looking to bid on app inventory scans the app-ads.txt file on a developer’s website to verify which ad sources are authorized to sell that app’s inventory. The DSP will only accept bid requests from ad sources listed in the file and authorized by the app developer.

How app-ads.txt can benefit your mobile app

The main benefits of app-ads.txt are capturing revenue from brand spend and fighting ad fraud.

Capturing revenue from brand spend

Brands today represent a growing and potentially significant revenue opportunity for developers. We can expect that many DSPs that adhere to app-ads.txt won’t purchase inventory missing the app-ads.txt file, just as they won’t buy unauthorized inventory on the web. Developers who don’t implement app-ads.txt are likely to be removed from DSPs’ pool of targeted media.

Fighting ad fraud

Bad actors may forge apps that impersonate legitimate apps, and mislead DSPs into spending brand budgets on forged inventory. Legitimate developers end up losing out on ad revenue that was originally intended for them. App-ads.txt blocks unauthorized developer impersonations and minimizes instances of fraud that ultimately hurt developers’ bottom line.

App-ads.txt example

So, how can developers make sure their inventory is covered by app-ads.txt?

Step 1. Provide the developer website URL in your app listing. Ensure that your developer website is updated in the app stores. This website will be used by advertising platforms to verify the app-ads.txt file.

Step 2. Reach out to all ad sources. Get in touch with your direct ad sources and ask for their app-ads.txt line, following the IAB’s structure: ad source domain, your publisher ID, type of relationship (direct or reseller), ad source ID. Example: ironsrc.com, 1234, DIRECT, 5678. Your direct demand partners should be listed as “DIRECT.” If your partners are using third-party resellers to sell your inventory, such providers should be listed as “RESELLER.” In any case, you should not add any provider to your app-ads.txt file unless you or your partner have a direct relationship with them.

Step 3. Publish an app-ads.txt file. Create an app-ads.txt file in Notepad listing all the lines you received, and save it.

Step 4. Upload. Upload the file to the root of your domain (example: www.example.com/app-ads.txt).

While no one can predict what kind of immediate impact app-ads.txt is likely to have, there’s no doubt that any effort designed to make the mobile advertising ecosystem more transparent and secure is worthwhile. We’ll have to wait and see if app-ads.txt spreads as fast and wide as its web counterpart.
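As an illustration of the four-field format from Step 2, here is a short Python sketch (our own helper, not part of any IAB tooling) that parses app-ads.txt lines into records, using the example entry above:

```python
# Sketch of parsing app-ads.txt lines into records. Each line carries the
# four fields described above: ad source domain, publisher ID,
# relationship (DIRECT or RESELLER), ad source ID.
def parse_app_ads(text):
    records = []
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) < 3:
            continue  # skip malformed lines
        records.append({
            "domain": fields[0],
            "publisher_id": fields[1],
            "relationship": fields[2].upper(),
            "source_id": fields[3] if len(fields) > 3 else None,
        })
    return records

print(parse_app_ads("ironsrc.com, 1234, DIRECT, 5678"))
```

A DSP-side verifier would fetch this file from the root of the developer website listed in the app store and check each bid request's seller against the parsed records.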

>access_file_
1484|blog.unity.com

Procedural Stochastic Texturing in Unity

Have you ever tried using tileable textures to cover large surfaces or add fine-scale detail to your meshes? If so, you’re probably familiar with the visible repetition pattern that quickly emerges at higher tiling values. At Unity Labs, we have developed a new texturing technique that avoids this issue: Procedural Stochastic Texturing.

Tileable textures are a useful tool for adding detail to your 3D scenes. However, whenever you want to texture large areas or achieve highly detailed surfaces using tileable textures and detail maps, tiling patterns become visible. This effect is illustrated on the left-hand side of the following picture. Typically, reducing repetition patterns requires bigger textures or hiding the visible repetition with additional objects. In this blog post, we provide a plugin that solves this problem for textures with stochastic appearances, such as rust, moss, bark, etc. Its impact is shown on the right-hand side of the following picture.

Download the plugin here.

Given an input texture, the plugin procedurally generates an infinite texture that matches the appearance of the input. We do so by leveraging a state-of-the-art texture blending function that operates on a modified version of the input texture, requiring only three texture samples per fragment and one fetch from a small lookup table. Since the plugin avoids the need for tiling, it makes it possible to cover larger surfaces with smaller textures and higher levels of detail, without any repetition artifacts.

The plugin provides a variant of the Standard shader, which is included in a Unity package. Once you have imported the package, select the StandardStochastic shader for your material.
The interface is the same as that of the Standard shader, with the addition of two new controls at the top, corresponding to the two required setup steps:

The Stochastic Inputs menu allows you to specify which of the input textures you wish to enable our technique on (it can be set to Everything, or to specific textures, e.g., albedo and/or normal map).

The Apply button pre-computes data for the selected stochastic inputs and updates the material, finishing the process. For each input texture, this step generates two intermediate textures located in the same folder.

A few additional remarks:

Existing intermediate textures are loaded when they are found, to skip generation time. Any modification to the original texture therefore requires deleting both intermediate textures before hitting the Apply button again, to force their regeneration.

When switching between the Standard and StandardStochastic shaders, parameters are transferred back and forth.

A StandardStochastic (Specular setup) shader is also provided for materials using the Specular workflow.

The following examples show the prototype working with different materials using small (256²) input textures. This technique is only compatible with tileable stochastic and near-stochastic textures: textures that resemble random noise when viewed from a distance, typically natural-looking materials without precise geometric shapes.

This is a research prototype and is not under active development. As such, we don’t guarantee maintenance or support, but feel free to ask questions and offer feedback on this Unity Forum thread. We are currently working on a ShaderGraph implementation for custom shaders and compatibility with the new rendering pipelines. Look out for updates on the forum thread!

Curious about how the plugin works? Check out the two recently published research papers that our plugin builds upon:

Paper: High-Performance By-Example Noise using a Histogram-Preserving Blending Operator - Eric Heitz and Fabrice Neyret, HPG 2018

Technical chapter: Procedural Stochastic Textures by Tiling and Blending - Thomas Deliot and Eric Heitz, GPU Zen 2

Check out our other research work at Unity: Unity Labs Publications. The code delivered with the plugin features comments referring to specific sections of the technical chapter.
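For intuition, the variance-preserving blending at the heart of the histogram-preserving technique can be sketched in a few lines. This is a toy scalar Python version under simplifying assumptions, not the plugin's actual shader code: in the real implementation the texture is first transformed so its histogram is Gaussian (via the precomputed lookup table), blended in that space per fragment, then mapped back.

```python
import math

def variance_preserving_blend(g1, g2, g3, w1, w2, w3, mean=0.5):
    """Blend three (Gaussianized) texel values with weights summing to 1.

    A plain weighted average washes contrast out toward the mean;
    dividing the centered result by sqrt(w1^2 + w2^2 + w3^2)
    restores the input variance, which is what keeps the blended
    result looking like the input texture instead of a blurry mix.
    """
    linear = w1 * g1 + w2 * g2 + w3 * g3
    centered = linear - mean * (w1 + w2 + w3)
    return centered / math.sqrt(w1 * w1 + w2 * w2 + w3 * w3) + mean

# With all the weight on one sample, that sample passes through unchanged.
print(variance_preserving_blend(0.8, 0.2, 0.4, 1.0, 0.0, 0.0))
```

In the shader, the three samples are the three texture fetches per fragment mentioned above, taken from randomly offset tiles, and the weights come from the fragment's position inside the tiling pattern.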

>access_file_
1485|blog.unity.com

Creating an Interactive Vertex Effect using Shader Graph

We created an example interactive vertex displacement effect with Shader Graph and the Lightweight Render Pipeline to help you use these features to design effects. This post will walk you through our process. Get the demo project with the Shader Graph shader, example scene, and some example game assets from the 3D Game Kit, and follow along!

The sphere in the video example below has a shader-based displacement effect that activates when we hit the space bar. In your game, you would assign this to some relevant gameplay event. In this article, we will look at how to create this shader using the Shader Graph package, and how to integrate the spacebar keypress trigger. The goal is to help you understand how to design effects in Shader Graph and interact with them from your other C# scripts. The demo project contains the shader, the script that controls the shader, a preconfigured Lightweight Scriptable Render Pipeline (LWRP) Asset, and an example scene to get you started. If you prefer to view this tutorial as a video instead of text, you can find it on the Unity YouTube channel.

First, let’s look at how to set up Shader Graph and the Lightweight Render Pipeline. Open the Package Manager and select Install on the Lightweight RP package. This will automatically install the correct version of Shader Graph.

Once we’ve installed the Lightweight RP, we need to create a new Pipeline Asset in the Project. Select Create->Rendering->Lightweight Render Pipeline Asset. We can then activate this pipeline asset by going to Edit->Project Settings->Graphics and dragging the LightweightRenderPipelineAsset into the Scriptable Render Pipeline Settings field. If you are following along with the downloaded assets, this step has already been completed for you.

Now that the Lightweight Render Pipeline is installed, we can look at creating a new Shader Graph. Let’s create a new graph in our project by selecting Create->Shader->PBR Graph. The PBR Graph allows us to create a new shader that takes inputs from Unity’s Physically Based Rendering system, so that our shader can use features such as shadows and reflections. Once we have created this shader, we add it to a new Material and attach the Material to a Sphere in our example scene by dragging and dropping the material onto the sphere.

To achieve the effect, we will displace the vertices in our mesh along their normals by changing the output Position in the PBR Master output node. We will displace by using an Add node on the base Object Position of each vertex. By adding the Normal Vector to the base Object Position, we can see all the vertices become extruded, making the sphere appear bigger. To vary this displacement, we will multiply the normal vector displacement semi-randomly by using a Simple Noise node. When we click Save Asset, we can see in the Scene View that the sphere is now displaced based on the Simple Noise.

Unfortunately, there are seams in the displacement because the Simple Noise is being sampled in UV space. To fix the seams, we can simply add a Position node set to Object, so the Simple Noise uses Object Space instead of UV space.

To create the pulsation effect, we will scroll this Position output by adding it to a Time node before sending it to the Simple Noise node. We can also use a Multiply with the Time node to vary the speed of the scroll.

To control our displacement, we expose a new Shader Property in our Shader Graph. Shader Properties allow us to provide inputs to our shader via values entered in the Inspector, or via our own C# scripts, as in this case. We create a new Vector1 property named Amount and change the Reference to _Amount. The Reference field is the string name by which we will access and change the displacement via script. If we do not change this, it will use an auto-generated value.
If the string does not match exactly, we will not be able to address our property via script, so double-check that both match, including capitalization.

We use this Amount shader property in a Multiply node with the Simple Noise before it gets multiplied with the normal vector. This allows us to scale the noise before it’s applied to the vertex positions. Now, the Amount variable controls how much we displace each vertex in the mesh.

To control this Amount variable, we have created a C# script called DisplacementControl and attached it to the DisplacementSphere GameObject. This script controls the _Amount variable by interacting with the property we created in our material, which is assigned to the MeshRenderer component. We store a reference to the MeshRenderer component in the variable meshRender, and declare a new float variable displacementAmount.

We use a simple lerp in the Update function to interpolate the displacementAmount variable toward the value of 0. We then set the shader variable _Amount to the value stored in the displacementAmount variable. This updates the Shader Graph’s _Amount variable, smoothing it over time toward 0. We are using Unity’s default “Jump” Input Axis (which is assigned to the space bar by default) to set the value of displacementAmount to 1 when pressed. Now, when we enter Play Mode in the scene, we can see that pressing the spacebar sets displacementAmount to 1, which then slowly interpolates back to 0.

To create the adjustable glow effect, we will output to the Emission in the PBR Master node. We use a Voronoi Noise node and Multiply it with a Color node. This creates a little modulation in the glow effect, with some dark spots. Then, we use a Lerp node with another Color node as the base color, and use the Amount variable in the T input. This allows us to blend between the base Color node and the Voronoi Noise color using the Amount variable.

Then, we will scroll the glow by using a similar setup as before. We use a Position node set to Object, add it to a Time node, and connect the output into the UV slot of our Voronoi Noise node.
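The per-frame logic of the DisplacementControl script described above is small enough to sketch outside Unity. Here is a rough Python sketch of it; the real script is Unity C# calling Mathf.Lerp and Material.SetFloat("_Amount", ...), and the smoothing speed here is a made-up value:

```python
# Python sketch of the DisplacementControl behavior described above.
# In Unity this logic lives in Update() and writes the value to the
# material with meshRender.material.SetFloat("_Amount", displacementAmount).

def lerp(a, b, t):
    """Linear interpolation, like Unity's Mathf.Lerp (t clamped to [0, 1])."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

class DisplacementControl:
    SMOOTH_SPEED = 4.0  # assumption: decay rate per second, tune to taste

    def __init__(self):
        self.displacement_amount = 0.0

    def on_jump_pressed(self):
        # The "Jump" axis (spacebar) kicks the displacement to full strength.
        self.displacement_amount = 1.0

    def update(self, dt):
        # Each frame, ease the value back toward 0, then push it to the shader.
        self.displacement_amount = lerp(self.displacement_amount, 0.0,
                                        self.SMOOTH_SPEED * dt)
        return self.displacement_amount  # stands in for SetFloat("_Amount", ...)

ctrl = DisplacementControl()
ctrl.on_jump_pressed()
for _ in range(60):             # simulate one second at 60 fps
    amount = ctrl.update(1 / 60)
print(round(amount, 3))         # decayed close to 0 after one second
```

Lerping toward 0 by a fixed fraction each frame gives the exponential ease-out you see in the video: fast at first, then slowing as the value approaches 0.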

>access_file_
1487|blog.unity.com

Updated Terms of Service and commitment to being an open platform

We've been building Unity for 15 years with the vision of creating an open and accessible tool that enables creators to build whatever they can dream of.

Over the last week there was much confusion, and untrue statements were made, which we refuted. But most importantly, we listened to you, our community, who felt that the End User License Agreement (EULA)/Terms of Service (TOS) was too restrictive.

When you make a game with Unity, you own the content and you should have the right to put it wherever you want. Our TOS didn't reflect this principle – something that is not in line with who we are.

We believe the Unity Engine business model is the best way for developers to be successful. We charge a flat fee per seat – not a royalty on all of your revenue. Building Unity takes a lot of resources, and we believe that partnerships make better services for developers and augment our business model – as opposed to charging developers for Unity’s development through revenue share.

Our TOS update on December 5 was an attempt to define what our terms mean for the cloud and an opportunity to make our business model clearer. After listening to developers, we realized how this language came across and how it would impact your ability to choose. Today we have updated our Terms of Service, Section 2.4.* The language is at the bottom of this post.

The TOS update highlights that developers can use any third-party service that integrates with Unity. Some of these services will be supported, others will not. The distinction is that with a supported service, we understand the technology. We make sure the service and Unity work better together for developers. We also ensure that the supported service always runs well on the latest version of our software, so we can help future-proof your project in Unity and ensure access to the latest tech.

Additionally, we have created, and will continue to create, our own services.
We will integrate our own services, but we will not block developers from using competitive third-party services.

When you obtain a version of Unity and don’t upgrade your project, we think you should be able to stick to that version of the TOS. In practice, that is only possible if you have access to bug fixes. For this reason, we now allow users to continue to use the same TOS for the same major (year-based) version number, including the Long Term Stable (LTS) builds that you are using in your project.

Moving forward, we will host TOS changes on GitHub to give developers full transparency about what changes are happening, and when. The link is https://github.com/Unity-Technologies/TermsOfService.

Today’s change in our TOS means Improbable is no longer in breach by providing you a service, and that we are able to reinstate their licenses. But we do not consider them a partner, and we cannot vouch for how their service works with Unity, as we have no insight into their technology or how they run their business. We know Improbable was in violation even before the December TOS update and misrepresented their affiliation with us. Although SpatialOS is not a supported third-party service, it can continue to be used for development and shipping games.

We are holding an AMA on r/Unity3d at 10 a.m. PST to discuss this TOS update in more detail.

*Section 2.4 Working with Third Party Service Providers. Unity developers are free to use any service offered to Unity developers (each, a “Third Party Service”). Unity does not have any obligation to provide support for any Third Party Service provider or Third Party Service under this Agreement.
Third Party Service providers may not, without Unity’s express written permission: (1) use a stylized version of any Unity name, trademark, logos, images or product icons, or other Unity-owned graphic symbols; (2) use a product name confusingly similar to a Unity product or that could be construed by Unity developers as being a Unity product or service; or (3) create or use any marketing materials that suggest an affiliation with, or endorsement by, Unity. All use of Unity’s trademarks must comply with Unity’s Trademark Guidelines.

>access_file_

[ 2018 ]

13 entries
1488|blog.unity.com

What(Games) talks the convergence of monetization and design

One of the biggest mistakes a developer can make is thinking about monetization strategies only after designing their game. The choices you make in the initial phases of designing your game will influence the success (or failure) of your app monetization model.

We spoke with Matthieu Brossard, the Publishing Director at What(Games) by Gameloft, a French casual F2P mobile game publisher. In our conversation, we discussed the convergence of game design and monetization.

At what point in the design/development phase does What(Games) consider ad monetization?

Monetization is discussed from the kick-off phase. It is only viable to invest in game development if we have a good chance at making it profitable, therefore we think about the best ways to monetize each specific title from the get-go. For example, if we know that the game’s monetization strategy will rely on ads for the most part, we start thinking about the best way to integrate them in the game as early as possible. We need to ensure that ad integration is seamless and feels natural.

Can you give an example of how design and ad monetization work together in your games?

The revive point in a game is a great example of the partnership that exists between our design and ad monetization teams. Many games can monetize at this natural break point. For example, playing the video after the second run tends to perform better than playing it right at the moment of the click.

Initially, our flow was this:

run → die → play IV → revive → die

We assumed users would leave the game instead of watching an ad if it was placed at the end of their run.
However, once our monetization team discovered this phenomenon was not as pronounced as anticipated, they presented their user-behavior findings to the design team, who were able to optimize the flow. Together, they changed the flow to this:

run → die → revive → die → play IV

The new flow resulted in an increase in CTR. Overall, both teams have important contributions to bring to the table when it comes to integrating design into our ad monetization strategy. They should share their learnings and instincts on a regular basis.

How do you ensure that your design and ad monetization teams work in tandem from the initial planning phase of the game, and even after it’s been launched?

They consult at every stage of the project. It starts on paper, in the Game Design Document, continues with key version reviews to get a feel for the game, and lasts all the way into live game management for optimization.

What tips would you give to developers struggling to sync design and monetization?

Get rid of any assumptions you have and rely on the data! Every game is different, so it is imperative that you formulate as many hypotheses as possible based on your past experiences and overall knowledge of the market. Once you make these assumptions, you should test them on relevant cohorts. Our industry is lucky to be really good at tracking user behavior, with access to convenient user acquisition channels. Take advantage of these channels to uncover behavior that you or your users may not even be aware of.

Seamless integration = optimal monetization

Creating a game is hard work, but ensuring that your monetization and design teams are in the loop with one another throughout development will surely bring about success!

>access_file_
1494|blog.unity.com

Choosing the resolution of your 2D art assets

At game events and online, people often ask me this question: “I’m making a 2D game in Unity for both PC and mobile: what resolution should my assets be?” There is no simple answer that covers all cases, but this blog post should give you a better idea of the best course of action for your project.

In recent years, we’ve been working on a lot of features that help you create 2D games in Unity: the Sprite Atlas, 2D physics, a Tilemap feature for rectangular, hexagonal, or isometric worlds, the spline-based Sprite Shape, 2D animation, and more. Unity doesn’t express the size of an object in pixels, and this can confuse artists who are creating assets for 2D games. “How big do they need to be?” As usual in game development, the answer is “it depends”, but let’s go over a few concepts that will make the decision easier.

Note: The pictures in this blog post use beautiful 2D assets from the Asset Store by the artist Mikael Gustafsson.

Making a 2D game in Unity

If you’re a pixel artist, a word of warning: most of the tips in this post don’t fully apply to your situation. In pixel art graphics you have something that’s extremely low resolution, and you want to blow it up to 2x, 4x, 8x, or maybe even more of its original resolution. This means that one pixel of your original art is now a square of 2x2, 4x4, or 8x8 real pixels on the screen. So in general in pixel art, you don’t need to worry too much about the screen resolution: you start from your art and the feel you want to convey (old-school, NES-era, 16-bit era, higher-resolution “modern” pixel art, etc.) and you scale it up a few times. Unity now has a Pixel Perfect solution: a standalone package if you use the Built-in Render Pipeline, also included in the Universal RP and the 2D Renderer. It comes with a simple component to put on the Camera which will do the hard work for you and make sure the art stays crisp and aligned with the grid of real, small pixels on any screen. You can find more info on Pixel Perfect in the documentation.

Before we go into any considerations about choosing resolution, it’s worth remembering that when you’re authoring your assets it’s good to go for a higher resolution, even if you don’t actually need it for the art that goes into the game. You can always scale down art, but you can’t scale it up without losing quality. Consider these scenarios: you might need to print some of the art for your game, or you want to increase the size of an element on screen, or you want to create an “HD version” of your game for 4K monitors later on. For these reasons, as you are working on the art, consider using work files that are at least twice the resolution you actually need, then scale them down before bringing them into Unity, or use import settings to reduce their size as they are imported into the engine.

The Import Settings also allow you to define a Max Size and other compression settings per platform, so, for instance, you can have some assets at a certain resolution on PC and just half of it on mobile devices, where keeping disk space contained is crucial.

Tip: Unity offers a way to consolidate several Sprites into one through Sprite Atlases. In addition to saving texture space, atlases offer one unified way of controlling the Max Size, rather than having to set it individually for every single sprite in your project.

When defining asset size, it’s important to consider the platform or the devices that the game is going to be published on.
People have very diverse devices and screens and will see your game in a variety of resolutions and aspect ratios. In general, at the time of writing, if you’re publishing for PC you are looking at a vast majority of users with two resolutions: mostly “full-HD” (1920x1080 pixels, often called 1080p) and a good deal of 1280x720 (often called 720p). A small percentage of people also have 4K monitors (3840x2160) or Retina monitors on Macs (a modern 15-inch MacBook Pro has a maximum resolution of 3360x2100). That’s a lot of pixels to cover! For phones, the range is huge: some old devices go below 720 pixels vertically, while some modern ones go up to 4K.

Tip: Unity makes some of these stats available in the Operate Dashboard of your Unity ID. Select a project that has Analytics enabled, and you will be able to go under Analytics > Market Insights and see the aggregated stats. Steam also offers a similar service on this page.

With Retina (an Apple trademark) and other modern high-DPI screens, while the actual hardware resolution is very high (e.g. 4K), they can run at a simulated lower resolution (usually half, for instance full-HD instead of 4K) but render images and text using twice the pixels, so that they appear very crisp.

Note: DPI (dots per inch) and PPI (points per inch, or pixels per inch) are different names used interchangeably by different manufacturers, but at the end of the day they mean the same thing: how many pixels are squeezed into a linear inch of screen. Traditionally, screens were 72 DPI. Today, high-DPI screens are usually 144 DPI, but you can find phones that boast 400 DPI or more, since they pack a lot of pixels into relatively small screens. Some examples here.

For these screens, you have two options. One is to aim at offering an experience that uses the 4K resolution in full. The downside is that producing 4K-compatible assets takes a lot of extra work. In this case, make sure to highlight it in your marketing materials! (“A Beautiful 4K Experience”, etc.) Owners of consoles like the PS4 Pro and Xbox One X, which are compatible with 4K, will love the fact that your game uses the hardware to its full power. Or, you can “just” architect your game to cover full-HD. In this second case, users with higher-DPI monitors won’t benefit from the increased resolution of their screens; they will just see the game in full-HD. That’s not ideal, but it might be OK if you are also trying to keep build size under control.

So the bottom line is: you need to choose a maximum resolution you are aiming for (based on current market shares, see above), and set that as your target for the whole project. Everyone on the team will then be able to make decisions knowing that.

As mentioned before, Unity measures distances and sizes in something that is simply called a unit, not in pixels. In general, it’s good practice to match 1 Unity unit to 1 meter; for instance, the average humanoid model would measure between 1.7 and 1.8 units in this scenario. This isn’t mandatory, but it will ensure that games with physics (both 3D and 2D) behave correctly, because physics in Unity is tuned to use 1 unit as 1 meter. The same goes for 3D lighting, where light parameters are meant to stay true to reality. In 2D this scale is less important, but it’s still good practice to respect it if you’re using physics in your project. If you’re using a Tilemap, it might be nice to keep a scale of 1 tile = 1 unit, just for the sake of simplicity.

Now that we have gone over units, let’s move on to the camera. Unity’s 2D (Orthographic) cameras have a parameter called Size which, when doubled, tells you how many units the camera frames on the vertical axis. With a Size of 5, we have a viewport that measures 10 Unity units on the vertical. The horizontal extent is just a consequence of this, since we don’t know what aspect ratio the user’s screen will have.
But it’s easy to calculate: on your average PC or Android phone, with an aspect ratio of 16:9, you can just do:10 (Vertical Size) x 16 / 9 = 17.7 (Horizontal Size)So we know that with these settings, we’re framing an area of roughly 17.7 by 10 units. On Macs (which are generally 16:10) it will be 16 by 10 (so less visibility on the horizontal). On a 16:9 phone held vertically (so it becomes 9:16), the same camera will show only an area of 5.6 by 10 units.Note: We won’t go over how to cope with aspect ratios in this blog post because if you’re aiming to make a game for different aspect ratios, not only you need to think of the graphics, but in general you need to make a lot of gameplay tweaks to make sure the game doesn’t play differently on devices with different ratios. For instance, any game that scrolls horizontally will benefit from a slender horizontal aspect ratio because the player can see more of the coming hazards. Sometimes making a game that works fine on wildly different aspect ratios is impossible, and people use frames or black bars to fill the negative space that they can’t fill with gameplay.When importing graphics as Sprites, Unity displays a parameter called Pixels per Unit (PPU). Now that we know all about units, this should be very clear. It’s expressing how many pixels from your Sprite fit into a unit in the Unity scene when the GameObject is scaled 1,1,1.Say for instance I have the Sprite of a rock that’s 218 by 175 pixels, and I set the Pixels per Unit to be 100, once I drag that Sprite in the scene my GameObject by default will be 2.18 by 1.75 units, occupying roughly one-fifth of the 10 units on the vertical axis.So let’s take a full-HD screen as our test resolution. The vertical is 1080 pixels, and the rock is less than a fifth of the screen (you can see how the faded rocks fit more than 5 times in the image above), it means we are using 175 pixels of source graphics to render more than 200 pixels. 
Which means we are going to have a slightly blurry rock.

To fix this, we have several solutions: we can scale the rock down to about half its size, make the camera frame 10.8 units tall (which produces a zoom-out), or change the PPU value of the Sprite to 108 (which has the same effect of making the rock smaller on screen). In all three cases, if we want the rock to be crisp, it will have to be smaller.

Where do that camera size and PPU value come from? Easy! For the camera, if we import our graphics at 100 PPU, then we need the camera to frame 10.8 units vertically (a Size of 5.4), because 10.8 × 100 = 1080. This allows us to cover the whole height of the screen. Conversely, to calculate the correct PPU when the camera Size stays at 5, if we want to cover a full-HD screen with 10 Unity units vertically, then we have 1080 / 10 = 108. This is the number of pixels we should be cramming into one unit if we don’t change the camera size.

Keep in mind that as you’re working on your game, it’s dangerous to mix these workflows as you go: you might end up in a position where some graphics used in the wrong scene have a resolution that’s too low, and you didn’t even notice. It’s good to establish guidelines: one PPU for most of the assets in your project, and a typical camera size. Then you can break the rules later. Maybe you have a cutscene where the camera zooms in and out, changing the Size temporarily. Or maybe your background elements are so big that you can’t afford to keep the same PPU, because you would end up with enormous textures. In that case, it’s OK to reduce the PPU for those elements and import smaller sprites, but still cover a big area of the screen with them.

As you are working in Unity, you might be wondering what resolution you are seeing through the Game View, and whether that’s an accurate preview of your art on the target platform.
Most of the options you will need are in the drop-down menu at the top and the slider next to it.

Aspect ratios just force a specific ratio between horizontal and vertical, but they will make use of the current resolution of your Game View viewport, which in turn depends on your screen. So they’re good for setting up UI and objects on a screen, but not really for testing art. While you’re on an aspect ratio, the Low Resolution Aspect Ratios checkbox will be active if you are on a high-DPI screen. If you check it, it will simulate a standard-DPI screen resolution.

Fixed resolutions, on the other hand, force Unity to render a window of that exact size. At that point, you might need to expand your Game View or maximize it to visualize the whole preview. In this context, the Scale slider can make high resolutions fit into your Game View window, but if you go below 1x scale, then you aren’t actually seeing that resolution, but a resampling of it. And don’t forget that you can add your own fixed resolutions and aspect ratios to the drop-down.

Tip: Don’t base your judgment on the editor only. Every now and then, make a build of the game (or just of the art!) and take a look at it on your target device or screen.

As you can see, the resolution of your art, the camera size, and the screen you want to target are all connected, and there’s no one pixel size or PPU that fits all cases. Study your target platforms, decide on a resolution, and inform the whole team by writing guidelines. Then produce higher-res art anyway; it might be useful later! Finally, resize it to the resolution you need and import it into Unity.

And one final suggestion: even if I previously used the term “slightly blurry” as if it were the end of the world and something we absolutely MUST FIX, this is not a hard rule. It might be that in your game some objects, sometimes, are under resolution.
Especially if there are a lot of minute details on screen, maybe with transparencies overlapping, and some fog, rain, or post-processing on top. Play the game at its normal speed: do you actually notice that they are being resampled? If you honestly can’t tell the difference, then maybe the extra work, disk space, and processing needed to render a higher-resolution sprite aren’t worth it. Remember: good games make big compromises!

We have a reading list of 2D tips and how-tos on our blog; don’t miss it. If you also want to get an overview of what Unity has to offer for 2D, you can start here. Discover the Unity 2D demo, Dragon Crashers, in this blog.

Do you have some amazing 2D art in the works? Are you curious about our new 2D tools? Make something great with our new 2D Animation tools, Tilemaps, SpriteShape, the Pixel Perfect package, or the SVG importer, and join the Unity 2D Challenge! I’m one of the judges and can’t wait to see all your brilliant ideas.
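The camera and PPU arithmetic from this post is easy to script out. Below is a plain Python sketch (not Unity code); the helper names are made up for illustration, and the numbers mirror the full-HD rock example above:

```python
def framed_area(cam_size, aspect_w, aspect_h):
    """Units framed by an orthographic camera: Size is half the vertical extent."""
    vertical = cam_size * 2
    horizontal = vertical * aspect_w / aspect_h
    return horizontal, vertical

def required_ppu(screen_height_px, cam_size):
    """PPU needed so art exactly fills the screen height at this camera Size."""
    return screen_height_px / (cam_size * 2)

def frame_for_full_height(screen_height_px, ppu):
    """Vertical units (and the camera Size) needed to cover the screen at a given PPU."""
    units = screen_height_px / ppu
    return units, units / 2

def sprite_size_in_units(px_w, px_h, ppu):
    """World-space size of a sprite at scale (1, 1, 1)."""
    return px_w / ppu, px_h / ppu

# A Size-5 camera on a 16:9 screen frames roughly 17.7 x 10 units.
print(framed_area(5, 16, 9))

# To fill 1080 vertical pixels with a Size-5 camera, import art at 108 PPU.
print(required_ppu(1080, 5))            # 108.0

# At 100 PPU, covering 1080 pixels takes a 10.8-unit frame, i.e. Size 5.4.
print(frame_for_full_height(1080, 100))

# The 218 x 175 px rock at 100 PPU is 2.18 x 1.75 units in the scene.
print(sprite_size_in_units(218, 175, 100))
```

Changing any one of screen height, camera Size, or PPU forces the others: that is why the post recommends fixing a target resolution and a house PPU first.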

>access_file_
1495|blog.unity.com

Getting started with Unity’s 2D Animation package

Have you been looking for an easy way to create skeletal animation for your 2D sprites? We’re introducing our own 2D Animation package, which allows you to rig 2D sprites, paint bone weights, and create skeletal animation, all in-editor! On top of that, there is support for Inverse Kinematics as well. You can check the feature out right now: it’s been available as a preview package since 2018.1. Read on to learn how to use it effectively in your projects.

2D Animation is currently shipped as a preview package, which means you can use Unity’s Package Manager to install it for your project. The Package Manager is accessible from Unity 2018.1 or higher. Make sure to also grab the 2D IK package if you are planning on using Inverse Kinematics; it will be covered in a separate blog post, coming out later this month.

In order to start animating, we’re going to need a couple of things. The first is, of course, the sprite that we wish to animate. It can be anything you want: a character, a monster, an object… you name it!

There are a few things to keep in mind when importing your sprites for 2D Animation. First of all, there is currently no layer support. This means that if you wish your rig to use multiple disjointed sprite meshes, e.g. for arms and legs, you will need to space them out directly on one layer. Full layer support is planned for a later release of the package, where it will support layer import from PSD and other file formats. In addition, there is currently no integration with Unity’s new Vector Graphics package, which includes SVG file import. Support for this is also in the works.

With all of that in mind, let’s import your sprite. The example we use in this article is Ríkr the Viking. We’ll make this character available for you to experiment with soon. It has been divided into multiple sprites and re-arranged on one layer.

All of the 2D rigging happens in the Sprite Editor.
To access it, select your character sprite asset, and then in the Inspector window you will find the Sprite Editor button.

In the top left-hand corner of the Sprite Editor, you will see a dropdown with Sprite Editor selected. Change that to the Bone Editor, and from here we can start creating the first bones of our rig! In the bottom right-hand corner, you will see an array of bone-editing tools.

To add your first bone, choose the Create Bone tool (shortcut ‘B’). If you now left-click anywhere in the Bone Editor, you will set the starting point of your Root bone. Left-clicking again will define the endpoint of the bone and the start of the next one. Right-clicking will cancel the creation of the bone.

When you are creating the root bone of your skeleton, consider which root bone placement is optimal for your use case. Generally, there are two approaches: the first is to put the root bone at an arbitrary point outside of the mesh, usually between the character’s feet, using it as a so-called origin point; the second is to place it at the hips of the character and make it also act as the hip bone. The main point to consider is whether you plan to implement Root Motion. It is typically easier if you isolate the root bone from the rest of the skeleton, so that when you do want to use Root Motion, you can simply move the origin to move the whole character. This workflow is also useful when you want to create animations that change the hip placement but don’t want them to change the root position.

There is a selection of other bone-editing tools accessible from the Bone Editor. One of them is the Free Bone tool (‘N’). It allows you to create a bone which still has a parent bone but is not directly connected to it.
The Free Bone tool is useful when you have a root bone at the origin which does not connect with the actual mesh of the sprite, as well as when the elements of your sprite are divided into multiple parts. In this example, we can use free bones to connect our character’s limbs to the rest of the skeleton. Later, when we create our animations, we will be able to move each free bone to a desired position relative to the body of the character, so that the parts appear to form one complete sprite.

The other tools allow you to Split (‘S’) an existing bone into two equal, smaller ones; Parent (‘P’) one bone to another; and Move (‘M’) any existing bones that are currently attached, turning them into free bones. Creating a bone by mistake is no big deal: you can get rid of it by selecting it and pressing Delete, or by using the equivalent tool from the tools panel.

Once you’re done editing your skeleton, make sure you click the Apply button at the top right of the Sprite Editor UI! Otherwise, you will lose the changes you have made.

Note: It’s good practice to give descriptive names to your bones. If you have complex skeletons, navigating their hierarchies later will be much easier if you know what you are looking for!

Once we have a complete rig, we can go ahead and generate the mesh for our sprite. The mesh, together with the bone weights we will add later, determines how our sprite gets deformed when we move and rotate the bones in the rig. In the top-left dropdown mentioned earlier, select the Skin Weights and Geometry Editor.

From here, you can create your sprite mesh manually by adding vertices and edges. The tools for doing so are found in the top toolbar: Create Vertex, Create Edge, and Split Edge. However, this can be quite time-consuming, especially if your sprites have detailed and complex outlines.
For this purpose, Unity gives you the option to automatically generate your mesh, using a set of pre-defined parameters. In the top tool panel, click on the Generate dropdown.

Here, you can set the Outline Detail of your mesh, which determines how precisely it fits the graphic. Alpha Tolerance determines how much transparency is taken into account when creating the mesh. Higher values usually mean a mesh that more precisely fits the pixels (usually with quite a few more vertices); however, if you set it too high, you might end up excluding semi-transparent segments of your sprite from the mesh. Also, keep in mind that extremely high-detail meshes are a potential performance concern.

The Subdivide level determines how many vertices (and therefore polygons) are created on the inside of the mesh. It can be useful when you want to create facial animation or otherwise deform elements on the inside of your graphic. It is also useful for better mesh tessellation in case you wish to use custom lighting shaders.

Generally, a good workflow here is to use the auto-generated mesh as a base, and add or remove detail by hand where necessary using the editing tools.

Once we’ve created our mesh, we need to add some bone weights to it. These determine which vertices of the mesh are affected by each bone, and by how much. To start editing bone weights, switch the editing mode in the top panel from Geometry to Weights.

Right away, you will notice a new tool panel at the bottom right of the Sprite Editor. Using it, you can paint bone weights by hand; but just like with the mesh, you can generate the weights automatically with the Auto tool.

While editing weights, you can also preview how the various parts of the mesh will be deformed when they are moved. To do so, select any bone in the skeleton by clicking on it, and try to move or rotate it to preview how the mesh is affected.
You can also reduce the visibility of the painted weights and bones by adjusting the sliders in the top toolbar. This way you’ll get a better idea of how the sprite will look in the scene.

Bone weights will generally need more fine-tuning than the mesh: you likely want precise control over which vertices get moved by each bone. The Brush and Slider tools in the bottom toolbar exist for this exact purpose!

The Brush tool allows you to paint the weights directly onto the individual vertices of the mesh. You can change the Size of the brush, which determines how many vertices it affects; the Hardness and Step settings determine how much the weight of one bone will overlay the weight of another if there is more than one bone influencing a vertex. You can start painting weights using the Add and Subtract modes. Simply choose a bone that you want to add weights for, by either clicking on it in the Sprite Editor window or selecting it from the brush tool drop-down.

The Slider tool allows you to distribute bone weights more broadly, and much like the Brush tool, it works differently based on the selected Mode. The Add and Subtract mode, in this case, allows you to add a selected bone’s influence to the whole mesh, while the Grow and Shrink mode increases or reduces the bone’s already existing area of influence, and is therefore more confined to the vertices directly around the bone. The Smooth mode is used to even out transitions between all bone weights in the skeleton. Using this mode with the Brush tool does the same, but only for localized areas.

When rigging your sprite, you will most likely want control over how its parts overlay each other. For example, you might want the right arm of your character to appear in front of the body, the left arm to be behind the body, and so on.
As such, 2D Animation allows you to set the render order of your bones, called Bone Depth. Bone Depth is configured in the Weights editor and is set per bone. To change the render order of a bone, select it in the editor window. At the bottom left of the window, you will see an Inspector pop-up. Here, you can set your rendering priority: higher values mean that parts of the mesh associated with that particular bone will overlay those with lower values. By default, all bones have a depth of 0.

Once you have set up a rigged mesh and painted bone weights, you will be able to use it in the scene. To do so, first drag the rigged Sprite asset into the scene, and add a Sprite Skin component to it. If the sprite contains a skeleton, you will have the option to Create Bones, which will instantiate that skeleton in the scene.

From here, you are free to move and rotate the bones however you wish to achieve the desired pose for your character. To rotate a bone, either select it and use the Rotate tool (‘E’), or grab the end of the bone and drag it in the desired direction. To move a bone or detach it from its parent, use the Move tool (‘W’), or grab it at the origin and drag. To reset the bones to their default position in the rig, you can use Reset Bind Pose in the Inspector. You can also select multiple bones to rotate them uniformly as a group; this might be useful when trying to create rounded shapes, such as those for hair or roots.

Let’s do a quick walkthrough of how you can use your newly built skeleton to create a complete 2D animation! With the 2D Animation package, you can create skeletal animations just like you would in 3D projects. The simplest way to start is to select the rigged sprite and open the Animation window (Window > Animation > Animation). From here, you will be prompted to create an Animator component and the first Animation Clip for this game object.
Once you click Create, you can choose the folder where you wish to save these components and name your first Animation Clip. An Animator asset will be created automatically, but you can rename it or move it wherever you wish. Alternatively, you could create the Animator first (Assets > Create > Animator Controller) and attach it as a component to the game object. After that, you only need to create the Animation Clip.

All of the animating work happens in the Animation window, and you can have multiple Animation Clips per Animator. To create new clips, look at the top left of the Animation window for a dropdown with the name of the currently selected Animation Clip. If you click on it, you will see the option to Create New Clip.

At the start of your animation, it is good practice to keyframe all of your bone positions and rotations, even if you aren’t going to change all of them. Otherwise, if you later move or rotate bones in the scene that haven’t been affected by the animation, they will keep their transforms from the scene, which often changes the animation in an undesirable way.

In order to keyframe all of your bones and start recording changes to the animation, first enable recording mode in the Animation window by pressing the red button in the top toolbar. To create your initial keyframe, select all the bones of the rig in the Hierarchy window; then, in the Inspector, right-click the Position and Rotation transforms and choose ‘Add Key’. This will create keys for all selected bone positions and rotations, respectively. To make sure you’ve selected all the bones and their children, expand your bone hierarchy before you select everything. A quick shortcut for this, which makes sure you don’t miss any bones, is to hold Alt (or Option on Mac) and left-click on the root bone.
This will either collapse or expand all of the root’s nested children.

From here, while in recording mode, you can move the animation playhead to a new position and start rotating or moving the bones in the scene to automatically create new keyframes. You can move keyframes along the animation timeline by selecting and dragging them to the desired point, or copy and paste selected keys with the familiar shortcuts Ctrl (Cmd) + C and Ctrl + V. Once you’re done editing the animation, you can exit recording mode and use the Play button to preview it.

And now you’ve got everything you need to get started with creating your own skeletal animation! Here’s an example clip of a walk cycle for our viking that can be put together really quickly.

There is additional information on working with 2D Animation in the preview docs, currently available on the feature’s GitHub repository. On the same repository, you will find several sample projects that you can play around with, which demonstrate the basics of the workflow described in this blog post. Stay tuned to this blog to learn how to get started with 2D Inverse Kinematics in the second part of this series.

We would love to see what you are creating with our new 2D packages, and to hear about your experience using them! That’s one of the reasons we launched the Unity 2D Challenge. Create a small piece of content using some of our new 2D tools and you can win cash prizes and tickets to Unite! It can be anything from some pixel-perfect art to a thin vertical slice of a 2D game. The challenge is open for submissions until December 17, so now is a great time to get started. You can also share your thoughts and projects with us on the 2D Animation Forum!
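To build some intuition for what painted bone weights actually do to the mesh, here is the standard linear blend skinning idea in a minimal Python sketch. This is the general technique, not Unity’s actual implementation; the rotation helper and the sample numbers are made up for illustration:

```python
import math

def rotate(point, angle_rad):
    """Rotate a 2D point around the origin."""
    x, y = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

def skin_vertex(vertex, bone_transforms, weights):
    """Linear blend skinning: the deformed vertex is the weighted sum of the
    vertex transformed by each influencing bone. Weights should sum to 1.
    Each bone transform here is a (rotation_angle, translation) pair."""
    out_x = out_y = 0.0
    for (angle, offset), w in zip(bone_transforms, weights):
        rx, ry = rotate(vertex, angle)
        out_x += w * (rx + offset[0])
        out_y += w * (ry + offset[1])
    return (out_x, out_y)

# A vertex influenced half-and-half by a bone at rest and a bone rotated 90
# degrees ends up halfway between the two transformed positions.
bones = [(0.0, (0.0, 0.0)), (math.pi / 2, (0.0, 0.0))]
print(skin_vertex((1.0, 0.0), bones, [0.5, 0.5]))  # approximately (0.5, 0.5)
```

This is why a vertex weighted 100% to one bone follows it rigidly, while a vertex shared between bones (e.g. around a joint) deforms smoothly as they move apart.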

>access_file_
1498|blog.unity.com

Performance Reporting is now Cloud Diagnostics

Better name, new features, and now accessible to Unity Personal Edition users! If you’re familiar with Performance Reporting, then Unity Cloud Diagnostics is everything you knew about it, but better. It has more tools to help you find and respond to user issues fast and in real time, and it’s now accessible to everyone!

The Performance Reporting name was confusing, because the service didn’t do “performance reporting” the way most people think of it. The name implied that it would monitor factors like memory usage and frames per second, which is not the purpose of the service. Cloud Diagnostics better represents what we help you measure: discrete technical issues. Furthermore, the types of issues that we track have expanded since Performance Reporting was initially released. Today, Cloud Diagnostics tracks:

- Unhandled managed exception reports
- Native crash reports
- (NEW) User-submitted reports

Some of you may recognize that “User-submitted reports” is the “Bug Report” feature, which was previously available in an open Alpha. Like Cloud Diagnostics, this feature is being renamed, to User Reporting, for more accuracy, as it was common for people to send suggestions as well as report bugs.

Crash and Exception reports are a window into what is happening under the hood when your players and/or testers are running your game. By enabling the feature, you’ll have access to information to help you debug faster, including stack traces, debug logs, device information, and custom metadata that you define.

User Reporting is a great way to collect and aggregate reports from end users. The key difference is that while Crash and Exception reports are generated automatically, User Reports are explicitly generated by an end user, so you receive real human feedback. With this feature, you can get user-provided bug reports and feature requests, which can include screenshots, saved games, videos, or anything else you find useful for improving the user experience.
You have the freedom to configure the reports in whatever way is valuable to you, be it the type of attachments you accept, the text information you collect, or the custom metadata you track with each report.

An important aspect of Unity Cloud Diagnostics is that it helps you react to users in real time. Instead of making you monitor yet another tool, Unity Cloud Diagnostics can work with your communication tools of choice. For example, you can integrate it with Discord, Slack, and Jira. Don’t use Discord, Slack, or Jira? You can also build custom integrations using webhooks. Here are the four message events that you can access:

- New Crash or Exception: Triggered whenever a new type of crash or exception is seen.
- New Crash or Exception Version: Triggered when an existing crash or exception is seen for the first time on a new version of your game.
- Reopened Crash or Exception: Triggered when a previously closed crash or exception is seen in a new version of your game.
- New User Report: Triggered whenever a new user report is received.

The biggest change for most of you is that these features are no longer limited to Unity Plus and Pro users. Unity Personal users can use them through a new Starter tier. Additional capacity, as well as some sophisticated features, are available to Plus and Pro subscribers through an Advanced tier. Here’s a full breakdown:

Enabling Cloud Diagnostics in your project is easy! If you want to use the Crash and Exception features, you only need to turn them on using a toggle in the Services window. Since User Reporting is often used for manual feedback, there’s an SDK available with a prefab you can reskin, or use as guidance for setting up your own UI. Documentation for all features can be found here.

We appreciate all of you who are using User Reporting and have been giving us really useful feedback. As a thank-you, we will be giving all projects that used User Reporting during the Alpha access to Advanced Diagnostics!
This means that you will not lose access to advanced features or be restricted by Starter-tier limitations for the projects that have been actively using the service. We’d particularly like to call out WashBear Studio, creators of Parkasaurus: thank you for your great feedback about User Reporting!

We are really excited about Unity Cloud Diagnostics, and we hope you are as well. If you’d like to request a feature or report an issue, you can reach out to us on the Cloud Diagnostics forum or at clouddiagnostics@unity3d.com.
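A custom webhook integration for the four events above boils down to a small dispatcher on your end. A minimal Python sketch follows; note that the `event` field name and the payload shape are hypothetical, chosen for illustration only — consult the Cloud Diagnostics documentation for the actual schema your endpoint will receive:

```python
import json

# One handler per Cloud Diagnostics webhook event named in the post.
# The payload keys ("event", "title", "summary") are illustrative assumptions.
HANDLERS = {
    "new-crash-or-exception": lambda p: f"New issue: {p.get('title', '?')}",
    "new-crash-or-exception-version": lambda p: f"Known issue on new version: {p.get('title', '?')}",
    "reopened-crash-or-exception": lambda p: f"Regression: {p.get('title', '?')}",
    "new-user-report": lambda p: f"User report: {p.get('summary', '?')}",
}

def route_webhook(body: str) -> str:
    """Parse an incoming webhook body and dispatch it to the matching handler,
    returning a chat-ready message (e.g. to forward to Discord or Slack)."""
    payload = json.loads(body)
    handler = HANDLERS.get(payload.get("event"))
    if handler is None:
        return "ignored: unknown event"
    return handler(payload)

# Example: a made-up reopened-crash payload becomes a one-line alert.
msg = route_webhook(json.dumps({
    "event": "reopened-crash-or-exception",
    "title": "NullReferenceException in Enemy.Update",
}))
print(msg)  # Regression: NullReferenceException in Enemy.Update
```

In practice this function would sit behind a small HTTP server that receives the POSTs and forwards the formatted message to your team’s channel of choice.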

>access_file_
1499|blog.unity.com

Show the world what you can do with 2D

There’s no better opportunity to turn to a new page of your notepad and start sketching your game ideas, because we are celebrating the recently released new 2D tools!

The new tools open up a world of possibilities. You can import vector graphics to reduce sprite sizes and scale to any resolution, make pixel art with the retro-gaming camera component, build organic and efficient environments with SpriteShape, use new types of tilemaps, rig sprite animation in 2D… we would love to see what you can create using these tools in your 2D demo.

Join the Challenge

The new suite of 2D tools can spark new game ideas, since you are no longer restricted to traditional 2D sprites for your game, side-scrolling or top-down grids. Even better, you can create your own tools using the new 2D tools’ APIs.

We’re looking for demos making great use of the new 2D tools. You can focus on original game mechanics, gorgeous graphics, or, even better, a mix of all of those. Just make us say wow!

You must use at least one of the new 2D features in a prominent way in your project, but you don’t need to use all of them.
Here’s a list, with links to resources:

- 2D Animation
- 2D Inverse Kinematics
- 2D Cinemachine
- 2D Tilemaps (square, hexagonal, or isometric)
- 2D SpriteShape
- 2D Pixel Perfect
- Vector Graphics (this package includes an SVG importer and generic vector graphics APIs)

Starting on 23rd October 2018 (during Unite LA’s keynote) and until 17th December 2018, you can submit your game projects on Unity Connect. The project can’t have been published before or use copyrighted material. For the submission, you should send a visual presentation of the project and some insight into the making of it (sketches, clips of working in the editor, visual explanations with GIFs of how you created an interesting part of the project…).

Remember to use the hashtag #Unity2DChallenge when submitting your project, or while working on it and sharing your progress on social media; we would love to see what’s being cooked.

A 1st prize of $2000, a 2nd prize of $1000, and a 3rd prize of $500 will be awarded to the best projects submitted, as chosen by our judges. You can also opt for a special prize of $500 if you create an original editor tool that helps you create your game using the new 2D tools’ APIs.

The judges:

- Kenney Vleugels - kenney.nl & pixeland.io
- Angelos & Nick - Pixel Reign
- Ciro Continisio - Unity Evangelist
- Andy Touch - Unity Evangelist
- Rus Scammell - Unity Technical Product Manager, 2D
- Peter Lee - Unity Art Director, Content Team

Are you new to Unity and need something to trigger your dormant game-creation passion? YES. Are you a 2D artist who wants to put your skills to the test (and show off)? YES. Are you a Unity fan eager to try the new Unity features? YES. Are you an experienced developer busy working on complex projects? YES (have some fun on the side!). Are you on your own? Perfect. Are you a team of 4 friends? Perfect, but no more, please.

Get started with Unity for 2D: https://unity3d.com/learn/tutorials/s/2d-game-creation
New 2D Game Kit: https://unity3d.com/learn/tutorials/s/2d-game-kit
Unity 2D documentation manual: https://docs.unity3d.com/Manual/Unity2D.html
Nine 2D tools to make your life easier: https://unity3d.com/how-to/highlights-from-end-to-end-2D-toolset

>access_file_