Royalty Free Music for Game Developers
If you found this page, you probably wondered at some point, "Why is it called Royalty Free Music when none of it is FREE?!" Maybe you have already learned that Royalty Free does not necessarily mean the content itself is free; rather, it refers to the "right to use copyrighted material without the need to pay license fees or usage fees."
To put it simply, Royalty Free just means you don't have to pay to use the content. So why do so many people charge for it, and isn't that exactly the opposite of what Royalty Free means? Obviously artists and content makers who spend a lot of effort making something should be rewarded for their time, but in my opinion, they should not use the term "Royalty Free" to refer to their paid content.
That is why I developed a FREE Royalty Free Music for YouTube Library for anyone and everyone looking for free, professional music to use in their films, television shows, YouTube promos, video games, you get the picture.
How to Download
To begin, simply choose a category of music from the list on the website linked above:
Once you have selected your desired category of music, hover your mouse or finger over it and click the red button that appears to open up a small window with a media player in it that will look something like this:
Click a song to listen to it, and click the down arrow to download it. It's that easy!
Licensing & Copyright Information
If you like a song and want to use it in a non-commercial project (something that does not earn you money), you can do that for free simply by crediting me somewhere in your credits! Just put "Music by Jordan Winslow on https://jordanwinslow.me/royaltyfreemusic" or simply "Music Downloaded from https://jordanwinslow.me/royaltyfreemusic"
If you would like to use the music in a commercial project (something that earns you money) there is still no charge, but one more step: simply fill out the Royalty Free Music Commercial License Request Form.
Downloading Free Royalty Free Music for your YouTube creations has never been so easy!
Epic Games has launched Unreal Engine Online Learning, a new learning platform for Unreal Engine 4 users. The video tutorials illustrate common workflows in great detail to help creators across a range of industries master Unreal Engine for their own projects.
Unreal Engine Online Learning content is split into several tracks:
Media and Entertainment
Content is also sorted by job roles and engine features. Videos are available anytime, on demand.
Unreal Engine Online Learning includes a lot of the great video content users have seen on the Unreal Engine website in the past, plus dozens of new videos on common workflows, the latest features and much more.
To access Unreal Engine Online Learning, use the Video Tutorials option under the Learn tab at unrealengine.com.
Chaos Group has released the open beta for V-Ray Cloud, a push-button cloud rendering service for artists and designers running on Google Cloud Platform. V-Ray Cloud can turn a workstation into a supercomputer, unlocking the unlimited power of the cloud to help users hit deadlines and render faster than ever before. V-Ray Cloud will be free to use throughout the beta.
V-Ray Cloud removes the hardware barriers that keep most cloud rendering users at bay, simplifying the process for the everyday artist. Soon, with a click of a button, users will be able to render an entire animation in the time it takes to render a single frame, without having to track assets, juggle licenses or set up virtual machines. V-Ray Cloud will also be integrated into all V-Ray products, bringing a rapid speed boost to artists and designers in a pinch.
“V-Ray Cloud will be there whenever you need it, as a natural extension to your workflow,” said Boris Simandoff, V-Ray Cloud Director of Engineering at Chaos Group. “As needs arise, just submit a job and we’ll do the rest. It’s really simple.”
V-Ray Cloud Features:
Google Cloud Power – V-Ray Cloud is currently deployed on the Google Cloud Platform, providing a highly secure, scalable system for on-demand, high-performance rendering.
Smart Sync – Automatic scene updates for new changes, ensuring fast upload times and renders.
Smart Vault – An optimized storage system keeps assets in the cloud for use in future projects.
Remote Control – Job settings can be changed and resubmitted from any device, without opening the scene.
V-Ray Cloud is now open to all beta testers. Apply now to try free cloud rendering for a limited time.
Chaos Group will be demoing V-Ray Cloud at SIGGRAPH 2018 in the Google booth (#1423) on August 14 from 3:00 p.m. to 6:00 p.m. and on August 16 from 9:00 a.m. to 12:15 p.m.
The Khronos™ Group, an open consortium of leading hardware and software companies creating advanced acceleration standards, announces updates to key standards and opens the Khronos Education Forum at SIGGRAPH. With various Khronos events throughout the week, including a day of Birds of a Feather (BOF) sessions and its annual networking reception, Khronos is accelerating the development of open standards ecosystems and continuing its commitment to the SIGGRAPH community of interactive graphics professionals. At SIGGRAPH, Khronos will talk about the following standardization developments and initiatives:
First public demonstrations of the OpenXR standard for portable AR and VR
At SIGGRAPH 2018, we’re excited to show the first public demonstrations of OpenXR™, using two prototype implementations being used to exercise the specification before finalization and release. Epic’s ‘Showdown’ VR demo will be running portably across StarVR HMDs and Microsoft Windows Mixed Reality headsets using the OpenXR APIs via an Unreal Engine 4 plugin. No changes to the application are necessary to run across these diverse devices, which proves the portability of OpenXR and shows it in action. Learn more about how the OpenXR specification is being developed in the detailed OpenXR blog released today.
NNEF 1.0 Specification Released
Khronos is announcing the ratification and public release of the NNEF™ 1.0 (Neural Network Exchange Format) specification. After gathering feedback from the industry review of the provisional specification, Khronos is releasing NNEF 1.0 as a stable, flexible, and extensible open standard for hardware manufacturers to reliably deploy optimized, accelerated neural network inferencing onto diverse edge devices. Together with this release, an ecosystem of tools is now also available on the Khronos NNEF Tools repository, including an NNEF parser and converters from TensorFlow and Caffe. Importers into popular inferencing environments, including Android’s Neural Network API (NNAPI) and Khronos’ OpenVX™, are also being developed. The NNEF 1.0 specification and documentation are freely available on the Khronos website, with more detailed information in today's official NNEF 1.0 press release.
glTF Ecosystem Grows; New Extensions Released
glTF™ continues to gain strong industry momentum as the open standard 3D transmission format with support from major players, including Microsoft, Google, Facebook, Adobe, Epic, Unity, and from a vibrant grassroots open-source community. In addition to the recent extensions for Google Draco mesh compression to reduce file size and for unlit materials for mobile performance and photogrammetry use cases, the glTF working group has just released a texture transform extension to facilitate texture atlases. The glTF working group is now working on a significant extension for texture transmission, using a universal super-compressable texture format that is compact to transmit and can be efficiently transcoded to a range of GPU-accelerated texture formats at high-quality levels. The industry is invited to provide requirements and feedback on the texture transmission format via GitHub.
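The texture transform idea above, applying an offset, rotation and scale to UV coordinates so several images can share one atlas, can be sketched roughly as follows. This is a minimal illustration only; the struct and function names are invented here, and the exact conventions (rotation direction, transform order) in the real KHR_texture_transform extension may differ:

```cpp
#include <cmath>

// A UV coordinate pair, as used to sample a texture.
struct UV { float u, v; };

// Map a UV coordinate into a sub-region of a texture atlas by
// scaling, then rotating, then offsetting it.
UV TransformUV(UV uv, float scaleU, float scaleV,
               float rotation, float offsetU, float offsetV) {
    const float su = uv.u * scaleU;   // scale first
    const float sv = uv.v * scaleV;
    const float c = std::cos(rotation);
    const float s = std::sin(rotation);
    // Rotate, then translate into the atlas region.
    return { c * su + s * sv + offsetU,
             -s * su + c * sv + offsetV };
}
```

With a scale of 0.5 and an offset of 0.25, for example, the full UV square [0,1] is mapped into the centered quarter of the atlas.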
Call for Participation - Education Forum Opens for Public Contributions
The Khronos Education Forum is launched today to provide an open-to-all platform on which to openly share, coordinate and collaborate on course materials for Khronos standards. Khronos will manage an Open Educational Resource hub so that educators can upload materials and receive feedback and ideas from the education community and Khronos working groups. All course materials within the Education Forum will be freely available to support educators around the world who are teaching curricula that include Khronos specifications.
The Khronos Group is hosting educational sessions and networking events this week at SIGGRAPH, including a full-day of BOF sessions with talks from diverse members and developers.
Khronos BOF Sessions on glTF, WebGL, OpenXR, Vulkan, and OpenGL: On Wednesday, August 15 starting at 9 a.m., join Khronos for a full day of sessions covering Khronos standards and ecosystem updates. The BOFs will take place at the Marriott Pinnacle, a few minutes from the convention center.
Khronos Networking Reception: Everyone at SIGGRAPH is invited to join Khronos for refreshments and conversation at the annual networking reception sponsored by members NVIDIA, LunarG, Cesium, and AMD. Many Khronos BOF speakers and members will be at the reception to discuss Khronos standards, implementations, tools and trends!
glTF Ecosystem Forum Meetup: Khronos will also be hosting a glTF Ecosystem Forum meetup at SIGGRAPH on Thursday, August 16. If you are interested in attending, please send an email to firstname.lastname@example.org with your name, company, and interest in glTF for an invitation - spaces are limited.
Today, Vicon announces the release of “Origin,” a comprehensive location-based virtual reality (LBVR) system that blends Vicon’s unmatched, Academy Award-winning tracking abilities with tools that make it easy for anyone to set up and operate. Building on nearly 35 years of industry-leading R&D that’s made it the leader in engineering, healthcare and motion capture for blockbuster films, Vicon’s new fully scalable solution is designed to empower VR creators big and small by offering unparalleled quality and reliability.
“The location-based VR industry is poised to explode, but for fans and developers to really embrace it, the experience needs to be flawless,” said Dreamscape Immersive co-founder and CTO Caecilia Charbonnier. “When Vicon first approached us about working with them on Origin we demanded the highest level of precision, and Vicon did not disappoint.”
Origin includes three brand new pieces of innovative hardware, along with software created specifically for Vicon’s new LBVR system, all curated to meet current and future market demands. Following three years of development working alongside creators like Dreamscape Immersive, Origin contains everything a commercial enterprise would need to drive a fully immersive experience. Auto-healing software capable of repairing calibrations between sessions, paired with tracking that never fails, also ensures that the system requires minimal training and maintenance in order to operate perfectly every time.
The Origin suite consists of the following:
Viper – A compact, lightweight tracking camera specifically designed to work with active marker technology.
Pulsar – Wearable tracking clusters that emit unique, active infrared LED patterns synchronized to ensure optimal battery life, designed to easily attach to a participant’s body, limbs and head-mounted displays.
Beacon – Creates a synchronized wireless network connecting to Pulsar clusters (or other devices), allowing them to communicate with Viper cameras and provide seamless connectivity.
Evoke – A highly automated software platform featuring unbreakable tracking and auto-healing between sessions; a fully featured API, seamless game engine integration and the ability to run Evoke entirely in the background result in a flawless capture experience that anyone can operate.
With Origin, multiple participants can appear simultaneously as characters in the LBVR environment, each with fully animated avatars driven by Evoke’s real-time tracking. Participants wear the Pulsar clusters on their body, limbs and head-mounted displays, contributing to creating a fully immersive experience where their every movement is recreated in the virtual world. Participants can also interact with others and react to them based on their movements. Tools and props can utilize clusters as well, allowing them to be passed between participants to increase the immersion.
“We have decades of experience creating the most precise and robust optical motion capture systems in the world, so the expansion into LBVR was natural,” said Vicon CEO Imogen Moorhouse. “Origin has been perfected over the last few years, built upon the foundation of our proven solutions, and designed to meet the current and future needs of this exciting industry.”
Origin will be on display at SIGGRAPH 2018, August 14-16, as part of Vicon’s booth presentation. Attendees can stop by booth #1239 and see the system in action thanks to an interactive demo created and showcased by Dreamscape Immersive. Demos will run from 10:00 a.m. to 6:00 p.m. PDT.
At this year's SIGGRAPH Conference in Vancouver, British Columbia, Epic Games will be demonstrating the power of its Unreal Engine technology to drive the most advanced real-time workflows across film and TV, VR/AR, games, architecture, automotive, product design, marketing and manufacturing.
Throughout the conference Epic will be highlighting breakthroughs from customers such as Ford, Technicolor and Zaha Hadid to share practical workflow tips and best practices with the user community. With continuous theater talks in the Unreal Engine booth (1401), a full day of Unreal tech talks, and exciting reveals during Real-Time Live!, SIGGRAPH attendees from all industries will be able to see the latest developments in Unreal Engine.
The schedule of Epic's sessions and events at SIGGRAPH 2018 is available here: https://www.unrealengine.com/en-US/events/siggraph-2018
Kicking things off on Tuesday, August 14 at SIGGRAPH's must-attend Real-Time Live! competition, Epic will be showing a new version of the "Reflections" demo, first revealed at this year's Game Developers Conference in partnership with ILMxLAB and NVIDIA. Set in the Star Wars universe, the demo showcases techniques for virtual production as well as photorealistic real-time ray tracing. At SIGGRAPH, Epic and ILMxLAB will use virtual reality and Unreal Engine's Sequencer cinematic editor to film a live character performance within 'Reflections,' edit the sequence and play the new short film featuring real-time ray tracing at 24fps.
Epic will also be leading Unreal Tech Talks on Wednesday August 15 in Meeting Room 16, Vancouver Convention Center East Building, featuring presentations on real-time production, live performance motion capture, real-time ray tracing, virtual production, and more. The schedule is as follows:
9:30-10:30am: Real-Time Raytracing Advances in Unreal Engine: Explore the latest in raytracing advancements in Unreal Engine, including a deep look at our new path-tracer and progressive lightmapper.
11:00am-12:00pm: Fortnite - Advancing the Animation Production Pipeline: Understand new improvements in UE4 to dramatically streamline your animation pipeline, including Sequencer improvements, Shotgun integration, and much more.
12:30-1:30pm: Real-Time Motion Capture in Unreal Engine: Learn how Unreal Engine is helping to bring the power of real-time motion capture to film, games, theater, and beyond.
2:00-3:00pm: Virtual Production with Unreal Engine 4.20: Take a closer look at how new tools in Unreal Engine 4.20 are enabling a new wave of production workflows that integrate real-time VFX with live action footage.
3:30-4:30pm: Mixed Reality Production Using Unreal Engine 4.20: Learn more about the pipeline and processes for designing immersive mixed-reality experiences with live 3D characters.
In addition, Epic will host a series of theater talks on the SIGGRAPH expo floor in booth 1401 covering diverse customer use cases, from creating location-based VR experiences, to leveraging mixed reality tools for live broadcast, to photorealistic architectural visualization. Select theater talks include:
Taking The Weather Channel into the Mixed Reality Future: Join The Future Group for a behind-the-scenes look at how they used Unreal Engine to help an AR tornado destroy The Weather Channel's studio during a live broadcast. Presenters: Lawrence Jones and Justin LaBroad of The Future Group.
Creating ILMxLAB's Location-Based VR Experiences in UE4: Get an exclusive look at how ILMxLAB used Unreal Engine to design leading location-based VR experiences including the Academy Award-winning 'Carne y Arena' and 'Star Wars: Secrets of the Empire.' Presenter: Mohen Leo of ILMxLAB.
Unreal Collaborative Design: Crowdsourcing 3D Assets for Immersive Experiences: Learn about Technicolor's upcoming 'HP Mars Home Planet' VR experience and how they were able to populate a large environment with a variety of crowdsourced CAD data. Presenters: Brian Frager and David Witters of Technicolor.
Creating 3D Virtual Driving Environments for Simulation: Learn how Ford uses Unreal Engine-powered environments to simulate real world scenarios for autonomous vehicle testing, resulting in the reduction of physical road testing. Presenter: Ashley Micks of Ford.
Unreal + Shotgun: Real-Time Production Pipelines: Take a deep dive into the new Unreal/Shotgun integration features available in Unreal Engine 4.20. Enable deeper and richer pipeline development capabilities, with an eye toward maximizing efficiency in real-time workflows. Presenter: Ryan Mayeda of Epic.
The New 3R's: Real-Time Rendering Revolution: Explore high-level examples of how real-time engines are transforming processes in manufacturing, design, architecture, and entertainment. Presenter: Ken Pimentel of Epic.
Lastly, Epic will also host an Unreal Engine customer event on Wednesday, August 15 at 6:00pm, at the Pinnacle Hotel Harbourfront. This casual cocktail gathering provides a great networking opportunity and chance to connect with other creative minds in the professional user community.
For more details on Epic's presence at SIGGRAPH and the latest schedules for theater talks, visit: https://www.unrealengine.com/en-US/events/siggraph-2018.
I have prepared a video tutorial that explains how you can make a game in 5 minutes with LGCK builder. This short video is a quick introduction to this amazing tool, which can be used to make any kind of 2D game.
Okay, you have watched the video and still have questions. Maybe you want to create a top-down 2D map. There is an option to disable gravity on each level: simply right-click your level in the Toolbox and select Edit Level. On the Advanced tab, there is a "no gravity" checkbox.
There are two main requirements for a level to work. First, you must include a Player Sprite. This is easily solved by creating one with the Sprite Wizard and dragging and dropping your player into your newly created level.
Second, you need at least one goal. The easiest way to do that is to use the Sprite Wizard and select Object as the sprite type. There is a checkbox called Automatic goal; checking it will mark any instance of this sprite as an objective counting toward the completion of the level.
Objects die when they are touched by the player, and you can set several options, including points, health rewards and a sound to play, by right-clicking your object and selecting Edit Sprite. If you didn't check Automatic goal, you can always right-click any of those sprites in your level and select Customize in the context menu. Once you are in the property box, just check the Goal attribute.
If you don't have sprites of your own, LGCK builder comes with some in the tutorial folder. It also comes with a sample demo game that highlights some of the engine's capabilities and shows how to use the event model to do some neat tricks (breakable bridges, animated lava, squares that change color when you step on them, and so on). You can find more sprites and sounds here: https://sourceforge.net/projects/lgck/files/Resources/
Please note that many of the sprites you can find on the web come in the form of sprite sheets. LGCK builder can import most of them with minimal intervention. Go to File, Import, Import Images. Click the + button to add a new PNG file. Once the sprite sheet is in the box, right-click it and select Split image from the context menu. You'll be given the choice of picking the right tile size (16x16, 24x24, 32x32, etc.). If you need to edit this image set further, the LGCK builder IDE comes with its own Sprite Editor.
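Conceptually, splitting a sprite sheet just means cutting the image into a grid of equal-sized squares. The sketch below shows the idea; it is an illustration of the general technique, not LGCK builder's actual implementation, and the names are assumptions:

```cpp
#include <vector>

// One tile's position and size within the sheet, in pixels.
struct TileRect { int x, y, w, h; };

// Compute the grid of tile rectangles for a sheet of the given
// dimensions and a square tile size (e.g. 16, 24, or 32).
// Partial tiles at the right/bottom edges are discarded.
std::vector<TileRect> SplitSheet(int sheetW, int sheetH, int tile) {
    std::vector<TileRect> tiles;
    for (int y = 0; y + tile <= sheetH; y += tile)     // rows, top to bottom
        for (int x = 0; x + tile <= sheetW; x += tile) // columns, left to right
            tiles.push_back({x, y, tile, tile});
    return tiles;
}
```

A 64x32 sheet split at 16x16, for instance, yields a 4x2 grid of eight tiles.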
You can download LGCK builder from freewarefiles:
You can find more information at the official site:
Ziva Dynamics launches Ziva VFX 1.4, a major update to its character creation technology that adds five new tools for production artists. Thanks to insights from the community and key customers like Sony Pictures Imageworks and Scanline VFX, Ziva VFX now helps creators apply real-world physics to even more parts of the character creation process, including muscle growth, tissue tension and natural elements like heavy winds and water pressure.
“Ziva’s character tools have really worked out for us,” said John Hughes, Founder of Rhythm & Hues and Tau Films. “The updates speed up simulations and give artists even more control and creative flexibility. Our creature pipeline is really hitting its stride.”
Already in use at Double Negative and Rising Sun Pictures, Ziva's groundbreaking technology fundamentally changes the character creation process by combining the effects of real-world physics with the rapid creation of soft-tissue materials like muscles, fat and skin. By mirroring the fundamental properties of nature, users can produce CG characters that move, flex and jiggle just as they would in real life, removing difficult steps from the rigging process.
With External Forces, users are able to accurately simulate how natural elements like wind and water interact with their characters. Making a character’s tissue flap or wrinkle in the wind, ripple and wave underwater, or even be attracted to or repelled by a magnetic field can all be done quickly, in a physically accurate way.
New Pressure and Surface Tension properties can be used to “fit” fat tissues around muscles, augmenting the standard Ziva VFX anatomy tools. These settings allow users to remove fascia from a Ziva simulation while still achieving the detailed wrinkling and sliding effects that make humans and creatures look real.
When it’s time to adjust muscles, Muscle Growth can rapidly increase the overall muscle definition of a character or body part without having to remodel the geometry. A Rest Scale for Tissue feature has also been added, letting users grow or shrink a tissue object equally in all directions. Not only do these tools improve collaboration between modelers and riggers, but they also increase creative control for independent artists.
Ziva VFX 1.4 also marks the introduction of Ziva Scene Panel, a frequent request from advanced users. Now, artists working on complex builds can visualize their work in an effortless, compact way. Ziva Scene Panel’s tree-like structure shows all connections and relationships between an asset’s objects, functions and layers, making it easier to find specific items and nodes within a Maya scene file.
“With Ziva VFX 1.4, artists are able to make faster changes at every stage of the design process, in a more intuitive way,” said James Jacobs, Co-CEO of Ziva Dynamics. “As the need for realism grows, artists are constantly being asked to break the mold. With Ziva VFX, they can.”
At SIGGRAPH 2018, Ziva Dynamics will be showing Ziva VFX 1.4 and character results from blockbuster films in the Intel Booth (#1300). To schedule a demo or meeting at the show, please contact: email@example.com.
For a free 60-day trial, click here.
We are proud to announce that the release of the PocketArena SDK for HTML5 games is fast approaching. Compatible with HTML5 games, Unity games and Cocos2D JS games, it will empower you with all the tools and features that come with the PocketArena ecosystem, incentivising and rewarding the users of your game(s) through the use of PocketArena Tokens (POC). Join our Telegram group today to try out some PocketArena games and see the benefits firsthand. Please feel free to ask us any questions you may have. https://t.me/emojigames
Simple UI Design Widget is a user interface set for your UE4 projects, created entirely with Blueprint classes. Simple UI Design UMG contains various user interface styles, consisting of common popups, scrolls, text inputs, slides, radio buttons, drop down boxes, toggle buttons and check boxes.
- It is created entirely with Blueprint classes
- Full mouse and keyboard key binding
- Common popup, scroll, text input, slide, radio button, drop down box, toggle button, check box style UI
- Accept, default and cancel button functions
- Responsive UI design
- UI sound effects ( common popup, warning popup, success popup and click )
- More than 40 high resolution images
Fishing Basics is a simple fishing system for Unreal Engine 4 (UE4). It is created entirely with Blueprint classes, and developers can easily customize them. In this project, you can cast a hook and line into the water at the desired distance and direction and reel the line in. AI fish behave in Roam and Chase states, and the fishing rods are animated into Idle, Bite and Hook states depending on the situation.
- A user can spool a spinning reel, which has a transform animation
- A user can cast a hook and line into the water at the desired distance and direction
- Two AI Fish modes (Roam and Chase)
- Three Fishing Rod modes (Idle, Bite, Hook)
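The two-state fish AI described above is essentially a small state machine driven by proximity to the bait. The sketch below illustrates the idea in plain C++; the actual pack implements this in Blueprint, so the names, the distance-based trigger and the detection-radius parameter here are all assumptions for illustration:

```cpp
#include <cmath>

// The two behaviour states the AI fish can be in.
enum class FishState { Roam, Chase };

// Pick the fish's state from its distance to the bait: chase when the
// bait is within detection range, otherwise keep roaming.
FishState UpdateFishState(float fishX, float fishY,
                          float baitX, float baitY,
                          float detectRadius) {
    const float dx = baitX - fishX;
    const float dy = baitY - fishY;
    return std::hypot(dx, dy) <= detectRadius ? FishState::Chase
                                              : FishState::Roam;
}
```

The rod's Idle/Bite/Hook animation states could be driven the same way, switching on events such as a fish entering Chase or striking the hook.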
Marketplace Page : https://www.unrealengine.com/marketplace/fishing-basics
Continuing its update-filled summer, Allegorithmic announced the latest additions to Substance Painter. The Summer Update introduces subsurface scattering (SSS), a feature prized by some of the biggest VFX and film companies in the world, along with new projections and fill tools, enhancements to the UX and support for a host of new meshes. The update is available now.
VFX, game studios and individuals working at a professional level are constantly looking for ways to create realistic lighting effects that help them separate the good from the great. Using Substance Painter’s newly updated shaders, they will be able to add subsurface scattering as a default option. Artists can simply add a Scattering map to a texture set and activate the new SSS post-effect. Skin, organic surfaces, wax, jade and any other translucent materials that require extra care will now look even more realistic, with redistributed light shining through from under the surface. And thanks to direct Iray integration, artists can do it all without leaving Substance Painter.
The release also includes updates to projection and fill tools, beginning with the user-requested addition of non-square projection. Images can be loaded in both the projection and stencil tool without altering the ratio or resolution. Those projection and stencil tools can also disable tiling in one or both axes. Fill layers can be manipulated directly in the viewport using new manipulator controls. Standard UV projections feature a 2D manipulator in the UV viewport. Triplanar Projection received a full 3D manipulator in the 3D viewport, and both can be translated, scaled and rotated directly in scene.
Along with the improvements to the artist tools, the Substance Painter Summer Update includes several updates designed to improve the overall experience for users of all skill levels. Substance Suite users will see an immediate benefit thanks to improved consistency between tools. Additions like exposed presets in Substance Designer and a revamped, universal UI guide make it easier than ever to jump between them.
“While the Substance Painter team has been working on this update for more than four months, we’ve been working toward subsurface scattering for a long time,” said Sebastien Deguy, CEO and founder of Allegorithmic. “This addition will open up a new range of options for users of all skill levels, and ensure that they have all the tools they need to expand into new levels of realism.”
Additional updates include:
Alembic Support – The Alembic file format is now supported by Substance Painter, starting with mesh and camera data. Full animation support will be added in a future update.
Camera Import and Selection – Multiple cameras can be imported with a mesh, allowing users to switch between angles in the viewport; previews of the framed camera angle now appear as an overlay in the 3D viewport.
Full glTF support – Substance Painter now automatically imports and applies textures when loading glTF meshes, removing the need to import or adapt mesh downloads from Sketchfab.
ID Map Drag-and-Drop – Both materials and smart materials can be taken from the shelf and dropped directly onto ID colors, automatically creating an ID mask.
Improved Substance Format Support – Improved tweaking of Substance-made materials and effects thanks to visible-if and embedded presets.
The Substance Painter Summer Update is available now at no cost to Substance subscribers. Attendees at SIGGRAPH 2018 will also be able to see live demonstrations of the latest features for all Substance tools at Allegorithmic’s booth #415.
Pricing/Availability
The new update to Substance Painter is available today. Following the 30-day trial period, individual users will be able to subscribe to the Substance Indie or Pro plans. Substance Painter is also available for individual-license purchase, which includes 12 months of maintenance. Subscriptions to Substance Indie cost $19.90/month; Pro plans cost $99.90/month. Enterprise and education pricing is available upon request. Students and teachers can request a license at no cost.
Unity Technologies (https://unity3d.com/), creator of the world’s leading real-time 3D development platform, announced today a strategic partnership with Google’s mobile advertising business, AdMob, that will change the way advertisers reach gamers on the go and help mobile game developers monetize their apps. With more than 50 percent of all new mobile games made on Unity, this partnership gives Google advertisers access to Unity’s extensive network of mobile gaming titles through Universal App campaigns, helping advertisers reach consumers around the world on both Android and iOS. Unity’s scale and expertise as the go-to engine behind mobile games combined with Google’s global brand leadership gives advertisers a seamless, integrated way to engage high-value audiences like never before.
“Every day, nearly 9 billion minutes of time is spent by consumers engaging with mobile game content built with Unity. This remains a largely untapped opportunity for performance and brand advertisers. Advertisers can win users' mindshare, drive performance, and scale campaigns to new heights by tapping into this valuable global audience,” said Julie Shumaker, Vice President of Advertiser Solutions at Unity Technologies. “By partnering with Google, we are unlocking access to this powerful channel and providing advertisers the opportunity to drive value with high engagement ad experiences.”
To date, the partnership has resulted in video completion rates of 87.3 percent compared to the overall average of 32.5 percent with publishers such as NHN Entertainment, Yodo1 and Doodle Mobile based on internal Unity reports. The Unity ad platform is one of the largest and most powerful in the world, reaching 1.5 billion devices and serving more than 9.4 billion ad impressions on a monthly basis. Unity has consistently scored 97.8 percent for valid and viewable rates, well above in-app benchmarks of 54.7 percent, providing a rich environment for advertisers while also creating value for developers and gamers alike.*
“Google is committed to helping drive the mobile gaming ecosystem forward, and we’ve worked with Unity for several years on this advertising integration -- a key example for how we enable advertisers to better reach consumers around the world,” said Sissie Hsiao, VP of Mobile Advertising at Google. “We’re especially excited about connecting advertisers with new gamers through user-first rewarded ad experiences, where rewards like new levels or extra power-ups complement the game experience.”
The Xenko game engine is now available as open source under the permissive MIT license, with a completely free runtime and editor. The announcement was made on the Xenko blog with the release of Xenko 3.0.
In addition to the open source transition, Xenko 3.0 includes a new C# project system and video, hair, and skin rendering. You can view the full release notes here.
Silicon Studio no longer supports Xenko, but members of the Xenko team will continue to work on it independently with the community.
These samples are royalty-free and can be used in television and radio broadcasts, film or video productions, video games, web promos, music tracks, commercial audio and/or visual productions. You can download the free sample pack at Bluezone Corporation (on the main page, scroll to bottom, registration not required): www.bluezone-corporation.com
UBM Tech Game Network, organizers of the Game Developers Conference® (GDC) 2019, is now accepting submissions for the 21st annual Independent Games Festival (IGF), the longest running festival, summit and showcase of independent games.
The IGF is part of the Game Developers Conference 2019, which will take place from March 18th through March 22nd at San Francisco’s Moscone Convention Center. Entries for all of the IGF categories will be accepted now through October 1, 2018, with finalists announced in early January 2019.
See GameDev.net's coverage of the IGF 2018 here and full GDC 2018 coverage here.
Finalists will compete for over $50,000 in prizes and their games will be playable at the packed IGF Pavilion on the GDC 2019 Expo Floor. The award categories include Excellence in Visual Art, Audio, Design and Narrative, each of which will have six finalists, with the winner in each category receiving $3,000. Additional awards include the Best Student Game award ($3,000 prize) which will recognize student developers, the special Nuovo Award ($3,000, eight finalists) which honors the title that 'makes the jurors think differently about games as a medium', and the IGF Audience Award ($3,000 prize) which will be decided by a public vote from all of the competition finalists. Finally, six finalists will be selected for the $30,000 Seumas McNally Grand Prize.
Returning for the third time is the alt.ctrl.GDC Award ($3,000), an affiliated award to be given out during the IGF ceremony, honoring intriguing and inventive games using unique one-of-a-kind controllers. Finalists and the winner for this award will be picked from entries into the popular on-site GDC 2019 exhibit of the same name.
Winners will be announced on stage at the high-profile Independent Games Festival Awards on Wednesday, March 20th, 2019, at the Moscone Center in San Francisco, with the IGF Pavilion open from March 20th-22nd, and the sister Independent Games Summit event taking place on March 18th and 19th.
The Independent Games Festival is the longest-running and highest profile independent video game festival, summit and showcase. It has served as a springboard for several games that have gone on to become critical and cultural hits. Previous IGF prize winners from the past 20 years include Night in the Woods, Quadrilateral Cowboy, Her Story, Papers, Please, Spelunky, Minecraft, Limbo, World of Goo, Braid, Castle Crashers, Everyday Shooter and many more of the game industry's breakthrough independent titles. IGF continues to be the largest annual gathering of independent video game developers, showcasing top talent across the industry and keeping a pulse on the future of independent games.
Submissions to the competition are now open to all independent game developers. Important dates for IGF 2019 are as follows:
July 30, 2018 - Submissions are Open
October 1, 2018 - Submission Deadline
Early January, 2019 - IGF Finalists Announced
March 18 - March 22, 2019 - Game Developers Conference 2019
March 18 - March 19, 2019 - Independent Games Summit @ GDC 2019
March 20 - March 22, 2019 - IGF Pavilion @ GDC 2019
March 20, 2019 - IGF Awards Ceremony (Winners announced)
This year, Independent Games Festival organizers are particularly interested in receiving submissions from those making interesting or experimental works who might otherwise not enter because of the fee. Under-represented creators interested in applying for a waiver of the full ($75) entry fee can find additional details and the waiver application form at http://igf.com/submit-your-game. Waiver applications should be submitted by September 14th; if selected, applicants will be given the opportunity to enter IGF 2019 for free.
For more information on the 2019 Independent Games Festival, including submission specifics and frequently asked questions, please visit the official Independent Games Festival website. IGF entries can be submitted here.
For more information on GDC 2019 in general visit the show’s official website, or subscribe to regular updates via Facebook, Twitter, or RSS. Official photos are available via the Official GDC Flickr account: www.flickr.com/photos/officialgdc/.
Amazon has released Lumberyard Beta 1.15 with over 270 improvements. With the release, the Lumberyard team is focused on how Lumberyard integrates with AWS, enabling game developers of all sizes to connect their games and accelerate game development.
In this release they've added Cloud Gems, making it easier to build popular cloud-connected features using AWS. Cloud Gems let developers deploy AWS resources in a few minutes, saving the time spent manually configuring services from the ground up.
This video shows what Cloud Gems can do for developers:
Learn more from the Amazon blog post at https://aws.amazon.com/blogs/gametech/1-15/
Epic has released a sample project called the Action Role Playing Game (ARPG), which was built from the ground up to help developers learn how to use UE4 to create high-end mobile games for both Android and iOS. Within ARPG and its accompanying documentation, developers will find useful topics including how to utilize C++ and Blueprints together in a UE4 project, how to set up and use certain aspects of UE4's Ability system, and how to support multiple platforms such as Android, iOS, PC, Mac, and consoles.
It is available on the Unreal Engine Marketplace at https://www.unrealengine.com/marketplace/action-rpg.
Things to Consider:
Due to the complexity of UE4's Ability system, ARPG only utilizes a small subsection of the available features.
Certain aspects of this project also require a good understanding of C++.
Learn more at https://www.unrealengine.com/en-US/blog/learn-how-to-develop-high-end-mobile-games-with-the-action-rpg-sample-project.
The Indie Devs Nation is pleased to announce the release of version 2.0 of the acclaimed Colossal Game Music Collection on the Unity Asset Store: a truly massive update featuring more than 2 GB of new game music drawn from 20 five-star-rated music packs, raising the total size to an impressive 5 GB!
This new update took more than 6 months to finish and features diverse high-end music in such genres as Action, Emotional, Apocalyptic, Asian, Heavy Metal, Adventure, Sci-Fi, Casual, Piano and more!
While this collection was already the biggest and most complete of its type, the 2.0 version now makes it absolutely unrivalled in both size and quality!
The best news, however, is that not only has the collection remained at nearly the same price, but we are also running a special intro sale: 60% off for a very short period of time. Make sure to grab it while you can!
Asset Store link: https://www.assetstore.unity3d.com/en/#!/content/88190
Track previews: https://soundcloud.com/t-i-d-n-music/sets/colossal-game-music-collection
Here's the full list of packs included:
- Action Music Vol. II
- Adventure Music Vol. I
- Heavy Metal Vol. II
- Apocalyptic Music Vol. I
- Medieval Music Vol. I
- Sci-Fi Music Vol. II
- Asian Music Vol. I
- Emotional Music Vol. I
- Casual Music Vol. I
- Action Music Vol. III
- Piano Music Vol. II
- Fantasy Music Vol. I
- Horror Music Vol. II
- Heavy Metal Vol. I
- Sci-Fi Music Vol. I
- Chiptunes Vol. I
- Piano Music Vol. I
- Action Music Vol. I
- Rock Music Vol. I
- Horror Music Vol. I
Arcadia Library is a navigation solution for custom menus in Unity3D, allowing an easy flow between panels.
The plugin manages multiple menus, storing navigation history so users can return to previous panels and dispatching events for each menu state: loaded, shown, hidden, focused, and unfocused.
Supports Show and Hide functions.
Menus are separated into ‘groups’, which means you can initialize only the menus you need instead of keeping all of them in the scene at the same time.
A ‘layers’ concept: only one menu can be shown per layer at a time, so any currently active menus on the same layer are hidden automatically, keeping the user flow simple.
Also supports controller input, storing the last selected button/selectable in each menu to ease navigation between panels.
Default show and hide animations are included; animations can also be fully customized by the user.
The library is based on two classes:
Menu: each UI panel or menu inherits from this class.
MenuManager: manages all the menus in the game, offering methods to initialize, show, and hide them.
All menus can be loaded during game initialization or manually by group ID.
Menus are indexed by the type of the classes that inherit from Menu.
Each layer has a stack that stores all its opened menus. Closing a menu on a layer will automatically show the next menu on the stack, if any.
The layer for each menu is defined in the inspector view.
Buttons in a menu are loaded during initialization and are disabled only during the menu's hide animation.
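The per-layer stack behavior described above can be sketched in a few lines. This is an illustrative sketch only, in Python rather than Unity C#; the class and method names are invented for the example and are not Arcadia's actual API.

```python
# Minimal sketch of per-layer menu stacks: showing a menu hides the
# current top of that layer, and closing the top reveals the one below.
class MenuManager:
    def __init__(self):
        self.layers = {}  # layer id -> stack (list) of open menu names

    def show(self, menu, layer=0):
        # Push onto the layer's stack; the previous top is now "hidden".
        stack = self.layers.setdefault(layer, [])
        stack.append(menu)

    def close_top(self, layer=0):
        # Pop the visible menu and return the one revealed beneath it, if any.
        stack = self.layers.get(layer, [])
        if stack:
            stack.pop()
        return stack[-1] if stack else None


manager = MenuManager()
manager.show("MainMenu")
manager.show("Settings")        # MainMenu is hidden, Settings is on top
visible = manager.close_top()   # Settings closes, MainMenu reappears
print(visible)                  # MainMenu
```

The point of the stack is that back-navigation falls out for free: closing any menu restores whatever was underneath it on the same layer.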
More info about the library:
Confused by GDPR? What is the purpose of GDPR?
On this EnhanceMyApp podcast, sit in on our round-table discussion as our panel of experts (Eldad Ben Tora, CEO of KIDOZ; Eric Dawson, Advertising and Game Analysis for Concrete Software; Zachary C. Strebeck, Attorney at Law; Bruce Gustafson, CEO of Developer's Alliance) breaks down everything you need to know to become GDPR compliant!
This is one discussion worth checking out! Don’t forget, subscribe today! https://goo.gl/xUfD3r
Late last week Allegorithmic introduced the Substance Designer Summer Release, bringing with it several new user-requested features and a tool that will change Substance Designer forever. While artists will immediately note a revamped interface, the changes go far beyond aesthetics. Floating windows, additional nodes and performance enhancements increase the tool’s functionality today, while a new scripting API will soon grant users the ability to reshape Substance Designer to fit their unique needs.
The new API, based on Python (version 3.6.5) is the first step toward granting users the ability to control every aspect of Substance Designer through scripting. Today, users can access, save and export runtime data from the graph, introducing the ability for individual users or fully staffed studios to build their own customized quality checks. Users can also create their own plugins using the scripting tool, or modify existing plugins to meet individual needs.
Future updates will expand the API to include direct customization to a Substance Designer graph, as well as the introduction of full automation.
The Summer Release also features a redesigned UI containing the ability to organize and align nodes vertically, horizontally or snap them to the grid. Additional nodes allow for the conversion of 2D shapes to 3D using the Shape Extrude ability, while the Shape Splatter node creates new opportunities for spreading a pattern throughout a map.
Additional new features include:
Normal Map Intensity – Users can now control the intensity of a normal map
Normal Map Transform – Scale, rotate and offset a normal map while vector values are adjusted automatically
Quad Transform – Manipulate four control points on the corner to transform an image
Trapezoid Transform – A new way to distort an input
New Gradient Tools – Control points add new options
Improved Flood Fill – An updated version of the recent Flood Fill tool allows it to work with any shape, and can handle holes in patterns
“Today, we begin to turn our users' ideas for a more open Substance Designer into a reality,” said Sebastien Deguy, President and founder at Allegorithmic. “As the API evolves, they’ll be able to tailor Substance Designer to their unique needs, expanding the possibilities of what they create and how they do it.”
Along with the new features, the Summer Release includes improvements to a handful of existing features, starting with enhancements to the Material Transform node, which now processes normal maps more accurately. The release also introduces experimental in-context editing of subgraphs, where image and parameter inputs set on node instances are automatically injected into the subgraph when the user opens the reference. The update introduces faster auto-levels, as well as a significant improvement to bakers, which are up to five times faster than before.
To celebrate the Summer Release, Allegorithmic’s Integrations Product Manager, Wes McDermott, will host a livestream discussing the history of Substance Designer and the people behind it. The stream will be hosted on Allegorithmic’s YouTube page at 11 a.m. PST on July 26.
The new update to Substance Designer is available today at no cost to current subscribers. Following the trial period, individual users will be able to subscribe to the Indie or Pro plans. Subscriptions to Substance Indie cost $19.90/month; Pro plans cost $99.90/month. Substance Designer is also available for individual-license purchase, which includes 12 months of maintenance. Enterprise and education pricing is available upon request. Students and teachers can request a license at no cost.
BrashMonkey LLC is pleased to announce that pre-orders for Spriter 2, their 2D animation tool for game creators, are now available.
The recently released teaser-trailer for Spriter 2 can be seen here:
A driving goal in the development of Spriter 2 is to enable creators to achieve a level of rich, natural, and expressive form and movement rarely seen outside of hand-drawn animation, all in a uniquely fun interface that drastically speeds up the animation process.
Spriter 2 will offer many forms of mesh deformation, curve based and bone based hierarchies and animations, and a large array of video game specific functionality such as animated collision polygons, spawning points, advanced stackable character maps, and several features yet to be revealed.
To celebrate this pre-order milestone, BrashMonkey is running a 35 percent discount on any purchase until August 3rd when you use the coupon code SPRITER 2 in their online store.
As an added bonus, anyone who pre-orders Spriter 2 during this sale will also receive the full version of Spriter Pro (the current version of Spriter) immediately, free of charge, and will be granted private beta access to Spriter 2 sometime in 2018.
You can visit https://www.spriter2.com/ to learn more and subscribe for further updates.
Or visit https://brashmonkey.com to see their full range of products and access their forums.
What is Cloud Assets?
Cloud Assets is a simple, lightweight, yet incredibly effective library that lets you add dynamic content to your Unity games on every major gaming platform: PS4, Xbox One, Steam, Windows Store, iOS, Android, and even Nintendo Switch!
It is the first and only library in the Asset Store that supports adding dynamic content simultaneously to console, PC, and mobile games. It can be used to:
Cross promote your game with Video, Images, or Text Ads.
Show news to your gamers dynamically.
Display offers, new features, etc.
How does Cloud Assets work?
Cloud Assets is a Unity library that works with a single JSON file stored on a server to serve text, images, and videos for use in your projects. You can change your game's content without having to update the binary: just change the references to the assets in the JSON file, and your game content will be updated in seconds.
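To make the single-JSON-file idea concrete, here is a sketch of what such a manifest and a tag lookup might look like. The field names and URLs below are invented for illustration; they are not Cloud Assets' actual schema.

```python
import json

# Hypothetical manifest of the kind described above: one JSON file on a
# server listing text, image, and video assets, each identified by a tag.
manifest_text = """
{
  "assets": [
    {"tag": "news_banner", "type": "image",
     "url": "https://example.com/banner.png"},
    {"tag": "promo_text", "type": "text",
     "title": "New Update!", "description": "Version 2.0 is live.",
     "url": "https://example.com/update"}
  ]
}
"""

manifest = json.loads(manifest_text)

# Look up an asset by its tag, the way the library's tag feature suggests.
def find_asset(tag):
    return next(a for a in manifest["assets"] if a["tag"] == tag)

banner = find_asset("news_banner")
print(banner["url"])  # https://example.com/banner.png
```

Because the game only stores tags and fetches the manifest at runtime, swapping a banner or promo text is just an edit to this one file on the server; no client update ships.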
What features make Cloud Assets unique in the Asset Store?
Use it on any game platform (PS4, Xbox One, Nintendo Switch, Steam, Windows Store, iOS and Android)
Display text, image and/or video wherever you want inside your game.
Open external links in the platform web browser or the device store (mobile, consoles and PC)
Works with a simple JSON file stored on a server.
You don't need to be a programmer! Just change the URLs the file points to and add the text, image, and/or video that matches your needs.
Use tags to quickly identify your assets and replace them easily.
You can add a title, description, and URL to each text asset.
Useful links about the library:
- Unity Asset Store: https://www.assetstore.unity3d.com/#!/content/116697
- Game Troopers Website: http://gametroopers.net/
- Cloud Assets Tutorials: https://www.youtube.com/playlist?list=PLkZtvv5GUSSUVWeVhRJ1tb3MaoZV7ILWP
Epic Games today launched Unreal Engine 4.20, enabling developers to build the most realistic characters and immersive environments across games, film and TV, VR/AR/MR and enterprise applications.
Unreal Engine 4.20 combines the latest real-time rendering advancements with improved creative tools, making it even easier to ship blockbuster games across all platforms. With hundreds of optimizations, especially for iOS, Android and Nintendo Switch - which have been built for Fortnite and are now rolled into Unreal Engine 4.20 and released to all users - Epic is delivering on its promises to give developers the scalable tools they need to succeed.
Artists working in visual effects, animation, broadcast, and virtual production can also take advantage of the latest enhancements for digital humans, VFX, cinematic depth of field and more to create stunningly sophisticated images across all forms of media and entertainment.
In the enterprise space, Unreal Studio 4.20 includes upgrades to the UE4 Datasmith plugin suite, such as SketchUp support, which make it even easier to get CAD data prepped, imported and working beautifully in Unreal Engine. These improvements are driving photorealistic real-time visualization across automotive design, architectural design, manufacturing, and more.
Key features within Unreal Engine 4.20 include:
New Proxy LOD System: Handle sprawling worlds with ease via UE4's production-ready Proxy LOD system for the easy reduction of rendering cost due to poly count, draw calls and material complexity. Proxy LOD yields huge gains when developing for mobile and console platforms.
Smoother Mobile Experience: Well over 100 mobile optimizations developed firsthand for Fortnite come to all 4.20 users, marking a major shift for easy shippability and seamless gameplay optimization across platforms. Major enhancements include improved Android debugging, mobile Landscape improvements, RHI thread on Android and occlusion queries on mobile.
Better for Switch: Epic has significantly improved Nintendo Switch development by releasing tons of performance and memory improvements built for Fortnite on Nintendo Switch to 4.20 users as well.
Niagara VFX (early access): Unreal Engine's new programmable VFX editor, Niagara, is now available in early access and will help developers take their VFX to the next level. This new suite of tools is built from the ground up to give artists unprecedented control over particle simulation, rendering and performance, for more sophisticated visuals. This tool will eventually replace the Unreal Cascade particle editor. Watch this GDC talk to learn more.
Cinematic Depth of Field: Unreal Engine 4.20 delivers tools for achieving depth of field at true cinematic quality in any scene. This brand new implementation replaces the Circle DOF method. It's faster, cleaner and provides a cinematic appearance through the use of a procedural bokeh simulation. Cinematic DOF also supports alpha channel, dynamic resolution stability, and has multiple settings for scaling up or down on console platforms based on project requirements. This feature debuted at the Game Developers Conference as part of the Star Wars "Reflections" demo by Epic, ILMxLAB, and NVIDIA.
Digital Humans Improvements: In-engine tools now include dual lobe specular/double Beckman specular models, backscatter transmission in lights, boundary bleed color subsurface scattering, iris normal slot for eyes, and screen space irradiance to build the most cutting-edge digital humans in games and beyond. All users now have access to the same tools used on Epic's "Siren" and "Digital Andy Serkis" demos shown at this year's Game Developers Conference. Check out the updated Photorealistic Human sample, which has been updated to include the static bust from "Meet Mike," revealed at SIGGRAPH 2017.
Proven, Exclusive Cross-Platform Capabilities: Developers only have to create their game once when building with Unreal Engine. Use the same content and gameplay across any device to deliver experiences anywhere players want to enjoy them.
Live Record & Replay: First shown at this year's Game Developers Conference, all developers now have access to code from Epic's Fortnite Replay system. Content creators can easily use footage of recorded gameplay sessions to create incredible replay videos.
Sequencer Cinematic Updates: Creating real-time cinematic content has never been easier with Sequencer. New features include frame accuracy, media tracking, curve editor/evaluation, anim instance control, and Final Cut Pro 7 XML import/export.
Shotgun Integration: Shotgun, one of the industry's leading production management and asset tracking solutions, is now supported in 4.20. This will streamline workflows for Shotgun users in game development who are leveraging Unreal's real-time performance. Shotgun users can assign tasks to specific assets within Unreal Engine, making the latest content and associated data easily available for review.
Improved Support for Live Broadcast: UE 4.20 now offers greater support for users leveraging the engine in broadcast, from virtual sets to virtual production to eSports. UE 4.20 includes support for AJA Video Systems' KONA 4 and Corvid 44 video cards with a plugin for HD/SDI video and audio input and output, enabling seamless integration of AR and CG graphics in live broadcast transmissions.
Mixed Reality Capture Support (early access): Users with virtual production workflows now have mixed reality capture support that includes video input, calibration, and in-game compositing. Supported webcams and HDMI capture devices enable users to pull real world green screened video into the engine, and supported tracking devices can match your camera location to the in-game camera for more dynamic shots.
Robust AR Support: Unreal Engine 4.20 ships with native support for ARKit 2, which includes features for creating shared, collaborative AR experiences. Also included is the latest support for Magic Leap One and Google ARCore 1.2. Be sure to check out the new FaceAR sample project that is now available with this release.
In addition, key improvements to Unreal Studio 4.20 include:
Datasmith for SketchUp: Unreal Studio's Datasmith workflow toolkit for streamlining transfer of CAD data into Unreal Engine now supports SketchUp. Save hours by automatically converting SketchUp cameras, materials, and geometry into UE assets. The plugin also includes support for layers, groups, and components, as well as units, scale, pivot points, and instances. SketchUp materials are automatically converted to matching Unreal Engine materials for greater simplicity.
Metadata Control: Import metadata from 3ds Max, SketchUp, and other common CAD tools for the opportunity to batch process objects by property, or expose metadata via scripts. Metadata enables more creative uses of Unreal Studio, such as Python script commands for updating all meshes of a certain type, or displaying relevant information in interactive experiences.
Mesh Editing Tools: Unreal Engine now includes a basic mesh editing toolset for quick, simple fixes to imported geometry without having to fix them in the source package and re-import. These tools are ideal for simple touch-ups without having to go to another application. Datasmith also now includes a base Python script that can generate Level Of Detail (LOD) meshes automatically.
Non-Destructive Re-Import: Achieve faster iteration through the new parameter tracking system, which monitors updates in both the source data and Unreal Editor, and only imports changed elements. Previous changes to the scene within Unreal Editor are retained and reapplied when source data updates.
For a full features list and additional details, please visit: https://www.unrealengine.com/en-US/blog/unreal-engine-4-20-released