VFX creation in the Cloud
Our client wanted to reduce their onsite hardware expenditure and migrate large portions of their content creation workflow to the cloud. After an in-depth review of their business processes, it was agreed that virtualising their visual effects applications would deliver the biggest cost saving across the company. Typically, VFX work requires workstations loaded with expensive graphics processing units (GPUs) and central processing units (CPUs), which can cost anywhere between $8,000 and $14,000 each, as well as large amounts of on-premise storage. The VFX applications are linked to on-premise render farms, which are again costly to purchase, house and maintain.
As the client was already a significant user of the Azure platform, it was necessary to ensure that all solutions were designed natively on Azure.
To ensure frictionless adoption of any proposed solution, it was important that we spent time with the end users to understand their requirements and build a solution that, first and foremost, made their lives easier. After several workshops focused on end-user requirements, it was clear that the solution needed to be responsive, scalable and capable of shortening the time it took to render content.
Working within the Azure platform, we architected a Virtual Network consisting of Azure Premium Files storage and NV12 GPU-enabled Virtual Workstations. Premium Files was chosen for its read and write speed of 200 Mbps. NV12 virtual machines were chosen for their increased performance with GPU-enabled applications such as Maya. A direct integration of Maya with Azure Batch allowed artists to quickly render content on a cloud-based render farm of 1,000 low-priority VMs. These low-priority nodes are 80% cheaper, meaning it was quick and affordable to render media for review and approval.
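The economics of low-priority nodes can be shown with a quick back-of-the-envelope calculation. The hourly rate below is a hypothetical figure chosen purely for illustration; only the 80% discount comes from the scenario above.

```python
# Back-of-the-envelope render farm cost comparison.
# The $1.00/hour rate is hypothetical; the 80% low-priority
# discount is the figure quoted in the text.

def farm_cost(nodes, hours, rate_per_hour, low_priority=False):
    """Total cost of a render farm run."""
    discount = 0.8 if low_priority else 0.0
    return nodes * hours * rate_per_hour * (1 - discount)

# A 1,000-node farm running a two-hour render job
dedicated = farm_cost(1000, 2, 1.00)
low_pri = farm_cost(1000, 2, 1.00, low_priority=True)
print(f"dedicated: ${dedicated:.2f}, low-priority: ${low_pri:.2f}")
```

At that assumed rate, the same two-hour, 1,000-node render drops to a fifth of the dedicated-node price, which is what made frequent review-and-approval renders affordable.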
To ensure a seamless end-user experience, the cloud environment needed to be linked to the client's Active Directory. This meant that when an artist logged into a VM, their settings, preferences and plug-ins were already available.
The output of the VFX artist's Virtual Workstation is transmitted back on-premise via Teradici. Teradici PCoIP was chosen for its ability to output a pixel-for-pixel representation, in the correct colour space, of the Virtual Workstation to an on-premise monitor via a Dell Wyse Zero Client. Teradici's Cloud Access Manager was chosen to act as the cloud broker, managing cloud compute costs and brokering PCoIP connections to remote Windows or Linux workstations. Teradici's protocol allowed artists to use Wacom tablets with no perceptible latency.
Whilst it was important to work closely with end-users to ensure the project was a success, it was also key to architect a solution that worked on a global level. Using Terraform templates, Support Partners created a framework for the quick deployment of Virtual Workstations globally – enabling collaboration, utilisation of a remote workforce and greater cost control.
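A deployment framework of that kind might look something like the fragment below. This is an illustrative sketch only: the resource names, region variable, VM count and size are placeholders, not the actual templates used, and most required VM arguments are omitted.

```hcl
# Illustrative Terraform sketch — names and values are placeholders.
variable "region" {
  description = "Azure region for this studio location"
  default     = "westeurope"
}

resource "azurerm_resource_group" "vfx" {
  name     = "rg-vfx-workstations-${var.region}"
  location = var.region
}

resource "azurerm_windows_virtual_machine" "workstation" {
  count               = 10
  name                = "vfx-ws-${count.index}"
  resource_group_name = azurerm_resource_group.vfx.name
  location            = azurerm_resource_group.vfx.location
  size                = "Standard_NV12"
  # admin credentials, network interface and source image
  # configuration omitted for brevity
}
```

Parameterising the region this way is what allows the same template to stamp out identical workstation pools for each global studio site.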
Azure Batch is a cloud-based job scheduling and compute management platform that enables large-scale parallel and high-performance computing applications to run efficiently in the cloud.
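Conceptually, a Batch render job is an embarrassingly parallel fan-out of per-frame tasks across a pool of nodes. The sketch below imitates that pattern locally with Python's standard library; the frame count, pool size and render function are hypothetical stand-ins, not the Azure Batch API.

```python
# Local imitation of the Azure Batch fan-out pattern using only
# the standard library; this is NOT the azure-batch SDK.
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame):
    """Stand-in for a per-frame Maya render task."""
    return f"frame_{frame:04d}.exr"

def run_job(frame_count, pool_size):
    """Fan per-frame tasks out across a fixed-size 'pool' of workers."""
    with ThreadPoolExecutor(max_workers=pool_size) as pool:
        return list(pool.map(render_frame, range(frame_count)))

outputs = run_job(frame_count=240, pool_size=16)
print(outputs[0], "...", outputs[-1])
```

Because each frame is independent, the job scales almost linearly with pool size, which is why a 1,000-node low-priority pool can collapse render times so dramatically.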
Azure Nv12 Virtual Machines
The NV-series virtual machines are powered by NVIDIA Tesla M60 GPUs and NVIDIA GRID technology for desktop-accelerated applications and virtual desktops. Users are able to run single-precision workloads such as encoding and rendering.
Autodesk Maya, commonly shortened to just Maya, is a 3D computer graphics application that runs on Windows, macOS and Linux.
PCoIP is a UDP-based protocol that is host-rendered, multi-codec and dynamically adaptive. Images rendered on the server are captured as pixels, compressed and encrypted, then sent to the client for decryption and decompression.
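The host-rendered pipeline described above can be sketched in a few lines. Here zlib stands in for PCoIP's codecs and a toy XOR keystream stands in for its real encryption, so this illustrates only the shape of the pipeline, not the actual protocol.

```python
# Toy illustration of a host-rendered remoting pipeline:
# capture -> compress -> encrypt -> (network) -> decrypt -> decompress.
# zlib stands in for PCoIP's codecs; the XOR keystream is a toy
# stand-in for real encryption and offers no security.
import zlib

KEY = bytes(range(256))  # toy keystream, NOT real cryptography

def xor(data: bytes) -> bytes:
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(data))

def host_encode(pixels: bytes) -> bytes:
    """Host side: compress the captured pixels, then encrypt."""
    return xor(zlib.compress(pixels))

def client_decode(payload: bytes) -> bytes:
    """Client side: decrypt, then decompress back to pixels."""
    return zlib.decompress(xor(payload))

frame = bytes([30, 30, 30]) * 1920 * 1080  # a flat grey 1080p frame
wire = host_encode(frame)
assert client_decode(wire) == frame  # pixel-for-pixel round trip
print(f"frame: {len(frame)} bytes -> {len(wire)} bytes on the wire")
```

The round-trip assertion is the property the text highlights: the client reconstructs exactly the pixels the host rendered, which is why artists see a pixel-for-pixel image in the correct colour space.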