The laid-off employees return, and Meta rebuilds the metaverse! Publishing a photorealistic image dataset, assembling AR glasses across the globe

**Source:** Xinzhiyuan

**Introduction:** Meta's exploration of the metaverse has not stopped: recalling laid-off employees, releasing an ultra-realistic Unreal-rendered image dataset, assembling AR glasses across factories worldwide...

Image source: Generated by Unbounded AI

Meta's metaverse exploration is still going on.

Recently, Meta developed a new photorealistic synthetic dataset with Unreal Engine, hoping to close the gap between synthetic data and real-world data.

They also plan to make a new pair of AR glasses that will be used only for internal development and public display.

It is reported that manufacturing the glasses involves factories in mainland China, Taiwan, and the United States, because the lenses contain a military-grade material that cannot easily be exported outside the United States.

In addition, Meta has even set up an "ex-employee portal" to gradually rehire previously laid-off employees.

A Realistic Unreal Image Dataset

The Photorealistic Unreal Graphics (PUG) dataset released by Meta enables more controllable and robust evaluation and training of AI vision systems.

Rendered with Unreal Engine, the collection contains hundreds of thousands of images, including more than 200,000 animal images spanning various poses, lighting conditions, and backgrounds, serving as a basis for training and evaluating image models and vision systems.

Because PUG images are rendered with Unreal Engine, they are far more photorealistic than those of earlier synthetic datasets, which greatly broadens their range of uses.

PUG consists of 4 parts:

1. PUG Animal

For studying out-of-distribution generalization and the representation spaces of foundation models, including:

215,040 pre-rendered images covering 70 animal assets, with 4 sizes, 3 textures, and 4 different orientations.

2. PUG ImageNet

The dataset provides a novel and efficient benchmark for fine-grained evaluation of image classifiers' robustness to multiple factors of variation, including:

151 ImageNet classes, 64 backgrounds, 7 sizes, 10 textures, 18 viewing angles, 18 character orientations, and 7 light intensities.

3. PUG SPAR (Scene, Position, Attribute, Relationship)

The dataset is used to evaluate vision-language models, showing how synthetic data can address the limitations of current benchmarks. It contains:

43,560 pre-rendered images, 10 backgrounds, 32 animals, 4 relationships (left/right, bottom/top), 4 attributes (blue/red, grass/stone).

4. PUG AR4T

Provides approximately 250,000 images for fine-tuning vision-language models on spatial relationships and attributes.
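To illustrate how such a pre-rendered factor grid works, the sketch below enumerates the cross-product of the PUG Animal factors listed above (70 animal assets, 4 sizes, 3 textures, 4 orientations). The factor names and values are hypothetical placeholders for illustration, not the dataset's actual metadata schema.

```python
from itertools import product

# Hypothetical factor values mirroring the PUG Animal description above;
# the real dataset's metadata schema may differ.
assets = [f"animal_{i:02d}" for i in range(70)]
sizes = ["tiny", "small", "medium", "large"]   # 4 sizes
textures = ["default", "grass", "asphalt"]     # 3 textures
orientations = [0, 90, 180, 270]               # 4 yaw angles, in degrees

# Every rendered image corresponds to one point in the factor grid,
# so fine-grained annotation labels come for free.
grid = [
    {"asset": a, "size": s, "texture": t, "yaw": o}
    for a, s, t, o in product(assets, sizes, textures, orientations)
]
print(len(grid))  # 70 * 4 * 3 * 4 = 3360 combinations
```

Rendering each such combination against every background is how the full image count is reached; the grid itself doubles as the annotation table.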

Meta used Unreal Engine to create realistic interactive environments from which it can easily sample images to a given specification.

The image below illustrates how Meta uses Unreal Engine and example images to generate the PUG dataset.

Synthetic image datasets offer numerous advantages for designing and evaluating deep neural networks.

A synthetic dataset can render as many samples as needed, precisely control every scene to produce fine-grained annotation labels, and precisely control the distribution shift between training and testing, isolating the variables of interest for well-designed experiments.
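The controlled train/test distribution shift described above can be sketched in a few lines: hold out one value of a single factor at training time and evaluate only on it, so any accuracy drop is attributable to that factor alone. The factor names and values here are illustrative inventions, not PUG's actual API.

```python
from itertools import product

# Hypothetical factor grid for a synthetic dataset (illustrative values only).
factors = {
    "texture": ["default", "grass", "asphalt"],
    "lighting": ["day", "dusk", "night"],
    "pose": [0, 90, 180, 270],
}
samples = [dict(zip(factors, combo)) for combo in product(*factors.values())]

# Controlled distribution shift: training never sees texture == "asphalt",
# and the test set contains only that held-out value, isolating the
# variable of interest.
train = [s for s in samples if s["texture"] != "asphalt"]
test = [s for s in samples if s["texture"] == "asphalt"]
print(len(train), len(test))  # 24 12
```

With real-world data this kind of clean split is nearly impossible, since factors like texture and lighting are entangled in natural photographs.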

However, the biggest problem with existing synthetic datasets is that they are not realistic enough, which severely limits the scope of use of the dataset.

A real image dataset, on the other hand, is subject to privacy and copyright restrictions.

Synthesizing photorealistic image datasets with Unreal Engine solves both problems well.

Meta says the ability to generate data covering a range of domains could make the evaluation and training of vision-language models more reliable than existing benchmarks.

In addition to datasets, researchers can use the PUG environment to create their own, precisely specifying factors such as lighting and viewing angles that are difficult to control with real-world datasets.

How to create your own PUG dataset

Meta has also explained in detail how to use Unreal Engine to build your own PUG dataset.

Instruction manual:

First, download the Epic Games Launcher and install Unreal Engine 5, then create a new Pixel Streaming project.

If you have never used Unreal Engine before, you can download the official demo for a simple introduction.

Then read the official introduction to Unreal Engine 5 to understand the basic mechanics of Blueprints.

After downloading the demo, open the DTCharSelect table file located in the Content/Blueprints/CharacterConfig folder, as shown in the figure below.

This table lists all assets that can be loaded through the Unreal Environment. If you want to add new characters, just create a new entry in the table.

AR glasses may launch next year: military-grade materials, assembled in the United States

Meta also plans to launch its first-generation AR glasses in 2024, producing about 1,000 units.

These 1,000 AR glasses will only be used for internal testing and public display, and will not be released to the public.

Although the numbers are small, Meta has gone to considerable trouble to produce these AR glasses.

The glasses are positioned as an expensive spatial computing device.

Because the lenses involve export-restricted materials, the glasses' handheld controller and wireless computing unit will be produced in mainland China and Taiwan, then shipped to the United States, where they will be assembled with the lenses into the finished product.

The reason is that Meta plans to use a compound called silicon carbide (SiC) as the lens material in the AR glasses, internally code-named "Orion", and the government restricts exporting the compound to other countries.

Compared with conventional glass, silicon carbide can project a wider image through the lens, giving a larger field of view, but it is also more expensive.

The field of view of Orion AR glasses is about 70° diagonally, slightly larger than Magic Leap 2 (66°), and much larger than HoloLens 2 (52°).

Meta's spending so much on a product that won't be released to the public could raise investor concerns about spending by Meta's Reality Labs unit.

The Reality Labs unit mainly develops augmented reality (AR) and virtual reality (VR) products.

Until now, however, the virtual reality market remains small, augmented reality technology is still under development, and practical applications are relatively limited.

Meta has invested heavily in both areas, hoping to establish a leading position in the market.

In both areas, Meta has to compete with a well-funded competitor: Apple and its Vision Pro headset.

Meta has cut costs company-wide in light of slowing revenue growth and heightened investor pressure, but Meta's total investment in the Reality Labs unit is still growing.

The unit has lost $21 billion over the past 18 months.

Meta's decision to only ship the first generation of AR glasses as an in-house product is itself a cost-cutting move.

In addition, selecting factories in China is also part of the cost-cutting effort.

Assembly workers hired in the United States earn hourly wages ranging from $16.75 to $28.27, much higher than production costs in mainland China and Taiwan.

Likewise, other U.S. tech companies that make hardware, such as Apple, also make most of their products in China.

Even so, the cost of the first generation of AR glasses is still quite high because the lenses have to be produced and assembled in the United States.

Re-recruiting laid-off employees

In recent weeks, Meta has slowly started to pick up the pace of hiring, especially for engineering and technology roles.

Since November, employees laid off by Meta have been able to reapply for open positions through a dedicated "ex-employee portal".

Hundreds of jobs are open right now, primarily software, hardware, and AR/VR positions, along with some key technical roles in infrastructure and data centers.

A person familiar with the matter said that operations positions do not appear to be open, and the listed roles all carry specific output requirements, because Meta has reduced the number of manager-level positions overall.

Meta's recruitment mainly targets employees with substantial work experience, reducing the hiring of fresh graduates and interns.

The higher the level and the better the performance evaluation of the laid-off engineer, the higher the probability of being rehired.

Many rehired employees will be placed in new roles at lower levels and salaries than before.

One staffer rehired by Meta said he took a pay cut of about 10%, but given that Meta's share price has been rising lately, he expects his compensation to recover within a year.

Zuckerberg, having tasted the benefits of cost cutting and efficiency gains, seems to remain firmly on the road to the metaverse.
