Intel becomes the first to offer full AV1 support with its new Arc GPUs (neowin.net)
88 points by clouddrover on April 3, 2022 | 43 comments


Also, software decoding of AV1 is already fast enough, so what game streaming people specifically need is the AV1 hardware encoder: https://openbenchmarking.org/test/pts/dav1d#results


> While AMD and Nvidia both offer AV1 decoding with their newest GPUs, neither have support for AV1 encoding.

So this announcement is specifically about encoding.


> 20% over h265

On what codec settings? IMO h265 and AV1 are extremely close, depending on individual, and still experimental, codec tweaks.

MP3 encoding was near garbage at 128kbps, but over the years the codec community managed to make 128kbps sound significantly better without changing the format.

Next-gen h265 encoders will likely copy whatever non-bitstream optimisations they can from AV1 encoders, and they will be a near match again.


AV1 beats h265 by some 10-15% and it's still improving. h265 has had plenty of time to mature; I don't think there's much headroom left for it, and anyway JVET/MPEG has moved on to VVC, which beats AV1 by some 25-29% and is itself still improving. They know they'll lose the short-term war and are looking longer term.

AV1 was supposed to be "on par" with h265 but has jumped ahead by a bit and will likely keep improving for quite a while until AV2 is worked on. Will it reach VVC levels? Probably not. But it will certainly be better than h265/HEVC.

Source for size comparisons: https://www.cambridge.org/core/journals/apsipa-transactions-...


VVC honestly astounds me. It makes me wonder how far we can go with video compression.

I look back at MPEG-1 Video CDs and it is unbelievable how much better a video image you can now produce in the same 600 MB of storage.


>> Next-gen h265 encoders will likely copy whatever non-bitstream optimisations they can from AV1 encoders, and they will be a near match again.

Not only is that speculation, but more importantly they will not be copying the licensing terms of AV1.


Surprised to see Intel using "Arc" as a trademark when ARC International has been around for ages and was bought by Synopsys in 2010, which still offers ARC processors. I thought trademarks in the same domain weren't allowed?


Even stranger is that Intel ARK has been their spec repository for a long time. https://ark.intel.com/content/www/us/en/ark.html


The AV1 section of the Intel Arc Graphics announcement video:

https://youtu.be/q25yaUE4XH8?t=598


I'm doubly surprised! I really thought AV1 was going to take a long time to encode. I wonder if it can do the same as before... can it do 4k60? I dunno what existing GPUs can do, tbh.

Also surprised the efficiency isn't that wild a leap: 50% over h264, 20% over h265. Maybe this is part of the tradeoff when trying to encode in realtime, versus offline and in multiple passes. I feel like for offline encoding there are way bigger gains, but I dunno!


Finally! Given the large overall percentage of web bandwidth that’s video this will really help.

Hopefully Apple pushes this over HEVC/VVC garbage in their next release.


> Hopefully Apple pushes this over HEVC/VVC garbage in their next release.

I think AV1 is neat, but calling HEVC "garbage" seems harsh considering the enormous leap H.265 represents over H.264. Even if AV1 is ~20% more efficient than HEVC, lots of "better" codecs have failed to become mainstream.

Because VVC appears to trounce* AV1, Apple may just skip AV1. (*https://www.winxdvd.com/video-transcoder/h266-vvc-vs-av1.htm....)

> …this will really help.

An Intel-made laptop GPU is not going to be the thing that makes or breaks AV1.


VVC is great, but AV1 is fast enough for realtime use, while VVC is not. Cisco has had it working for some time. AV1 to me seems like basically a zero-tradeoff solution vs. older codecs, so it's a good "new standard" to build around while thinking about what comes next.

https://blog.webex.com/video-conferencing/cisco-leap-frogs-h...


> VVC is great, but AV1 is fast enough for realtime use, while VVC is not.

Today, in software. But yes, definitely worth considering!


>I think AV1 is neat, but calling HEVC "garbage" seems harsh

That is what the AOM camp has been doing since 2015.


Oh, I like HEVC in theory, but its patent encumbrance/licensing has delayed its rollout immeasurably.

These realities far outweigh the technical benefits IMO.


> Oh, I like HEVC in theory, but its patent encumbrance/licensing has delayed its rollout immeasurably.

Has it?

"There's not much to report about HEVC, which remains king of the living room, particularly as it relates to High Dynamic Range (HDR) content. Although there's been lots of chatter about AV1 and other codec support in Dolby Vision, HEVC remains the only 10-bit codec supported in a Dolby Vision profile."[*]

Although HEVC is supported on iOS and Android and has universal support in smart TVs and OTT devices, it doesn't have built-in support on Chrome or Firefox. "The lack of browser support for HEVC doesn't hinder premium content distributors, which can use apps with HEVC playback on most platforms, but it keeps HEVC unusable for publishers targeting broad, browser-based playback."[*]

AV1 will have the opposite problem: Early adoption in Chrome and Firefox, but best case we're looking at 2025 until AV1 sees HEVC-like adoption in the living room, and maybe never on iOS depending on whether Apple adopts AV1 or leapfrogs it with VVC.

*https://www.streamingmediaglobal.com/Articles/Editorial/Feat...


Apple is a stakeholder in HEVC patents so I would expect them to fight to the bitter end against non-Apple invented technologies.


Apple is also a member of the alliance making AV1, the Alliance for Open Media.


Apple's history of membership in AOM is complicated™.

They joined late and became a founding member retroactively, but I wouldn't use that as a very strong signal.

Apple is never the first to support open standards.


HTML5? CalDAV/CardDAV (their spec, though)? USB-C? Avahi/mDNS? They do sometimes get a standards boner; it's just very specific and strategic.


Yet HTML5 isn't a first class citizen on Apple's most popular operating system, nor is USB-C even on Apple's current best selling product.


Uh? Shipping a phone with Safari built in, with high performance and support for things like WebAssembly and WebGPU all the way down to the metal, is certainly first class.


Safari and first class shouldn't really be used in the same sentence.


Well, if you don't like it you can just change the default browser.


You can't on iOS. All browsers (including Firefox!) on iOS are wrappers around Safari, which is lagging behind on many HTML5 features


This thread is about first class HTML5 support. Why is Safari so bad btw?


Well it's kind of a long story.

The short version is that Apple overly focused on monetising their app store and doing everything conceivable they could get away with to push people towards it.

If you wanted to release an app for iOS users, you needed to purchase Apple hardware and a yearly subscription just to start developing an app. Then, to release the app, you had to follow Apple's rules, which allowed them to take a 30% cut of all profits generated with your app. If Apple really liked your app, they could just make their own version and kick yours off the App Store without recourse, so they could keep the other 70% of the profits for themselves.

Meanwhile, users and developers could skirt around the App Store by using Safari. This represented a direct threat to the largest revenue-generating model that Apple had ever created. Apple needed to make sure that their ad campaign, "There's an app for that", always held true.

So Apple did what it could to indirectly reduce the effectiveness of Safari, keeping it behind what other browsers could do. They even made it impossible for other browsers to be installed on their devices by adding rules to the app store that mandated that all web apps had to be powered by Safari's web view. In essence, all "web browsers" on iOS are just reskins of Safari, warts and all.

Eventually this culminated in an article that basically became a lightning rod for developers dealing with Apple's anti-competitive behaviour: https://nolanlawson.com/2015/06/30/safari-is-the-new-ie/

The title of the article turned into a common saying amongst web developers, an accurate and concise summary of what was (and mostly still is) seen as a cancer on the industry: "Safari is the new IE".

Apple consistently kept their browser years behind competing browsers, either holding back features, bringing out features that didn't address problems developers needed fixed, or bringing out features that were "broken as intended".

This continued until their anticompetitive behaviour ended up as the subject of several lawsuits. Apple's lawyers were traditionally able to bat away the many smaller claims levied against them, until Epic could pay for enough lawyers to try to make a dent.

Regardless of the legal outcome of that lawsuit, it shone a spotlight that led several governments to pay more attention to their fair-competition laws.

You can see some recent news about this here: https://www.macrumors.com/2022/03/25/eu-provisionally-agrees...

Now Apple is worried about losing their App Store monopoly on their own devices and has put a renewed focus on their web browser to try and keep people using their own products.

Until about 2 years ago, Apple had been content to keep Safari consistently years behind competing browsers. Since then they have accelerated their development, trying to catch up with their betters.


Ah yes, that Apple that at some point even adopted Jabber. Doesn't really exist anymore...


> avahi/mdns?

Wasn't this more like Apple getting their tech standardized?


Google's own devices don't have hardware support for AV1 playback yet.

It's probably a bit early to complain about Apple.


>It's probably a bit early to complain about Apple.

Why? Apple mints new silicon of its own design every year like Swiss clockwork. Google, on the other hand, depends on the likes of Qualcomm, MediaTek and Samsung for silicon, and on whenever they decide to add AV1 IP blocks, leading to a chicken-and-egg situation.

Google could probably force their hand by switching to AV1 before HW support arrives, but then the YouTube UX would suck on all smartphones, leading to shit battery life and overheating.

Plus, video codec IP blocks are always a hot potato for SoC vendors, as they're tied up in various third-party licensing and patent schemes they don't like to deal with.


> Google could probably force their hand by switching to AV1 before HW support arrives, but then the YouTube UX would suck on all smartphones, leading to shit battery life and overheating.

Google could also face yet more antitrust action if they continue to use their Youtube monopoly as a weapon against competition as they previously have with Microsoft's Windows Phone and Amazon's Echo Show.


AV1 isn't just Google; the standard is free for anyone to use.

Google may be guilty of some things, but pushing AV1 being an antitrust violation is not one of them.


Google has certainly used its YouTube monopoly against Windows Phone and Amazon's Echo Show and Fire TV in the past.

Which makes it unwise to keep talking about how Google can use their YouTube monopoly again to force its competitors' hands on anything.


Yeah but given the lead times on Apple silicon, I would expect that if they hadn’t already made the decision to support AV1, then we’re not going to see it for a few years.

Conversely if they made the decision a few years ago, we might start to see support trickle out across their lines starting this year.


With VP9 Apple added decode to their custom hardware and then didn't tell anyone until they started to support it in software a few years later.


> Google's own devices don't have hardware support for AV1 playback yet.

The Tensor SoC in the Pixel 6 supports AV1 decoding:

https://www.anandtech.com/show/17032/tensor-soc-performance-...


They have hardware acceleration for encoding and decoding HEVC in all their products. Why would they promote a new codec without hardware acceleration?


I'm new to codecs, coming from frontend webdev (mostly JS). Is anyone able to point me to some resources on how to detect, via CLI or programmatically, whether a hardware codec is installed, and whether it's currently being used by a web video player?

The two ways I recently learned only list the codecs:

For Windows, via the legacy Media Player.

For Linux, via ffmpeg.

And I'm still confused about how video players talk to codecs, why they are installed separately, and how code can interface with them if they're installed separately.
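(For the browser half of this question, one way to probe decode support from JS is the MediaCapabilities API. Below is a minimal TypeScript sketch; the AV1 codec string, resolution, bitrate and framerate are just example values, and `powerEfficient` is only a hint that a hardware decoder is probably in use, not a definitive answer. Note also that browsers generally ship or talk to their own decoders rather than whatever codec packs are installed system-wide, which is why OS-level codec lists don't necessarily match what a web player can use.)

```typescript
// Minimal sketch: probe AV1 decode support from browser TypeScript via the
// MediaCapabilities API. All media parameters below are example values.
async function probeAv1Decode(): Promise<void> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "file", // a plain progressive file, as opposed to MSE ("media-source") or WebRTC
    video: {
      contentType: 'video/mp4; codecs="av01.0.05M.08"', // example AV1 codec string
      width: 1920,
      height: 1080,
      bitrate: 4_000_000, // bits per second
      framerate: 30,
    },
  });
  // "powerEfficient" is only a hint that decoding is likely hardware-accelerated;
  // browsers do not expose a direct "which decoder chip is in use" API.
  console.log(info.supported, info.smooth, info.powerEfficient);
}

probeAv1Decode().catch(console.error);
```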


If you're publishing videos on your own website, making sure there's an mp4 (h264) and maybe a webm (vp8/vp9) source inside a `<video>` element is good enough.

You can probably add an av1 source in 2-3 years.
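(A minimal TypeScript/DOM sketch of that fallback setup, assuming you host the renditions yourself; the file names and codec strings are placeholders, and the AV1 source is only offered when the browser reports it can play it:)

```typescript
// Minimal sketch: a <video> element with h264 and vp9 fallbacks, plus an AV1
// source that is only added when the browser claims support. File names and
// codec strings are placeholder values.
const video = document.createElement("video");
video.controls = true;

// The browser walks <source> children in order and picks the first it can play,
// so put the most efficient codec first.
const av1Type = 'video/mp4; codecs="av01.0.05M.08"';
if (video.canPlayType(av1Type) !== "") {
  const av1 = document.createElement("source");
  av1.src = "clip-av1.mp4";
  av1.type = av1Type;
  video.appendChild(av1);
}

const webm = document.createElement("source");
webm.src = "clip.webm"; // vp9
webm.type = 'video/webm; codecs="vp9"';
video.appendChild(webm);

const mp4 = document.createElement("source");
mp4.src = "clip.mp4"; // h264, plays nearly everywhere
mp4.type = 'video/mp4; codecs="avc1.42E01E"';
video.appendChild(mp4);

document.body.appendChild(video);
```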

If you really need to pull things out of a video, you can either read the bytes directly from a blob, or render the video to a canvas and pull the pixel data out of there. Don't worry, there is a 0.00001% chance of you needing to do this.

You will likely never touch anything like a codec from JS ever.


Built in Puerto Rico fabs if I’m not mistaken


No, I don't think TSMC N6 is made in Puerto Rico.




