At what codec settings? IMO h265 and AV1 are extremely close, depending on individual, and still experimental, codec tweaks.
MP3 coding was near garbage at 128kbps, but over the years the codec community managed to make 128kbps sound significantly better without changing the format.
Next-gen h265 encoders will likely copy whatever non-bitstream optimisations they can from AV1 encoders, and they will be a near match again.
AV1 beats h265 by some 10-15% and it's still improving. h265 has had plenty of time to mature. I don't think there's much headroom left for h265, and anyway JVET/MPEG has moved on to VVC, which beats AV1 by some 25-29% and is still improving. They know they'll lose the short-term war and are looking longer term.
AV1 was supposed to be "on par" with h265 but has jumped ahead by a bit and will likely improve for quite a while until AV2 is worked on. Will it reach VVC levels? Probably not. But it will certainly be better than h265/HEVC.
Surprised to see Intel using "Arc" as a trademark when ARC International has been around for ages and was bought by Synopsys in 2010, which still offers ARC processors. I thought trademarks in the same domain weren't allowed?
I'm double surprised! I really thought AV1 was gonna take a long time to encode. I wonder if it can do the same as before... can it do 4k60? I dunno what existing GPUs can do tbh.
Also surprised the efficiency isn't that wild a leap: 50% over h264, 20% over h265. Maybe this is part of the tradeoff of encoding in realtime, versus offline and in multiple passes. I feel like offline encoding has way bigger gains, but I dunno!
> Hopefully Apple pushes this over HEVC/VVC garbage in their next release.
I think AV1 is neat, but calling HEVC "garbage" seems harsh considering the enormous leap H.265 represents over H.264. Even if AV1 is ~20% more efficient than HEVC, lots of "better" codecs have failed to become mainstream.
VVC is great, but AV1 is fast enough for realtime use, while VVC is not. Cisco has had it working for some time. AV1 to me seems like basically a zero-tradeoff solution vs older codecs, so it's a good "new standard" to build around while thinking about what comes next.
> Oh I like HEVC in theory, but its patent encumbrance/licensing has delayed its rollout immeasurably.
Has it?
"There's not much to report about HEVC, which remains king of the living room, particularly as it relates to High Dynamic Range (HDR) content. Although there's been lots of chatter about AV1 and other codec support in Dolby Vision, HEVC remains the only 10-bit codec supported in a Dolby Vision profile."[*]
Although HEVC is supported on iOS and Android and has universal support in smart TVs and OTT devices, it doesn't have built-in support on Chrome or Firefox. "The lack of browser support for HEVC doesn't hinder premium content distributors, which can use apps with HEVC playback on most platforms, but it keeps HEVC unusable for publishers targeting broad, browser-based playback."[*]
AV1 will have the opposite problem: Early adoption in Chrome and Firefox, but best case we're looking at 2025 until AV1 sees HEVC-like adoption in the living room, and maybe never on iOS depending on whether Apple adopts AV1 or leapfrogs it with VVC.
Uh? Shipping a phone with Safari built in, with high performance and support for things like WebAssembly and WebGPU all the way down to the metal, is certainly first class.
The short version is that Apple overly focused on monetising their app store, doing everything they could conceivably get away with to push people towards it.
If you wanted to release an app for iOS users you needed to purchase Apple hardware, and a yearly subscription just to start developing an app. Then to release the app you had to follow Apple's rules which allowed them to take a 30% cut of all profits generated with your app. If Apple really liked your app, they could just make their own version and kick yours off the app store without recourse so they could keep the other 70% of the profits.
Meanwhile, users and developers could skirt around the app store by using Safari. This represented a direct threat to the largest revenue generating model that Apple had ever created. Apple needed to make sure that their ad campaign: "There's an app for that" always held true.
So Apple did what it could to indirectly reduce the effectiveness of Safari, keeping it behind what other browsers could do. They even made it impossible for other browsers to be installed on their devices by adding rules to the app store that mandated that all web apps had to be powered by Safari's web view. In essence, all "web browsers" on iOS are just reskins of Safari, warts and all.
The title of the article turned into a common saying amongst web developers, an accurate and concise summary of what was (and mostly still is) seen as a cancer on the industry: "Safari is the new IE".
Apple consistently kept their browser years behind competing browsers, either holding back features, bringing out features that didn't address problems developers needed fixed, or bringing out features that were "broken as intended".
This continued until their anticompetitive behaviour ended up the subject of several lawsuits. Apple's lawyers were traditionally able to bat away the many smaller claims levied against them, until Epic could pay for enough lawyers of its own to try and make a dent.
Regardless of the legal outcome of the lawsuit, it wound up shining a spotlight for several governments to pay more attention to their fair competition laws.
Now Apple is worried about losing their App Store monopoly on their own devices and has put a renewed focus on their web browser to try and keep people using their own products.
Until about 2 years ago, Apple had been content to keep Safari consistently years behind competing browsers. Since then they have accelerated their development, trying to catch up with their betters.
>It's probably a bit early to complain about Apple.
Why? Apple mints new silicon of its own design every year like Swiss clockwork. Google, on the other hand, depends on the likes of Qualcomm, MediaTek and Samsung for silicon, and on whenever they decide to add AV1 IP blocks, leading to a chicken-and-egg situation.
Google could probably force their hand by switching to AV1 before HW support arrives, but then YouTube UX will suck on all smartphones, leading to shit battery life and overheating.
Plus, video codec IP blocks are always a hot potato for the SoC vendors as they're tied in various third party licensing and patent schemes they don't like to deal with.
> Google could probably force their hand by switching to AV1 before HW support arrives, but then YouTube UX will suck on all smartphones, leading to shit battery life and overheating.
Google could also face yet more antitrust action if they continue to use their Youtube monopoly as a weapon against competition as they previously have with Microsoft's Windows Phone and Amazon's Echo Show.
Yeah but given the lead times on Apple silicon, I would expect that if they hadn’t already made the decision to support AV1, then we’re not going to see it for a few years.
Conversely if they made the decision a few years ago, we might start to see support trickle out across their lines starting this year.
They have hardware acceleration for encoding and decoding HEVC in all their products. Why would they promote a new codec without hardware acceleration?
I'm new to codecs, coming from frontend webdev (mostly JS). Is anyone able to point me to some resources on how to detect, via CLI or programmatically, whether a hardware codec is installed, and whether it's currently being utilized by a web video player?
The two ways I recently learned only list codecs: on Windows via the legacy media player, and on Linux via ffmpeg.
And I'm still confused about how video players talk to codecs, why they're installed separately, and how code can interface with them if they're installed separately.
If you're publishing videos on your own website, making sure there's an mp4 (h264) and maybe a webm (vp8/vp9) source inside a `<video>` element is good enough.
You can probably add an av1 source in 2-3 years.
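For that baseline setup, a minimal sketch (filenames and codec strings here are illustrative, not from any real site): the browser tries the `<source>` children in order and plays the first type it supports.

```html
<!-- Browser walks sources top-to-bottom and picks the first playable type. -->
<video controls width="640">
  <source src="clip.webm" type='video/webm; codecs="vp9"'>
  <source src="clip.mp4"  type='video/mp4; codecs="avc1.42E01E"'>
  Sorry, your browser doesn't support embedded video.
</video>
```

The order matters: put the more efficient codec first, with h264 mp4 last as the universal fallback. An AV1 source would just be another `<source>` line above the others once support is broad enough.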
If you really need to pull things out of a video, you can either read the bytes directly from a blob, or render the video to a canvas and pull the pixel data out of there. Don't worry, there is a 0.00001% chance of you needing to do this.
You will likely never touch anything like a codec from JS ever.
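That said, for the detection question above: modern browsers do expose the MediaCapabilities API, whose `powerEfficient` flag is the closest thing you get to "is a hardware decoder likely to be used". A minimal sketch, with the codec string and bitrate as illustrative assumptions; the actual query only runs in a browser, so the query-building part is split out as plain JS:

```javascript
// Build a MediaCapabilities decodingInfo query for AV1 at a given resolution.
// "av01.0.08M.08" is one valid AV1 Main-profile codec string (illustrative);
// the bitrate is an assumed target for the given resolution, not a requirement.
function av1DecodeQuery(width, height, framerate) {
  return {
    type: 'file',
    video: {
      contentType: 'video/mp4; codecs="av01.0.08M.08"',
      width,
      height,
      bitrate: 20000000,
      framerate,
    },
  };
}

// In a browser you would then ask:
//   const info = await navigator.mediaCapabilities.decodingInfo(
//     av1DecodeQuery(3840, 2160, 60));
//   info.supported       // can this be decoded at all?
//   info.smooth          // at full framerate?
//   info.powerEfficient  // usually true when a hardware decoder is in use
```

This is also roughly how players like YouTube's decide which format to request, which is why the same video can arrive as AV1 on one machine and VP9 or h264 on another.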