this post was submitted on 29 Feb 2024
404 points (99.3% liked)

For three years there has been a bug report around 4K@120Hz being unavailable via HDMI 2.1 on the AMD Linux driver.

The wait continues...

[–] ipacialsection@startrek.website 239 points 8 months ago (3 children)

This really bothers me. Closed standards locked behind a licensing fee may as well not be standards at all, in my opinion.

[–] turbowafflz@lemmy.world 70 points 8 months ago (7 children)

I don't understand why any hardware uses HDMI anymore anyway, what does it have that displayport doesn't?

[–] Dudewitbow@lemmy.zip 69 points 8 months ago* (last edited 8 months ago) (2 children)

The HDMI Forum is made up of the companies that own the home theatre ecosystem (mainly movie companies and television makers), who put DRM on HDMI to make it harder to illegally copy content like movies, so they will always be anti open source because that's what streaming services and movie businesses demand. It's why, for example, mobile devices have Widevine levels: those levels basically determine how "unlocked" the device is, and services will refuse to offer full functionality to unlocked devices because of it, be it audio or video.

Members of VESA, who control the DisplayPort standard, are generally computer companies and mostly not in the media business, so they value specs over DRM when making changes. One example use case: DisplayPort allows for daisy-chaining displays.

[–] nivenkos@lemmy.world 36 points 8 months ago (2 children)

The DRM is so stupid - now, in the era of streaming, you can get literally anything webripped day 1.

DRM is obsolete (and it never really worked in the first place, tbh).

[–] smileyhead@discuss.tchncs.de 39 points 8 months ago

DRM is not to stop pirates, but to show investors and licence holders you are trying to stop pirates.

[–] Dudewitbow@lemmy.zip 13 points 8 months ago (2 children)

its the attempt that matters more to investors than the pirates. its why a shit ton of games have Denuvo, even if the version of Denuvo they use is cracked already. its not there for the end user, its there for the investors, to show they are at least attempting to fight off piracy.

[–] leopold@lemmy.kde.social 5 points 8 months ago (1 children)

Denuvo is actually very effective relatively speaking. Several popular games that use it have never been cracked. They haven't made it impossible, just sufficiently difficult and tedious that no one wants to bother.

[–] Dudewitbow@lemmy.zip 5 points 8 months ago (1 children)

some aren't cracked because there's like only one person actually doing it, and said person won't crack anime games because she hates anime.

[–] leopold@lemmy.kde.social 2 points 8 months ago

Yes, I'm well aware. Those are the symptoms. I just explained the cause.

[–] Auli@lemmy.ca 1 points 8 months ago (1 children)

Isn’t DRM in games working, though? Denuvo only being cracked by one person sounds like a win for the corporations to me.

[–] Dudewitbow@lemmy.zip 1 points 8 months ago

it's working in the sense that it slows things down. However, Denuvo works in generations: once one game in a generation gets cracked, there's a handful that will be cracked along with it. if a company is using an older generation of Denuvo, you may see day-1 cracks, which ultimately means the company paid Denuvo for nothing. but the point is, Denuvo wasn't meant to stop piracy first, it was meant to appease investors that require Denuvo to be implemented.

[–] n3m37h@lemmy.dbzer0.com 13 points 8 months ago (2 children)

I don't know a single person who has ever used HDMI to steal copyrighted content. Seriously, who would rip a 2 hr movie by recording it in real time vs the 10 min it takes to rip a movie digitally?

Like shit, ya got CAM, WebRip, BRRip and scene releases. I doubt HDMI capture was used in any of these scenarios.

[–] Dudewitbow@lemmy.zip 9 points 8 months ago

technically speaking, every gamer who uses an HDMI capture card to bypass the PlayStation's recording restrictions (it explicitly disables built-in recording while a cutscene is active) is an example.

[–] Hapbt@mastodon.social 3 points 8 months ago

@n3m37h @Dudewitbow HDMI consortium decides to f around and find out if people really care re: displayport vs hdmi

[–] MiltownClowns@lemmy.world 50 points 8 months ago (2 children)

Decades of being the standard in a/v. That's like asking, why don't we get rid of gas stations and just install electric chargers? Well, everybody's got gas powered cars.

[–] turbowafflz@lemmy.world 19 points 8 months ago (3 children)

AV things, sure, since they stick around longer, but computers? When was the last time you saw a high-end GPU with VGA or DVI? And they already have mostly DisplayPort, with just one or two HDMI ports.

[–] MiltownClowns@lemmy.world 22 points 8 months ago* (last edited 8 months ago) (1 children)

Well, I wasn't referring to that ecosystem. That ecosystem is already on display port. The reason HDMI is so prevalent is because it's the standard in audio-visual equipment. Why would I talk about computer equipment when it's not the standard there?

The point still stands. Everybody has equipment that has HDMI, and to phase out that standard in equipment going forward is phasing out equipment people already own.

[–] MonkderZweite@feddit.ch 1 points 8 months ago* (last edited 8 months ago)

and to phase out that standard in equipment going forward is phasing out equipment people already own.

And where's the problem in that? My parents still use an almost 20-year-old plasma TV. But they're getting old too.

[–] krolden@lemmy.ml 7 points 8 months ago

Computers are AV things.

[–] dog_@lemmy.world 1 points 8 months ago

Today. Every time I go downstairs.

[–] TimeSquirrel@kbin.social 9 points 8 months ago (1 children)

HDMI only had about four good years to itself before DisplayPort showed up. In contrast, the RCA port stuck around for damn near 100 years.

[–] n3m37h@lemmy.dbzer0.com 4 points 8 months ago

We also didn't have digital signals until DVI in 1999, HDMI in 2002 and DisplayPort in 2006.

[–] Flaky@iusearchlinux.fyi 22 points 8 months ago

Probably a lot more hardware using HDMI than DisplayPort? Just throwing a guess, tbh.

That being said, I might consider looking towards DisplayPort when I can get a new monitor...

[–] virr@lemmy.world 9 points 8 months ago (1 children)

CEC (technically I think DisplayPort could support it, but it generally isn't implemented) and Ethernet at up to 100 Mbps.

[–] anyhow2503@lemmy.world 15 points 8 months ago (1 children)

Almost nothing uses ethernet over HDMI to my knowledge.

[–] BautAufWasEuchAufbaut@lemmy.blahaj.zone 7 points 8 months ago (2 children)

This is the first time I heard of Ethernet over HDMI and I can't tell if you're joking.

[–] narc0tic_bird@lemm.ee 7 points 8 months ago (3 children)

Feature-wise probably next to nothing, and it's usually behind one or two generations in terms of bandwidth. HDMI is often the only port available on TVs though, so GPU makers likely can't afford to just leave it out.

[–] Grass@sh.itjust.works 8 points 8 months ago* (last edited 8 months ago)

They should anyway. New TVs are all smart these days, and the dumb ones are built like it's two decades ago. At this point we are better off with a PC monitor and separate speakers. Built-in speakers are shit, seemingly as a requirement. I use a video port switch for extra inputs so I don't have to dig through on-screen menus or run out of built-in ports.

[–] Hyperreality@kbin.social 2 points 8 months ago* (last edited 8 months ago)

Yep. Very common.

A lot of people use their pc like a console or media server. Ie. use it to watch/play stuff from their bed or couch.

[–] Auli@lemmy.ca 1 points 8 months ago

Why not? If you need it get a converter.

[–] n3m37h@lemmy.dbzer0.com 6 points 8 months ago (1 children)

eARC and 12 Gbit/s more bandwidth (4K@185Hz vs 4K@120Hz)

Otherwise the same

[–] SuperIce@lemmy.world 3 points 8 months ago

Your info is outdated. DP 2.0 is 80 Gbps and can do 4K@240Hz without Display Stream Compression. It can do up to 16K@60Hz using DSC.
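A rough back-of-the-envelope sketch of why those link rates matter (the flat 20% blanking/overhead factor is my assumption; real timings use CVT-RB and the FRL/UHBR line coding differs, and DSC changes the math entirely):

```python
def data_rate_gbps(width, height, refresh_hz, bits_per_channel, overhead=1.2):
    """Approximate uncompressed RGB video data rate in Gbit/s.

    `overhead` is a crude stand-in for blanking intervals; actual
    per-mode timings vary.
    """
    return width * height * refresh_hz * bits_per_channel * 3 * overhead / 1e9

# 4K@120 8-bit fits within HDMI 2.1's ~42.7 Gbit/s effective rate
print(round(data_rate_gbps(3840, 2160, 120, 8), 1))   # ~28.7

# 4K@240 10-bit needs DP 2.0-class bandwidth (UHBR20: ~77.4 Gbit/s effective)
print(round(data_rate_gbps(3840, 2160, 240, 10), 1))  # ~71.7
```

So uncompressed 4K@240 with 10-bit color lands above what HDMI 2.1 can carry but under DP 2.0's top link rate, which matches the comment above.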

[–] SchmidtGenetics@lemmy.world 0 points 8 months ago

Can hook up to TVs…

[–] Zucca@sopuli.xyz 2 points 8 months ago

Based on the upvotes, it's not only your opinion. 👍