Ask HN: Why is there no P2P streaming protocol like BitTorrent?

190 points by memet_rush a day ago

I've been wondering why there is no P2P protocol for mass live-stream content in decent quality. Specifically, what are the technical limitations, or is it mostly that people don't want to get destroyed by media company lawyers? I've searched around for a while and I can't find anything that can handle thousands of people streaming. The closest is probably WebRTC, and that looks like it can only handle ~500 peers.

I was thinking most people nowadays have at least 30 Mbps upload, and a 1080p stream only needs ~10 Mbps and 720p needs ~5. Also, I think it wouldn't have to be truly live; people definitely wouldn't mind some amount of lag. I was thinking the big O for packets propagating out through the network should be O(log N), since if a master sharing the content is connected to 10 slaves, each of those is connected to 10 more slaves, and so on.
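That log(N) intuition can be sanity-checked in a few lines (a back-of-the-envelope sketch only; the fan-out of 10 is the assumption above, and it ignores churn, link asymmetry, and per-hop latency):

```python
def hops_to_reach(viewers: int, fanout: int) -> int:
    """Relay hops needed for one origin to cover `viewers` peers when
    each newly covered peer re-shares the stream to `fanout` others."""
    reached, frontier, hops = 1, 1, 0
    while reached < viewers:
        frontier *= fanout   # peers newly covered at this hop
        reached += frontier
        hops += 1
    return hops

# With fan-out 10, 100,000 viewers sit at most 5 hops from the source.
print(hops_to_reach(100_000, 10))  # → 5
```

The depth grows logarithmically exactly as the post suggests; the hard part, as the replies below discuss, is sustaining that fan-out in real time.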

The other limitation I could think of is prioritizing who gets the packets first, since there are a lot of people with 1 Gbps connections and others with only ~10 Mbps. Also deprioritizing leechers to keep them from degrading the stream.

Does anyone know why it still isn't a thing, though? It's super easy to find streams on websites, but they're all 360p or barely load. I saw that the original creator of BitTorrent was building something like this over 10 years ago, and it seems to be a dead project. Also, this is ignoring the huge time commitment it would take to program something like this. I want to know whether it is technically possible to have streams of, let's say, 100,000 people, and why or why not.

Just some thoughts, thanks in advance!

bawolff 16 hours ago

Part of the reason bit torrent works really well is that the file is downloaded in random order. It lets everyone cooperate, while still being robust to bad peers, bad network connections, churn etc.

If you want live high-quality streaming, a lot of the reasons BitTorrent works so well go away.

Latency matters. In BitTorrent, if a peer goes away, no big deal: just try again in 5 minutes with another peer. You are downloading in random order; who cares if one piece is delayed 5 minutes? In a live stream, your app is broken if it cuts out for 5 minutes.

In BitTorrent, everyone can divide the work - clients try to download the part of the file the fewest people have, so rare parts of the file quickly spread everywhere. In streaming, everyone needs the same piece at the same time.

BitTorrent punishes people who don't contribute by deprioritizing sending to peers that freeride, and it can do this at the individual level. In a P2P streaming setup, you probably have some peers getting the feed and then sending it on to other peers. The relationship isn't reciprocal, so it's harder to punish freeriders: you can't tell at the local level whether the peer you are sending data to is pushing it on to the other nodes it is supposed to.

I'm sure some of these have workarounds, but they are hard problems that aren't really satisfactorily solved.

  • arghwhat 14 hours ago

    This comments on why BitTorrent as-is isn't used for live streaming, not on why P2P shouldn't be used for live streaming.

    > Latency matters. In bit torrent if the peer goes away, no big deal, just try again in 5 minutes with another peer, you are downloading in random order, who cares if one piece is delayed 5 minutes. In a live stream your app is broken if it cuts out for 5 minutes.

    First of all, BitTorrent clients do not download in random order or wait 5 minutes. They usually download the rarest block first, but can do whatever they want, whenever they want.

    Second, standard HLS sets a nominal segment duration of 6 seconds (some implementations go as high as 10 seconds), and a client will usually buffer multiple segments before playing (e.g., 3). This means you have 18 seconds before a missing segment becomes critical.

    This is not a difficult thing for a P2P network to handle. You'd adapt things to introduce timing information and manage the number of hops, but each client can maintain connections to a number of other clients and have sufficient capacity to fill a segment if a connection fails. Various strategies could be used to distribute load while avoiding latency penalties.

    Low-latency HLS uses much smaller segments and would be more demanding, but isn't impossible to manage.
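    As a rough sketch of that budget arithmetic (the 6-second segments and 3-segment buffer are the figures above; the detection/refetch times below are made-up illustrative numbers, not measurements):

```python
def recovery_budget_s(segment_s: float, buffered_segments: int) -> float:
    """Seconds of playback sitting in the buffer, i.e. how long the
    player survives while a failed segment is re-fetched elsewhere."""
    return segment_s * buffered_segments

def failover_fits(budget_s: float, detect_ms: float, refetch_ms: float) -> bool:
    """Does detecting a dead peer plus re-fetching the segment fit the budget?"""
    return (detect_ms + refetch_ms) / 1000.0 <= budget_s

standard = recovery_budget_s(6, 3)   # standard HLS: 18 s of slack
ll_hls = recovery_budget_s(1, 3)     # low-latency HLS parts: far less

print(failover_fits(standard, detect_ms=2000, refetch_ms=4000))  # → True
print(failover_fits(ll_hls, detect_ms=2000, refetch_ms=4000))    # → False
```

    The same failover that is comfortable with standard segments becomes tight once segment durations shrink, which is the low-latency caveat above.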

    > BitTorrent punishes people who don't contribute

    Private communities punish this behavior, BitTorrent clients do not. Most new downloads will appear as freeriders for a long time, and only over long periods far exceeding the download time will enough compatible seeding opportunities arise for them to contribute in any substantial way.

    The network does not need everyone to seed, it only needs enough people to seed.

    • bavell 8 hours ago

      > First of all, BitTorrent clients do not download in random order or wait 5 minutes. They usually download the rarest block first, but can do whatever they want, whenever they want.

      The problem here is that BT works so well because the clients default to "good behavior" (prioritize rare pieces first) and discourage "bad behavior" (leeching/no upload).

      This tilts the balance on the whole enough to maintain the health of the network. If you change these, you'd need to introduce other mechanisms to preserve the network incentives.

      • arghwhat 8 hours ago

        With streaming, the service provider is generally also controlling the clients, so they should be much better off when it comes to client behavior.

        In the case of bring-your-own-client, the incentives are exactly the same: clients would likely default to good behavior, since network health equals user experience, and exactly as with BitTorrent there will be neither punishment nor need for it if some clients disobey.

        • bawolff 4 hours ago

          > exactly like BitTorrent there will be neither punishment nor need for it if some clients disobey.

          "Punishment" (tit-for-tat algorithm) is one of the defining features of bit torrent, especially in comparison to what came before it.

          • arghwhat an hour ago

            This is not a "punishment", and what clients do differs greatly.

            The original spec for the original client allocated a portion of its bandwidth to random peers instead of known-good/preferred peers (optimistic unchoking), so if you had no chunks to offer you were merely bandwidth- and/or peer-restricted, not cut off.

            If you take the Arch Linux ISO right now and put it into aria2c as a new, unknown client with no data to offer, you'll find that, while it takes a few seconds to join the swarm, fetch metadata, and connect to peers, you'll quickly saturate your connection completely without ever uploading a single byte.

            If you wanted, a streaming network could use direct or low-hop access as a seeding incentive - seed to get slightly lower content latency. When the streaming client is controlled by the content provider, seeding is easily enforced and the topology could be controlled centrally.

        • mannyv 5 hours ago

          In video streaming the client really controls everything...unless you have your own client and customize its behavior.

          You should see how people try to get HLS to pick a particular stream. With the default players it's not possible - the client decides.

          • arghwhat an hour ago

            Yes, that's the intended behavior of HLS - the content provider advertises which streams are available and at what bitrates, and the client picks one based on current stream performance and its own capability.

            The server can control the stream by advertising a customized manifest to individual clients, although it's a bit silly. HLS is designed to be extremely easy to load-distribute and to throw CDN/cache proxies in front of, and it's a bit sad that content providers are this bad at load management. :/

            Either way, the assumption here is that you would swap out the client doing HLS with a client designed to use a P2P protocol, including the seeding portion and network health management.

    • mannyv 6 hours ago

      Nobody who does live streaming should set their GOP to 6 seconds. That's way too big. I set ours to 2 seconds, which lets mobile clients adjust more quickly to their bandwidth limits.

      And as a note, video segments for live are usually set to no-cache, as are VOD segments; the CDN does the caching. The client should keep segments around if you're doing rewind, but that's a client thing.

    • PaulHoule 10 hours ago

      I have a relatively slow ADSL connection; it's not unusual for me to be able to download 100% of a file at, say, 95% of the theoretical rate without uploading anything. If the network has enough upload capacity to do this, does it really need my upload? (Note my client is still there if somebody needs a rare block.)

      I remember some BitTorrent networks circa 2005 or so that tried to monitor you and punish you for not contributing, and this was a disaster for me since my upload is necessarily a small fraction of my download. What I found is that that kind of network seemed to perform poorly even when I had a good bidirectional connection. As I saw it, people who can upload more than they download are a resource that such a network can't exploit if everybody is forced to upload as much as they download.

      • arghwhat 7 hours ago

        You are not required to seed immediately, nor even the same asset. Unless the rules have changed, you would usually just be required to maintain a particular minimum seed time or share ratio, easily achieved by idle-seeding a library.

        The point is to ensure network health with a metric that is simple to understand and verify: that you have been productive. If you aren't seeding, someone else has to pick up the slack, and the network didn't benefit from you obtaining the blocks.

        The community itself benefits by giving members a guarantee that stuff there is available with abundant bandwidth, instead of relying purely on the unpaid goodwill of others.

  • Protostome 15 hours ago

    If I remember correctly, PopcornTime was able to stream via BT. Your claims are mostly correct, but I think some compromises can be made to make BT clients suitable for streaming. For example:

    1. Locally randomize the segment download order

    2. Use a larger buffer

    3. Prioritize parts coming from slower connections

    • lordnacho 15 hours ago

      This is still just streaming a static file, though. Adjusting which segment to get will work, buffering will work, and people don't mind their movie starting a few seconds after they press play.

      If I'm streaming live, I need the frame immediately, and it doesn't help much to get later frames after the frame I'm missing.

      • Protostome 11 hours ago

        Live streaming is, by nature, a "one-to-many" distribution model, where content flows from a single source to many viewers in real time.

        BT, on the other hand, is fundamentally designed for "many-to-many" distribution, where peers share pieces of content with each other over time. This isn't just a question of tweaking the protocol—it's a fundamentally different problem. Unless you're willing to compromise on the immediacy of the stream, using BT for true live streaming isn't really a good fit.

      • jack_pp 15 hours ago

        what if everyone agrees on a 10s delay?

        • bawolff 14 hours ago

          The issue is everyone is watching the same part of the file at the same time. Offsetting by 10 seconds does not change that.

          • Timwi 13 hours ago

            The time offset impedes the ability of viewers to interact with the streamer via chat, which for many people (incl. myself) is the whole reason to watch live instead of a recording.

        • lordnacho 14 hours ago

          For pseudo-live streams such as sports events, that would be totally fine. People can have slightly out of sync streams, delayed by various amounts.

          But you can't live stream a conversation with someone if you have a 10s delay.

    • eklavya 15 hours ago

      The main difference is liveness of the stream. Live streams are much much more difficult and much less forgiving.

    • earnesti 10 hours ago

      I remember testing Popcorn Time and other BitTorrent streaming tools back in the day. They worked "OK". You don't get the Netflix experience, but on popular titles you get a "good enough" streaming experience. You have to wait about 30 seconds to get started.

      • fermentation 7 hours ago

        Part of this was that you needed a pretty decent connection speed, and that the files themselves were extremely compressed.

  • karel-3d 15 hours ago

    I am not convinced about the random order stuff. If most people stream from the start, then the start will be more seeded, so it's all good.

    And the order is requested by the client, and there are clients that download in sequential order, like Deluge.

    • mihaic 15 hours ago

      The benefit of a random order is that it forces you to actually keep all the packets, which makes upload more likely. Streaming lets you get away with not storing the whole file, which makes bad actors more likely.

      And, sure, some BT clients can stream the data, but what the default is makes a huge difference.

      • hinkley 5 hours ago

        People will shut off the moment they get the full file. The randomness means that the last packet is as likely to exist as the first.

        Would you want to watch the beginning of something that didn’t have an ending? How frustrating would that be?

        • scott_w 5 hours ago

          > People will shut off the moment they get the full file.

          Perhaps but the time spent downloading it is also time spent uploading some of the file, so there's still some benefit. By having it in random order, you more evenly distribute the people with access to different parts of the file.

          With streaming, if everyone downloads the same blocks at the same time, "bad actors" can dump all data they already watched to save disk space, harming potential peers that are watching slightly behind.

          • hinkley 3 hours ago

            Proof of work has problems with this because you (Mallory) can be paid to be the tertiary durable store for a file and secretly fetch the file from Alice or Bob when asked to prove you have the files. And even if you do something like use a different cypher for each copy the fact that the data is often meant to be public means one could work out the cypher given Alice and Bob and then dump your copy once you have done so.

            Unless you use public key cryptography, which is so expensive that nobody actually uses it for arbitrarily large inputs.

    • bawolff 14 hours ago

      If we are talking about a live stream (and not a stream of a static file), having the start be more seeded is useless.

      • Calwestjobs 12 hours ago

        An MPEG transport stream (the TS filetype, also used for DVB-T/C/S broadcast) IS a static file; all three are the same format, and that format deals with every point you made... download the specs and educate yourself.

        Same with streaming audio: each chunk IS a static file, so every phone call you've made in the last 30 years was static files.

        • bawolff 3 hours ago

          The issue is not how the bits are divided into packets (or "files") but how those packets are distributed and used.

          Obviously at the end of the day it's a string of bytes, like everything is; the difference I'm trying to get at is in how the data is used and requested.

          It's more a social difference than a technical one.

        • Calwestjobs 11 hours ago

          And there is no reason BitTorrent couldn't send/download the 3,000 TS chunks / static files of a 3-hour movie in sequential order,

          and no reason your MPV/VLC/PotPlayer couldn't render them in sequential order,

          even when you have only the first 2 pieces.

        • imtringued 10 hours ago

          The word "file" is doing a lot of heavy lifting here.

          • Calwestjobs 6 hours ago

            What difference does the kind of data chunk make? Zero. Or, for the university-educated: he introduced "file" as a helpful abstraction, so I work with that abstraction; if you say it is not a good abstraction, tell him. Calling a TS file a file is absolutely correct in any sense of the word - just to be thorough, in the philosophical debate we are having in computer science. (Yes, toxic sarcasm.)

            That one post is more on-topic for what the OP is asking than 90% of the comments here.

            Again: news, movies, comedy, Trump's tariffs are streamed digitally to billions of people over DVB-T/C/S every day. If how the bits are ordered/chunked matters to you so much that this already-working system isn't good enough, that makes no sense.

            Or explain a little more; one sentence explaining the whole world is K-12-like. Or 42, for book readers.

            • bawolff 3 hours ago

              I'm using "static file" to mean something pre-recorded that even if users will view in order, they will likely start at different times, so different users will be viewing different parts of it.

              In contrast to a live stream where everyone is viewing the same part at the same time, and once that part passes nobody is likely to view the old part ever again.

              This makes a big difference in terms of network design.

    • delusional 14 hours ago

      If everybody starts at the start, then it will be very poorly seeded when everybody wants it, and only become well seeded right when nobody needs it.

      • Calwestjobs 12 hours ago

        "well seeded" means what exactly?

        if i can send 2 copies of piece to 2 people immediately as i got it, then if my download takes 20 ms and sending it another 20 ms is it "well seeded" for those 3 people after 50 ms? or after how much time it is "well seeded" ?

        • jsnider3 8 hours ago

          Downloading and re-uploading part of a file in 50 ms is optimistic, and that is still only three people, when serious live-streaming platforms routinely deal with thousands of viewers and, every so often, millions.

        • delusional 11 hours ago

          A precise answer to that question entails more math than I'm willing to do here in the middle of my Easter holiday. You should understand my argument more as a sketch of a proof.

          That being said, I have a small correction. If you want to stream to two peers (that is, you have a network of 3 fully connected nodes, one being the originator of the stream) and the link latency on all links is 20 ms, then the lowest-latency path to each node is exactly 20 ms, as the originating node can simply send the content to each client directly.

          The unfortunate realization is that 20 ms is then also the optimal stream delay, assuming you care about the shortest possible latency for your live-streaming service. The clients therefore end up with the content exactly when they are supposed to show it, and so have no time to forward it to anyone else, lest the downstream node get the content after it was supposed to show it.

    • Calwestjobs 12 hours ago

      Yes, you are correct - BitTorrent sequential download works exactly like that.

      People seem to need 0 ms ultra-low-latency streams for watching movies... they are insane. They want to be extraordinary high-speed traders, but with movies, not stocks. Insane.

  • HumblyTossed 6 hours ago

    Just for shits and giggles, I wonder if a protocol similar to HLS, where each peer is assigned a segment to transcode / cache, would work for best cases. If a peer drops, that segment is assigned to another peer. Or if there are, say, 5 peers, double up on the next segment to increase odds it will be done.

    Just a mental curiosity is all.
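    One way that assignment idea could be sketched (all names and numbers here are hypothetical; a real system would also weight by peer bandwidth and handle churn continuously):

```python
import itertools

def assign_segments(segments, peers, redundancy=2):
    """Round-robin each segment onto `redundancy` distinct peers, so a
    single dropped peer never leaves a segment uncovered."""
    assert len(peers) >= redundancy
    ring = itertools.cycle(peers)
    assignment = {}
    for seg in segments:
        holders = set()
        while len(holders) < redundancy:
            holders.add(next(ring))
        assignment[seg] = holders
    return assignment

def reassign_on_drop(assignment, dropped, peers):
    """Hand a dropped peer's segments to the least-loaded survivors."""
    survivors = [p for p in peers if p != dropped]
    load = {p: sum(p in h for h in assignment.values()) for p in survivors}
    for seg, holders in assignment.items():
        if dropped in holders:
            holders.discard(dropped)
            pick = min((p for p in survivors if p not in holders), key=load.get)
            holders.add(pick)
            load[pick] += 1
    return assignment

plan = assign_segments(range(6), ["a", "b", "c", "d"])
reassign_on_drop(plan, "c", ["a", "b", "c", "d"])
# every segment still has two live holders, none of them "c"
```

    The redundancy parameter is the "double up" part of the idea: spare copies cost upload capacity but buy tolerance to a peer dropping mid-segment.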

  • delusional 14 hours ago

    I don't think people are appreciating the nuance of what you're saying. Most of what you are saying isn't accurate for Netflix-style streaming, which would more aptly be called "video on demand", but it is very applicable to "live streaming" in the sense of live sporting events or news broadcasts.

    Video-on-demand is perfectly implementable on top of BitTorrent. As you say, there are some latency pitfalls you'll have to avoid, but that's nothing you can't hack yourself out of.

    Livestreaming is a different beast. As you say, the problem with livestreaming is that everyone needs the same content at the same time. If I spend 200 ms downloading the next 500 ms worth of content, then there's nobody to share it with; they all spent the same 200 ms doing the same thing. BitTorrent relies on the time shift between me downloading the content and you requesting it. If you request it before I've got it, well, I can't fulfil that request; only the guy I intend to get it from can.

    If you wanted to implement something like that, you would probably pick a tree of seeders, where the protocol picks a subset of trusted nodes to upload the content to before allowing them to seed it, and then has them do the same recursively.

    That would obviously introduce a bunch of complexity and latency, and would very much not be BitTorrent anymore.
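    A toy version of that recursive tree construction (a hypothetical sketch; it deliberately ignores the bandwidth matching, trust, and churn handling, which is exactly where the real complexity lives):

```python
from collections import deque

def build_relay_tree(origin, peers, fanout):
    """Attach peers breadth-first under `origin`, each node feeding at
    most `fanout` children. Returns (parent map, hop depth per node);
    depth is a rough proxy for the added stream delay."""
    parent, depth, children = {}, {origin: 0}, {origin: 0}
    feeders = deque([origin])            # nodes with spare upload slots
    for peer in peers:
        feeder = feeders[0]
        parent[peer] = feeder
        depth[peer] = depth[feeder] + 1
        children[peer] = 0
        children[feeder] += 1
        if children[feeder] == fanout:   # feeder's upload is now full
            feeders.popleft()
        feeders.append(peer)
    return parent, depth

# 100 viewers, each relaying to at most 3 others: nobody is more
# than 4 hops from the origin.
_, depth = build_relay_tree("origin", [f"p{i}" for i in range(100)], 3)
print(max(depth.values()))  # → 4
```

    Keeping such a tree balanced while peers join and leave mid-stream is the complexity referred to above, and it is exactly what plain BitTorrent never has to do.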

    • bawolff 3 hours ago

      I generally agree, but once you do that you lose most of the properties that make bit torrent so effective.

      E.g. if you arrange the network into a tree like that, you need to make sure all nodes are matched appropriately in terms of bandwidth, latency, geography, and number of connected nodes. Now you have to somehow ensure the network topology stays good in the face of churn and bad peers. Suddenly everything is complicated and not looking very P2P.

      Maybe different protocols could manage that, but I think there is a reason why P2P protocols didn't really develop much beyond BitTorrent.

    • ralferoo 12 hours ago

      I don't think he's saying it needs to be BitTorrent, just applying some principles from it.

      For example, say you have a cluster of people on the call in the US and another cluster in the UK. Ping times are 100 ms or more across the ocean and some random packets will be lost, but ping times within the UK are around 15 ms max. By working cooperatively and sharing among themselves, the clients in one cluster can fill in missing packets from a different cluster far quicker than by requesting them from the originating host.

      In general, the ability to request missing packets from a more local source should be able to improve overall video call quality. It still might be "too late", because for minimal latency, you might choose to use packets as soon as they arrive and maybe even treat out-of-order packets as missing, and just display a blockier video instead, but if the clients can tolerate a little more latency (maybe a tunable setting, like 50ms more than the best case) then it should in theory work better than current systems.

      I've been mulling over some of these ideas myself in the past, but it's never been high enough on my TODO list to try anything out.

      • delusional 12 hours ago

        > By working co-operatively and sharing among themselves the clients in one cluster can fill in missing packets from a different cluster far quicker than the requesting them from the originating host.

        That's only true if you assume the nodes operate sequentially, which is not given. If the nodes operate independently from one another (which they would, being non-cooperating) they'd all get a response in ~100ms (computation and signaling time is negligible here), which is faster than they could get it cooperatively, even if we assume perfect cooperation (100ms for the first local node + 15ms from there). It's parallelism. Doing less work might seem theoretically nice, but if you have the capacity to do the same work twice simultaneously you avoid the synchronization.

        Basically, it falls somewhere in my loose "tree based system" sketch. In this case the "trusted" nodes would be picked based on ping time clustering, but the basic sketch that you pick a subset of nodes to be your local nodes and then let that structure recursively play out is the same.

        The problem you run into is latency. There's no good way to pick a global delay figure for the whole network, since it varies by how deep into the tree you are. As the tree grows deeper, you end up having to retune the delay. The only other option is to grow in width, at which point you've just created another linear-growth problem, albeit with a lower slope.

  • calvinmorrison 7 hours ago

    > Part of the reason bit torrent works really well is that the file is downloaded in random order.

    That's basically only true for one client (Transmission), which specifically refuses to allow linear ordering. Most other clients implement it.

    To enable it, it's about a 3-SLOC change.

    I hate clients that don't work for the user.

    • wing-_-nuts 7 hours ago

      >That's basically true for one client (transmission) - who specifically refuses to allow linear ordering. Most clients implement this.

      Transmission does let you mark a particular file (say, the first file in a series) as 'high priority', though, so it's not as if they allow no change to the behavior at all.

      • calvinmorrison 6 hours ago

        Good for them. They specifically refuse to implement this feature, even though it's a tiny change, for the 'good of the ecosystem' or something stupid.

PaulRobinson 13 hours ago

For a while, I was CTO of a company called Livestation [0], which as the Wikipedia article states, was "originally based on peer-to-peer technology acquired from Microsoft Research".

This P2P stack was meant to allow mass scaling of lowish-latency video streaming, even in parts of the world with limited peer bandwidth to the original content source servers. The VC-1 format got into a legal quagmire, as most video streaming protocols do, and it speaks volumes that by the time I turned up in ~2012, the entire stack was RTMP, RTSP, HDS and HLS, with zero evidence of that P2P tech stack in production.

My main role was to get the ingest stack out of a DC and into the cloud, while also dealing with a myriad of poor design decisions that led to issues (yes, that 2013 outage in the first paragraph of the wiki article was on my watch).

At no point did anybody suggest that what we really needed was to turn our attention back to P2P streaming. The company did build a version of Periscope (Twitter's first live-streaming product) and launched it weeks or months before they did, and was pivoting towards a social media platform, at which point I decided to go do other things.

The technical and legal problems are real, and covered elsewhere here. People want reliable delivery. Even Spotify, YouTube and others who have licensed content and could save a pile by moving to DRM-ified P2P don't go near it, and that should tell you something about the challenges.

I'd love more widespread adoption of P2P tech, but I'm not convinced we'll see it in AV any time soon, unfortunately.

[0] https://en.wikipedia.org/wiki/LiveStation

  • garganzol 9 hours ago

    I used LiveStation from time to time, and just for fun I was playing around with finding out when and how it employed P2P protocols. Needless to say, I never found any evidence of P2P in LiveStation. And now I know why :)

    Thank you for bringing up the warm memories I thought I no longer had.

    • PaulRobinson 7 hours ago

      Thanks for supporting a business that was pretty cool, once. I bailed as it got into the consumer livestream space, but sometimes think about resurrecting it as a higher-quality OTT app that isn't rammed with absolute junk and ads. The work that platform did during the Arab spring was significant, and I can't honestly point to a good modern alternative today.

  • apitman 7 hours ago

    You bring up a good point. It's interesting that YouTube at least doesn't do p2p for their non-DRM content.

    • googlryas 4 hours ago

      Sounds like a recipe for dissatisfied users

      "Why's my internet slow? Oh, YouTube is uploading a bunch of stuff to other people"

      "How did I hit my bandwidth cap for the month already? Oh, youtube is..."

      • apitman 4 hours ago

        Those problems are implementation specific

        • bawolff 3 hours ago

          I don't see how you implement p2p without the p2p part.

andruby 9 hours ago

Splitcast Technology built this in 2012. The company folded (it couldn't find revenue, and had founder struggles), but as far as I remember the tech worked. It still needed a lot of seeding nodes, but a significant chunk of the bandwidth was provided by the "viewer peers".

A key part of that tech was that it synchronized playback between all peers. That was nice for stock market announcements and sports events, for example.

https://web.archive.org/web/20131208173255/http://splitcast....

https://www.youtube.com/watch?v=R5UYu9jeQbY

https://www.crunchbase.com/organization/splitcast-technology

martinald 11 hours ago

The real reason is that bandwidth is dirt cheap, if you know what you are doing at scale.

For 'hobbyists' there is a lot of complexity with setting up your own streaming infrastructure compared to just using YouTube or Twitch.

Then media companies who want to own it can just buy their own infra and networking, which is outrageously cheap. HE.net advertises 40 Gbit/s of transit for $2,200/month. I'm oversimplifying somewhat; you do have issues with cheap transit and probably need backups, especially for certain regions. But there isn't much of a middle ground between hobbyists and big media companies.

For piracy (live sports streams), I've read about https://en.wikipedia.org/wiki/Ace_Stream being used for this exact purpose, FWIW. This was a while back, but I know it had a lot of traction at one point.

  • imtringued 10 hours ago

    This is basically the answer.

    Minimum-latency broadcast forms a tree, and a tree is by definition not peer-to-peer. The number of branches per node is upload speed divided by the bandwidth of the stream. This branching factor is extremely low for residential internet connections with asymmetric high download and low upload speeds.

    Once you add malicious adversaries or poor network connectivity to the P2P network, each client will need to read multiple erasure-coded streams at once and switch over when a node loses its connection.
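    Putting numbers on that branching factor (a sketch using round figures; real links lose further capacity to protocol overhead, other traffic, and churn):

```python
def branching_factor(upload_mbps: float, stream_mbps: float) -> int:
    """How many downstream peers one node can feed a full-rate copy."""
    return int(upload_mbps // stream_mbps)

def tree_depth(branching: int, viewers: int) -> int:
    """Minimum depth of a relay tree with the given branching factor
    that covers `viewers` nodes (origin at depth 0)."""
    if branching < 1:
        raise ValueError("cannot sustain even one downstream copy")
    total, level, depth = 1, 1, 0
    while total < viewers:
        level *= branching
        total += level
        depth += 1
    return depth

# 30 Mbps upload feeding a 10 Mbps stream: 3 branches per node,
# so 100,000 viewers need an 11-level-deep tree...
print(tree_depth(branching_factor(30, 10), 100_000))  # → 11
# ...while a 12/5 Mbps asymmetric link manages only 2 branches: depth 16.
print(tree_depth(branching_factor(12, 5), 100_000))   # → 16
```

    Each extra level adds another hop of relay latency and another place for churn to break the stream, which is why the low branching factor of residential uplinks hurts so much.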

miyuru 16 hours ago

There are PeerTube and WebTorrent, but they don't seem to have caught on with mainstream users.

In my opinion, NAT and the extensive tracking that has led users to distrust sharing their IP addresses are the reasons it hasn't caught on.

Imagine YouTube using P2P technology; it would save a lot of the money spent on caching servers.

  • bawolff 16 hours ago

    PeerTube and WebTorrent aren't doing live streams as far as I know, just streaming of pre-recorded video, which is still a lot harder for P2P than random-order download, but not in the same ballpark as a livestream.

    > Imagine YouTube using P2P technology, it would save lot of money spent on caching servers.

    I think it's money well spent.

    • boudin 15 hours ago

      Peertube supports live streams https://framablog.org/2021/01/07/peertube-v3-its-a-live-a-li...

      There is a lag between the source and the audience; maybe it's been improved in the last 4 years, though, I'm not sure.

      • bawolff 14 hours ago

        Hmm interesting i didn't know that.

        I couldn't find much docs on how it works, just https://docs.joinpeertube.org/contribute/architecture#live

        Sounds like they break the stream into very small segments and publish each of those with bittorrent (?). They seem to claim about a 30-second delay and scaling into the hundreds but not thousands. Certainly impressive if true; i wouldn't have thought such an approach would scale so well. Of course it's still a far cry from twitch, but nonetheless impressive.
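
Taking the architecture doc at face value, the producer side of such a scheme (chop the live stream into short segments, hash each, seed every segment independently) could be sketched like this; the segment size and the use of SHA-1 here are assumptions, not PeerTube's actual code:

```python
import hashlib

def segment_live_stream(chunks, segment_size=1 << 20):
    """Accumulate an incoming live byte stream into fixed-size segments.
    Each yielded (index, digest, payload) tuple could then be announced
    and seeded independently; the end-to-end delay is at least one
    segment's duration plus swarm propagation time."""
    buf, idx = bytearray(), 0
    for chunk in chunks:
        buf.extend(chunk)
        while len(buf) >= segment_size:
            seg = bytes(buf[:segment_size])
            del buf[:segment_size]
            yield idx, hashlib.sha1(seg).hexdigest(), seg
            idx += 1
    if buf:  # flush the final partial segment when the stream ends
        yield idx, hashlib.sha1(bytes(buf)).hexdigest(), bytes(buf)
```

The built-in latency floor is visible right in the sketch: nothing can be announced until a whole segment has been produced, which lines up with the ~30-second delays claimed above.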

miohtama a day ago

There was Joost in 2008, from the Skype founders. Skype was originally P2P until the Microsoft acquisition killed this legally questionable feature: need to feed the big brother (: Joost raised ~$50M.

I remember it as it was one of rare apps built in XUL, the same framework as Mozilla apps (Firefox).

https://en.m.wikipedia.org/wiki/Joost

  • pests 8 hours ago

    PriitK of Joost/Skype was also involved in the original Kazaa. I've told previous stories of his involvement in a 1990s MMO spaceship game, where he recreated the original buggy/cheat-filled client (Subspace) and released the now-standard replacement (Continuum).

  • whalesalad 9 hours ago

    I distinctly remember living alone in my first apartment and trying to use Joost like a TV. It was a dismal experience.

elmerfud a day ago

People have tried to build BitTorrent clients to do this. As far as I know they never took off. The primary problem is that you often don't get people who want to share back, or they have firewalls or connections that don't allow them to share back. So you end up with a few people seeding everything out. The second problem is that for streaming, things need to arrive in order. It is totally possible with BitTorrent to request blocks in the order you want, but you may not always be able to get them in that order.

In general people aren't tolerant of lag and spinning circles and other such things when they're trying to watch streaming content. If you're fine with watching it a little bit later, you might as well queue it up and let the whole thing download so it's ready when you're ready.

  • jeroenhd 15 hours ago

    Popcorn Time did this and it worked great. Starting a torrent wasn't instant, but once a buffer was built up, it streamed just fine.

    Popcorn Time got taken down pretty hard because they became too popular too fast.

    A commercial solution could have a seed server optimized for streaming the initial segments of video files to kickstart the stream, and let basic torrents deal with the rest of the stream.

    • SXX 11 hours ago

      Popcorn Time still works and is used by everyone who cares. It's just not as hyped anymore.

  • bayesianbot 16 hours ago

    The biggest issue I've seen with these is the networking limitations in a browser - there might be hundreds of seeders for a video and using a normal streaming torrent video player works well, but as torrent clients in the browser need to use WebRTC / WebTorrent, there might be just 0-5 seeders supporting it. I don't see much adoption for WebTorrents before the widely used standard Bittorrent clients support the protocol.

  • memet_rush a day ago

    what about having something reasonable for lag, like 30-60 seconds? would that make a big difference or do you think it would just eventually degrade too? Also do you think there's any way to prioritize seeders in such a protocol? like some kind of algorithm where the more you share, the more you're prioritized in getting the most up-to-date packets.

    The main reason I would think it would be useful is 1. since streaming sites seem to lose a lot of money and 2. sports streams are really bad, even paid ones. I have dazn and two other sports streaming services and they still lag and are only 720p

    • bawolff 15 hours ago

      > what about having something reasonable for lag, like 30-60 seconds would that make a big difference or you think it would just eventually degrade too?

      I think you would probably need something more in the neighbourhood of 10 minutes to really make a difference. If you could make a stable p2p live streaming app with the number of peers all watching the same stream in the hundreds and only 30 seconds latency, i'd consider that pretty amazing.

      > Also do you think there's any way you can prioritize seeders in such a protocol? like some kind of algorithm that the more you share the more you're prioritized in getting the most up to date packets.

      If we are talking about a livestream (and not "netflix" type streaming) then i don't think seeders are a thing. You can't seed a file that isn't finished being created yet.

      If you mean more generally punishing free-riders, i think that is difficult in a live stream as generally data would be coming in from a different set of peers than the peers you are sending data out to, so its difficult (maybe not impossible) to know who is misbehaving.

      • pests 9 hours ago

        Apparently PeerTube can do 10s delay to hundreds (not thousands) of viewers.

    • dave4420 a day ago

      With sports streams you specifically want low lag, don’t you? It’s no fun being spoilered by people cheering (or not) next door.

      • memet_rush a day ago

        i wouldn't mind a minute of lag tbh if the quality and reliability were better. I pay $20 a month for DAZN and it still lags and buffers lol

  • BiteCode_dev 17 hours ago

    stremio works fine and is quite popular.

    It's similar to Popcorn Time, which was killed by legal means, so I'd say they did take off.

    Stremio smartly avoids being killed by making pirating an optional plugin you have to install from another site so they get deniability.

    It works well and saves my ass from needing 1000s of subscriptions.

    • SchwKatze 14 hours ago

      I was going to cite stremio too, it's far from perfect but it works fine most of the time.

reliablereason a day ago

The only entities that could use such a thing are major streaming platforms, and projects trying to stream copyrighted content without consent.

The former don't want to use it as it degrades their control over the content, and the latter don't want to make a new system because systems built on torrents are good enough.

  • littlestymaar 17 hours ago

    I worked for a company called Streamroot which sold exactly this, and I can tell you your first paragraph is indeed correct but the second isn't: we had major streaming platforms as customers when I was there (not global giants like Netflix or YouTube, but big European players like Canal+ or Eurosport) and we also had plenty of warez websites (streaming sport, animes, porn, etc.).

    I then left and the company later got acquired by Level 3 so I don't know exactly how it evolved but it's likely that they abandoned the illegal streaming market for reputational reasons and stuck with big players.

    • LeonM 14 hours ago

      > I then left and the company later got acquired by Level 3 so I don't know exactly how it evolved

      It just struck me that there are probably plenty of large media companies that use all sorts of proprietary video streaming products for distribution that we've never heard of, simply because the tech isn't available to consumers.

      Media companies are generally pretty secretive about their tech (Netflix being the exception to this rule), so there isn't much to be found about this. The piracy community (because, let's be real here) also won't be interested in non-free (speech and beer) streaming solutions like these. So that's probably why there is just very little public information available.

      But if you use paid digital TV products (Eurosport being a perfect example here) then you are probably already using all sorts of P2P streaming protocols you've never heard of.

  • aaron695 19 hours ago

    > degrades their control over the content

    Encryption (can work with sharing), signatures, fall back to CDN. Control is not an issue.

    > torrents are good enough.

    Torrents can't do the massive market of livestream, like sports or season finales or reality TV / news. This is the entire point of the question.

    > The only entities

    And everyone kicked off of YouTube, or who doesn't want to use big corporations on principle, like Hacker Cons or the open source community.

    • notpushkin 17 hours ago

      > Encryption (can work with sharing), signatures, fall back to CDN. Control is not an issue.

      And of course if an encryption key gets leaked, you can just rotate it. Since it’s a stream, past content is not as important.

      (That said, I don’t think it will help — any DRM can be cracked, and there’s plenty of online TV streaming sites even with the current centralized systems.)

      • Calwestjobs 12 hours ago

        you can stream a blockbuster movie that got released yesterday. DRM is important.

    • Calwestjobs 12 hours ago

      "Torrents can't do the massive market of livestream": they CAN. The reason we are not using them is not technical; it's that most people just pay for Apple TV / Netflix and don't have to install anything on their computer, and the UI / interface is 10000 times better.

      Or a very similar point: i had a conversation with a big youtuber who was confused why he wasn't more popular with a certain demographic. The reason was that said demographic was watching on a big TV, and the content he was filming was a big head directly in front of the camera, so they do not like having a 3-foot head right in front of them... most young people watch things on mobile.

notepad0x90 17 hours ago

There are streaming sites on the high-seas that use webtorrent. Interestingly (at least for me), this bypasses firewall based IPS/inspection that looks for bittorrent because it's all https. People use it to stream movies at work lol. Good for them I guess.

rklaehn 14 hours ago

I am a contributor to Iroh ( https://github.com/n0-computer/iroh ), an open source library for direct QUIC connections between devices that can be behind a NAT.

Our library is general purpose and can be used whenever you need direct connections, but on top of Iroh we also provide iroh-blobs, which provides BLAKE3 verified streaming over our QUIC connections.

Blobs currently is a library that provides low level primitives and point to point streaming (see e.g. https://www.iroh.computer/sendme as an example/demo )

We are currently working on extending blobs to also allow easy concurrent downloading from multiple providers. We will also provide pluggable content discovery mechanisms as well as a lightweight content tracker implementation.

There is an experimental tracker here: https://github.com/n0-computer/iroh-experiments/tree/main/co...

Due to the properties of the BLAKE3 tree hash you can start sharing content even before you have completely downloaded it, so blobs is very well suited to the use case described above.

We already did a few explorations regarding media streaming over iroh connections, see for example https://www.youtube.com/watch?v=K3qqyu1mmGQ .

The big advantage of iroh over bittorrent is that content can be shared efficiently from even behind routers that don't allow manual or automatic port mapping, such as many carrier grade NAT setups.

Another advantage that BLAKE3 has over the bittorrent protocol is that content is verified incrementally. If somebody sends you wrong data you will notice after at most ~16 KiB. Bittorrent has something similar in the form of piece hashes, but those are more coarse grained. Also, BLAKE3 is extremely fast due to a very SIMD friendly design.
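
The incremental-verification contrast with BitTorrent's coarse piece hashes can be illustrated with a flat list of per-16-KiB digests. This is a simplification: real BLAKE3/bao uses a tree hash, and Python's hashlib has no BLAKE3, so blake2b stands in here:

```python
import hashlib

CHUNK = 16 * 1024  # verification granularity, mirroring the ~16 KiB figure above

def chunk_hashes(data):
    """Sender side: one digest per 16 KiB chunk (a flat-list stand-in
    for a BLAKE3 tree hash)."""
    return [hashlib.blake2b(data[i:i + CHUNK]).digest()
            for i in range(0, len(data), CHUNK)]

def first_bad_chunk(chunks, expected):
    """Receiver side: a lying peer is caught after at most one chunk,
    rather than after a whole multi-megabyte torrent piece."""
    for i, chunk in enumerate(chunks):
        if hashlib.blake2b(chunk).digest() != expected[i]:
            return i
    return None
```

The real scheme avoids shipping one hash per chunk up front by deriving chunk hashes from a single root, but the failure-detection granularity is the property being shown.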

We are big fans of bittorrent and actually use parts of bittorrent, the mainline DHT, for our node discovery.

Here is a talk from last year explaining how iroh works in detail: https://www.youtube.com/watch?v=uj-7Y_7p4Dg , also briefly covering the blobs protocol.

wmf a day ago

This tech has been developed several times but ultimately CDNs are now so cheap that P2P is pointless. You can't ignore development cost since it dominates all other costs in this case.

  • globular-toast 13 hours ago

    If CDNs are so cheap, why is YouTube insistent that they should get paid for their bandwidth? I already pay for my bandwidth and am quite happy to use it for something like YouTube.

    The real reason is centralised architecture gives them control and ability to extract rent.

    • crazygringo 10 hours ago

      > why is YouTube insistent that they should get paid for their bandwidth?

      What are you talking about?

      YouTube has a lot more costs than bandwidth. And a lot of ads and Premium revenue goes to creators.

netsharc a day ago

AceStream is P2P, its primary use is to stream pirated live sports though. But looking it up, it seems to have been infected by "blockchain!" geniuses.

  • nisa a day ago

    It still works without any blockchain, and there are Dockerfiles and images for using it CLI-only on GitHub. It's closed source though, and the UI was a forked version of VLC. It's also been suspected to spread malware; the CLI tools look fine though, but who knows.

    Surprisingly the channels that are available work really well if you just use the mpegts stream.

    In a past life I added a few channels to a tvheadend instance on a VPS. It reliably crashed Kodi on some channels, and I've wondered if it was just broken streams or something more interesting going on.

    If you open the ports and watch popular channels it's easily saturating bandwidth - there is no limit.

    I've since stopped using it; it's the kind of thing that breaks not often enough to be useless, but often enough to be annoying.

    It's IPv4-only and seems to use its own tracker, or at least calls some URLs for initial peer discovery.

    Building something similar as true open source would be great but I guess the usecase is mostly illegal streaming.

    Be careful: it attempts to use UPnP to open ports on the router, and even just looking through the lists makes you upload fragments.

    Still a fascinating tool. It gets close to what op is looking for, but I think it has scalability issues and everything about it is kind of shady and opaque.

  • extraduder_ire 8 hours ago

    One of the main things that hindered AceStream is probably that it remained mostly proprietary. There was an explosion of different BitTorrent implementations after the first one, which forced everyone to properly standardise.

    I was hopeful about bittorrent-live when that was announced, but they didn't open source that for some reason either.

Imustaskforhelp a day ago

There's iroh.computer which can use a relay/ do direct nat punching.

They use bao hashing which is something that I discovered through them (IIRC) and its really nice.

Could create such a protocol though bittorrent/ipfs is fine

I once wanted to create a website which was just a static website.

and I used some ipfs gateway to push it with my browser and got a link of that static website, all anonymous.

Kind of great tbh.

  • Imustaskforhelp a day ago

    Shame it's being abused by crypto bros who want to treat it as money.

    There are other genuinely useful crypto projects (like Monero for privacy and I don't like the idea of smart contracts)

    I really want to tell you that most crypto is a scam. These guys first went into crypto and now I am seeing so much crypto + AI.

    As someone who genuinely is interested in crypto from a technology (decentralization perspective)

    I see transactions as a byproduct, not the end result, and people wanting to earn a quick buck feels really weird to me.

    Also crypto isn't safe. I think these days it's better to treat it as correlated with a tech stock, though 99% of the time it's run by scams, so absolutely worse.

    The technology is still fascinating. But just because the technology is fascinating doesn't mean its valuable. Many people are overselling their stuff.

    That being said, I have actually managed to use crypto to create permanent storage (something like IPFS, but forced to store things forever), so I think it can be used where anonymity/decentralization is required. Still, this could be done without involving money in the process, and crypto is not as decentralized as one might imagine.

    • rklaehn 15 hours ago

      > Shame it's being abused by crypto bros who want to treat it as money.

      Iroh contributor here. I don't know what you are referring to. Iroh is just a library to provide direct QUIC connections between devices, even if they are behind a NAT. We don't have any plans doing a blockchain or an ICO or anything like that.

      I am not aware of any project called Iroh that is a scam, but if there is, please provide a link here. It's not us.

      I know there have been some scammers trying to make a BLAKE3 coin or something, a year ago.

      • Imustaskforhelp 13 hours ago

        I actually wasn't referring to iroh but rather ipfs / the stratos thing that I mentioned.

        My only gripe with iroh currently is that its browser wasm feels too much for me/ I don't want to learn rust.

        So I actually wanted to build something that required connectivity and I used nostr, because nostr is great for the web and, not gonna lie, it's awesome as well (but nostr is also riddled with crypto bros :( )

        • rklaehn 12 hours ago

          OK, thanks for the clarification.

          I have nothing against crypto in principle, but I really don't want Iroh to be associated with crypto scams.

          Iroh is just a library for p2p connections. You can use it for crypto, but I would say that the majority of our users are non-crypto(currency).

          We will try to make the wasm version easier to use, but if nostr works well for you, go for it! Not the right place if you want to avoid crypto bros though :-)

        • Geee 8 hours ago

          Haven't used Nostr recently, but isn't it associated with bitcoiners rather than crypto bros? At least it used to be that way.

Dibby053 a day ago

It is a thing.

For livestreams there's AceStream built on BitTorrent, but I think it's closed-source. They do have some SDK but I never looked into it. It's mostly used by IPTV pirates. I've used it a few times and it's hit-or-miss but when it works well I have been able to watch livestreams in HD/FullHD without cuts. Latency is always very bad though.

Then for video-on-demand there are some web-based ones like PeerTube (FOSS) and I think BitChute? Sadly webtorrent is very limited.

PotterSys 7 hours ago

There are companies doing something like that (StreamShark, Quanteec, Eyevinn; even Cloudflare with WHEP). At a company I worked at, they used Eyevinn for events with more than 100K users, and there were still performance issues.

Besides bandwidth problems (as you can't rely 100% on remote connections), any P2P solution would mean the same fragment gets shared many times between clients, something CDNs have solved already (just serving content instead of juggling signalling).

nottorp 9 hours ago

The thing is, if you pirate why would you stream when you can download and watch at your convenience?

And if you pay for the streaming, why would you donate your bandwidth to them? Would you get a discount?

  • glxxyz 8 hours ago

    "if you pirate why would you stream when you can download and watch at your convenience?"

    Live events, e.g. sports?

    "why would you donate your bandwidth to them?"

    I don't know but people donate bandwidth for torrents, maybe it's 'free' for them?

    • nottorp 7 hours ago

      And? If you're into it, you already pay through the nose for those. Let them pay for the bandwidth.

      • glxxyz 7 hours ago

        "If you're into it, you already pay through the nose for those"

        I believe pirating is seen an alternative to paying through the nose?

        I pay through the nose for most live sports I watch.

1970-01-01 10 hours ago

It's been tried a few times. The bottleneck still exists in the last mile of the network. Users simply don't have good enough equipment to reliably handle this amount of streaming data.

  • rollcat 9 hours ago

    Yep, real-world HLS/DASH solutions rely heavily on edge caches; you don't care as much about having your infra in the right region, but POPs are everything.

    Netflix famously offers ISPs an appliance.

  • zinekeller 9 hours ago

    One of the problems is that even "symmetric" connections are usually not symmetric at all (obviously talking about residential connections here, not DIA).

hwpythonner 5 hours ago

I think the missing piece here is why we’d want P2P live streaming in the first place.

If the goal is to cut costs — like vendors trying to avoid AWS/CDN bills — that’s a very different problem than building for censorship resistance or resilience.

Without a clear “why,” the tradeoffs (latency, peer churn, unpredictable bandwidth) are hard to justify. Centralized infra is boring but reliable — and maybe that's good enough for 99% of use cases.

The interesting question is: what’s the niche where the pain is big enough to make P2P worth it?

RedNifre 13 hours ago

This used to exist in 2008 and it was perfect. It was called Joost and worked like this:

- P2P streaming
- You get a big menu of channel logos
- You click on a channel and it starts the first episode of a randomly chosen series from that channel, or continues from where you left off
- There might have been a "zap" function? I'm not sure
- The GUI was so nice and large that if you connected a Wii Remote to your PC, you had the best TV experience from your couch, ever: just press a button to bring up the menu, aim at the channel you want to switch to, done.

Such a shame that it failed, nothing after it ever came close.

pdubouilh 12 hours ago

I built a proof of concept doing exactly that a few years ago[0]. The codebase is unmaintained and the demo website has been down for a wee while too, but it's basically this idea. The only issue is that the overhead of establishing WebRTC connections is heavy, so it's not exactly lightweight...

0: https://github.com/pldubouilh/live-torrent

alganet a day ago

BitTorrent is already a streamable P2P protocol. You just need a client that can prioritize downloading the file parts in order.

It is a thing.

  • wmf a day ago

    It doesn't work for live streaming without modifications though.

    • alganet a day ago

      Fair enough.

      For live streaming there is WebRTC. It is also a thing.

      • bawolff 14 hours ago

        But its not really p2p in the sense the original poster meant (as in its not an overlay network). Its p2p in the sense that tcp/ip is p2p, not in the sense that bit torrent is.

  • syndeo a day ago

    I believe Popcorn Time worked this way, but I may be wrong. Never dug too deeply into it.

    • alganet a day ago

      It did.

      All of it started with the webtorrent project though. One of the first demos was booting Ubuntu while streaming the incomplete live ISO image, quite impressive for the time.

      This is great tech for media files, currently better than any other. But it would make those media files very easy to redistribute, and it is hard to change that without losing the P2P goodies.

      If Popcorn Time had a synchronized multi-resolution catalog, bandwidth-sensitive auto switch and some paid seed servers, it would be better than any other streaming service (technically speaking).
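
The "bandwidth-sensitive auto switch" part is standard adaptive-bitrate logic. A minimal sketch (a generic ABR picker, not anything Popcorn Time actually shipped; the ladder and headroom factor are made up):

```python
def pick_rendition(ladder_kbps, measured_kbps, headroom=0.8):
    """Choose the highest rendition whose bitrate fits within a safety
    fraction of the measured throughput; fall back to the lowest
    rendition when even that doesn't fit."""
    budget = measured_kbps * headroom
    viable = [r for r in sorted(ladder_kbps) if r <= budget]
    return viable[-1] if viable else min(ladder_kbps)

ladder = [1_500, 3_000, 6_000, 12_000]  # hypothetical 480p..1080p60 bitrates
pick_rendition(ladder, 8_000)  # 6000: the budget is 6400 kbps
pick_rendition(ladder, 1_000)  # 1500: below the ladder, take the floor
```

In a P2P setting the tricky part is that throughput is measured against peers rather than a CDN edge, so `measured_kbps` is noisy and the switch decision is far less reliable.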

karel-3d 15 hours ago

You need good latency for streaming, torrents can get to a decent speed but the latency will always be bad.

Modern streaming protocols sometimes go to absurd lengths to avoid too many hops so you get the data as soon as possible... torrent has so many jumps and negotiations to get to the actual file. It's good for decentralization but decentralization and efficiency go against each other.

dcow 17 hours ago

A lot of comments mention it has been a thing in various forms through the internet’s brief history. The interesting question is why didn’t it take off—especially when the technology was there.

One possibility as you allude to is licensing. In a P2P streaming model “rights” holders want to collect royalties on content distribution. I’m not sure of a way you could make this feel legal short of abolishing copyright, but if you could build a way to fairly collect royalties, I wonder if you’d make inroads with enforcers. But overall that problem seems to have been solved with ads and subscription fees.

Another data point is that the behemoths decided to serve content digitally. Netflix and Spotify showed up. The reason the general population torrented music is because other than a CD changer, having a digital library was a requirement in order to listen to big playlists of songs on your… Zune. Or iPod. That problem doesn't exist anymore and so the demand dried up. There was also an audiophile scene but afaik with Apple Lossless the demand there has diminished too.

And finally, since people were solving the problem for real, we also entertained big deal solutions to reduce the strain on the network. If you stream P2P your packets take the slow lane. Netflix and other content providers build out hardware colocated with last mile ISPs so that content distribution can happen even more efficiently than in a P2P model.

In short: streaming turned into a real "industry". Innovators and capitalists threw lots of time and money at the problem. Streaming platforms emerged, for better and for worse. And here we are today, on the cusp of repeating the past because short-sighted business mongers have balkanized access with exclusive content libraries for the user numbers.

m-s-y 10 hours ago

> I was thinking most people nowaday have at least 30mbps upload

Even “modern” cities like NYC are limited to a MAXIMUM of 30Mbps upstream due to ISP monopolies and red tape.

It’s getting better, but Spectrum is still literally the only ISP available for many city residents, and their offerings are so lopsided that their highest-end package is a whopping 980/30.

That’s right. If you saturate the majority of that 980Mbps downstream, the acknowledgement/IP overhead will gladly eat that 30Mbps upstream, leaving you with just about zero headroom for anything else.

benlivengood 13 hours ago

It's hard to beat CDNs for streaming because the number of hops is low. Basically any p2p technology would have to mirror the way existing livestreams work; a local well-connected peer sucks down the livestream from farther away and rebroadcasts it to local peers. Anything else introduces latency or wastes WAN bandwidth. Peers are also rarely situated where they have moderate downstream and exceptional (local) upstream.

IPv6 multicast is probably the way forward for livestreams but I haven't really been keeping up on recent developments. In theory there could be dynamic registrations of multicast addresses that ISPs could opt-in to subscribe to and route for their customers.

  • nicman23 13 hours ago

    hops and latency in general do not really matter in a streaming context after the first few seconds, which are a bottleneck due to connecting to peers that may or may not exist

  • Calwestjobs 13 hours ago

    ( not a criticism of you/your post )

    it is insane to me that people feel the need to watch toxic channels like the LinusTechTips livestream, regurgitating weeks-old toxic marketing disinformation, and need that 0ms latency... XD

    why does everyone need low latency for a one-way stream? it's an unnecessary hurdle just to have a hurdle. no benefit to anything.

    but I agree with you that if companies would just forget IPv4 exists, the internet would be simpler, faster, and more usable, for a lower price for everyone.

jannw 12 hours ago

Lots of technical discussion, but the real answer is that bittorrent/P2P was displaced by Netflix for all but a small number of hard-core users. That, combined with legal threats and the fact that p2p required volume/scale to work well, meant the critical mass died. It was a sad day when we, the users of the internet, en masse exchanged BitTorrent for streaming companies.

  • ValdikSS an hour ago

    This, and:

    * Asymmetric network links, slow upload especially on cellular

    * Traffic package limitations, and both DL and UL are counted

    * Some ISP are very against p2p, sometimes it's a government policy (China banned "Residential CDNs")

    * NAT

silcoon 11 hours ago

Have you checked out webtorrents? You can download movies from the P2P network sequentially and so watch them while they download.

LargoLasskhyfv a day ago
  • memet_rush a day ago

    looks interesting! surprised something like that never caught on. I'm looking for something like Twitch basically. It has really good quality and is live. But obviously Twitch is just losing money and using all of Amazon's resources, so I wanted to see if there's a more sustainable p2p approach

    • toast0 a day ago

      For massive video distribution, getting acquired by a company with "infinite bandwidth" is the sustainable approach.

      Orchestrating p2p realtime video distribution is going to have a lot of problems, and spending VC money until someone acquires you is just a lot easier.

      Here's a small list of challenges you'd face:

      You'll need to have a pretty good distribution network to handle users who just can't manage to p2p connect.

      Figuring out the right amount of a user's bandwidth you can use without people getting upset; there are a lot of internet accounts with bandwidth quotas, especially for mobile

      Trying to arrange so that users connect to users with the least transmission delays would be needed to reduce overall latency. Between cross oceanic connections having unavoidable latency, the potential of buffer bloat, and having a reasonable jitter buffer, pretty soon you have wild delays and potential rebuffering.

      Bandwidth constraints / layer switching is going to be a big challenge; it's one thing when your server can just push the best stream the client can manage, but if you're streaming from a peer and the stream is too big, the peer probably doesn't have a smaller stream to switch to and there's no good way to know if where the bandwidth constraint is ... maybe you should switch to the same stream from someone else or maybe you should switch to a smaller stream. Can you get even packets from one peer and odd packets from another ... should you?

    • LargoLasskhyfv a day ago

      What do you mean by never caught on? It's 'live' at https://joinpeertube.org/ where you can either go to https://joinpeertube.org/browse-content and put something into that search form, or limit that search to specific 'instances' under https://joinpeertube.org/instances

      Or to get back to your original question: https://docs.joinpeertube.org/use/create-upload-video

      edit: You're not limited to these addresses; for one there are other instances, for another you can selfhost your own, if you're into that.

      Technically that is one of many possible solutions, 'ready to roll' right now.

      addit: Regarding sustainability, and who is behind it, maybe https://framasoft.org/en/ would be of interest?

      Linked from there https://framablog.org/2024/12/17/peertube-v7-offer-a-complet...

      and

      https://framablog.org/2025/04/10/2025-peertube-roadmap/

      • memet_rush a day ago

        Thanks! i will just check it out.

        I just meant like never caught on as in like it's not super popular, but looks like it's on the come up. would be nice to have a real youtube competitor lol

        • LargoLasskhyfv a day ago

          Yes. It's the typical 'hen & egg' problem. I'm watching there from time to time, and even found some things (independent trance/ambient/goa music) which didn't exist on YT at all! Though the selection is limited compared to YT or whatever, it's less algorithmic, and because of this you're not forced into the most exaggerated grimacing or clickbaity titles, IF you have no commercial interests and don't give a shit about ads.

          If that's your thing. And you have some sort of presence online elsewhere, then you can link to peertube, no matter which, or selfhosted, without problem.

          That's why I pointed you to it. If you need/want the most massive audience, because of platform familiarity/network effect, then probably not. At least not now. But someone has to start somehow :)

xbmcuser 16 hours ago

I think there's Acestream, though I don't think it could do thousands of users. It was my go-to for watching live sports back in the day; I don't watch sports anymore, so I haven't kept up with it.

kevinmhickey 10 hours ago

The degree of difficulty of building this is the premise of the 2001 movie Antitrust, from back in the old days when Microsoft was evil. Notable as one of the first movies to use Linux desktops and reasonably correct shell script and code in all of its screenshots.

johanvts a day ago

Octoshape developed this tech; I believe it was sold to some American TV networks.

Calwestjobs 13 hours ago

You can install the qBittorrent client, load a torrent file/magnet link, then right-click on that torrent and click "Download in sequential order".

The torrent PROTOCOL does not require downloading pieces in random order.

ONLY BitTorrent, Inc., the COMPANY that releases the APPLICATIONS NAMED "uTorrent" and "BitTorrent", does not want legal trouble from media/music companies, because STREAMING is a different legal category than downloading. There is no other reason for the torrent PROTOCOL not to deliver file pieces in sequential order.

If you need an instant, nanosecond-delayed stream, those do not exist anywhere; even radio and TV stations broadcasting over the air are delayed so they all transmit synchronized. So "zero latency" and "synchronized" can be mistaken for each other.
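The difference between the two piece-selection strategies being debated here can be sketched in a few lines (a toy model of the piece picker, not real BitTorrent wire code):

```python
import random

def rarest_first(wanted, availability):
    """Pick the piece the fewest peers have (classic BitTorrent), ties broken randomly."""
    rarest = min(availability[p] for p in wanted)
    return random.choice([p for p in wanted if availability[p] == rarest])

def sequential(wanted, availability):
    """Pick the lowest-index piece still needed (streaming-friendly)."""
    return min(wanted)

wanted = {0, 1, 2, 3}
availability = {0: 50, 1: 3, 2: 50, 3: 50}  # piece 1 is rare in the swarm
print(rarest_first(wanted, availability))   # 1 (keeps the rare piece alive)
print(sequential(wanted, availability))     # 0 (playable order)
```

Sequential order is what makes watch-while-downloading work, at the cost of rarest-first's property of spreading scarce pieces through the swarm.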

  • ValdikSS an hour ago

    By live streaming, the OP means broadcasting, as in a live event.

  • _flux 11 hours ago

    I believe there actually is a reason why you want to transfer random blocks instead of starting from the beginning: it is a waste of resources when a node has to upload the same block multiple times to the network, when it could be uploading different blocks.

    > if you need instant nanosecond delayed stream

    I believe nobody was suggesting that.

    • Calwestjobs 5 hours ago

      Which chunk is transmitted when is not important technically (programmatically it is doable; I don't know the proper term for it).

      It is ONLY important when you need people (the SWARM) to not finish downloading the torrent, close the torrent client app, and stop sending data chunks to the next person.

      BUT everyone is saying it is a stream and has to show the picture/video instantly.

      So I do not understand why all those people in other comments care about the state of the swarm; everybody says there are big numbers of people watching, but they still care about the swarm.

      (The swarm thing is important in the normal BitTorrent use case, irrelevant for streaming.)

      I understand what they are saying; they do not understand that they are saying nonsense.

    • extraduder_ire 8 hours ago

      The usual download strategy is to request the rarest pieces first, in random order. Most modern clients will prefer earlier pieces when the swarm/availability is above a certain size.

      "super seeding" is a different feature where a seed won't upload more pieces to a peer unless a previously uploaded piece has been distributed to another peer first.

teddyh 10 hours ago

This has been invented many times, but every time someone invents an actually effective method of person-to-person file transfer, it gets used for piracy and blocked and shunned.

GTP 12 hours ago

A related thing: wasn't Popcorn Time using BitTorrent to stream movies? Did it have any notable differences from BitTorrent, or any notable issues/drawbacks? I never used it, but it sounded like a big thing at the time.

giorgioz 15 hours ago

There are streaming platforms built on top of BitTorrent, like Stremio with its torrent plugins.

greenavocado a day ago

The only way this will be possible is if there is widespread adoption of an Internet overlay network similar to Tailscale in its design. Fortunately or unfortunately, depending on how you look at it, Tailscale is limited to Layer 3, so multicast doesn't work (it depends on IGMP to function correctly).

  • globular-toast 16 hours ago

    Why Tailscale? Are you aware of IPv6?

    • jeroenhd 14 hours ago

      AFAIK IPv6 multicast across the internet is pretty much dead. ISPs seem to block it because of its DDoS potential. They use it themselves of course (very useful for streaming live TV across their private VLANs) but as an outsider you'll have to convince every ISP and backbone provider to trust your multicast stream, which they probably won't.

      Tailscale (or any other P2P overlay network) could solve this problem by re-enabling the multicast support that most ISPs block. It's not a terrible idea.

      Edit: a comment elsewhere linked https://www.librecast.net/librecast.html which seems to be doing exactly this.

    • greenavocado 9 hours ago

      Transit and peering agreements between ISPs typically exclude multicast, meaning packets are dropped at network boundaries.

  • Tepix 16 hours ago

    Why do you think so?

gunalx 16 hours ago

You could utilize WebTorrent, with a master seed, to in theory get really high scaling of content download, since you'd gain peers at the same rate as leechers. Add a coordination server on top to sync timings and you should be there.
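The OP's log(N) intuition holds for this kind of fan-out: if every peer re-uploads to f others, the number of hops from the origin to the farthest viewer grows as ceil(log_f(N)). A quick sanity check (toy calculation, ignoring churn and leechers):

```python
def tree_depth(viewers: int, fanout: int) -> int:
    """Levels needed for one origin to reach `viewers` peers when each
    peer re-uploads the stream to `fanout` others (ideal tree, no churn)."""
    depth, reached = 0, 1
    while reached < viewers:
        reached *= fanout
        depth += 1
    return depth

print(tree_depth(100_000, 10))  # 5 hops for the OP's 100k-viewer target
print(tree_depth(100_000, 4))   # 9 hops when peers have less upload budget
```

Each hop adds buffering and re-encode/forwarding latency, so deeper trees (lower per-peer upload) translate directly into more lag, which is one reason the coordination layer matters.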

dp-hackernews a day ago

Isn't that what multicast is for?

  • wmf a day ago

    Multicast doesn't work on the Internet.

    • rapnie 16 hours ago

      You might have a look at Librecast [0], an R&D project funded by the Horizon Europe NGI0 programme via NLnet, aiming to bring multicast to the current unicast internet and smooth the transition for projects that adopt it. A great intro to multicast and Librecast is given in Brett Sheffield's 2020 LinuxConfAU talk "Privacy and Decentralization with Multicast", which is available on PeerTube [1].

      > To enable multicast on the unicast Internet we start by building an encrypted overlay network using point-to-point links between participating nodes. Once established, our overlay network can run whatever protocols we require, unimpeded by routers and middleboxes and which is resistant to interception, interference and netblocks.

      [0] https://www.librecast.net/librecast-strategy-2025.html

      [1] https://spectra.video/w/9cBGzMceGAjVfw4eFV78D2

      • jeroenhd 14 hours ago

        The protocol seems like an excellent idea, but #3365a3 on black for the website text is one of the worst designs for open-source project websites I've seen yet.

        Off-topic but I'm impressed with how many potentially revolutionary projects get funding from NLNet.

  • protocolture 17 hours ago

    I had a teacher in uni who was fairly convinced that some kind of intelligent multicast was the solution here.

    But after working at an ISP for a while I realised that getting ISPs to use cool protocols is just impossible, and everything must be built at higher levels.

  • memet_rush a day ago

    I guess, but I'm thinking of something like multicast with people sharing as in BitTorrent, just live. So you'd need to factor in people leaving and people leeching.

    • dp-hackernews a day ago

      So a multicast-like derivative that is peer-aware and can locally redistribute any available parts - which would require some sort of caching, which would probably break copyright etc... So perhaps that's the reason why nothing exists. \o/

oldgregg a day ago

Build it. Use Go. Maybe nknorg/nnet for P2P. Signed HLS segments. Have Go also serve the web front-end with a WASM web worker. Public nodes can run on a very lightweight VPS/server with an autocert domain. Viewers' browsers join the swarm with WASM -- this way people can just type in a web address, so it's very user friendly, but the domain doesn't actually have to serve any data. I would just use a trusted pubkey to sign P2P updates so nodes can block naughty IP addresses. That should get you a very friendly user experience, easy node deployment, pretty low latency, and a BitTorrent level of legal resilience.
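The "signed HLS segments" idea might look like this minimal sketch. For illustration it uses a stdlib HMAC as a stand-in for the Ed25519 public-key signatures you'd actually want (peers can't verify an HMAC without the shared secret, so this only shows the shape, not a deployable design):

```python
import hmac
import hashlib

# Stand-in shared secret. A real design would use an Ed25519 keypair:
# broadcaster signs with the private key, peers verify with the published pubkey.
KEY = b"broadcaster-secret"

def sign_segment(seq: int, data: bytes) -> bytes:
    # Bind the sequence number into the tag so a malicious peer
    # can't replay an old segment as a newer one.
    return hmac.new(KEY, seq.to_bytes(8, "big") + data, hashlib.sha256).digest()

def verify_segment(seq: int, data: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign_segment(seq, data), sig)

seg = b"\x00\x01fake-ts-payload"  # placeholder for an MPEG-TS/fMP4 segment
tag = sign_segment(42, seg)
print(verify_segment(42, seg, tag))                # True
print(verify_segment(42, seg + b"tampered", tag))  # False
```

A peer that receives a segment failing verification can refuse to forward it and penalize the sender, which is what makes untrusted relays workable at all.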

greenavocado a day ago

I would not be surprised if the rise of CG-NAT put another nail in the proverbial coffin of P2P video streaming and related sharing.

immibis a day ago

What does PeerTube do?

i5heu 17 hours ago

PeerTube does this AFAIK (or they plan to).

guerrilla 12 hours ago

Is that not what PeerTube is?

behringer 6 hours ago

You can stream over BitTorrent... I do it all the time: you start your download, set it to download in order, and then just start watching.

Nadya 17 hours ago

There are at least two projects like this for watching anime. I won't name them in this forum but they do exist if you look for them.

ramesh31 7 hours ago

There are plenty. The problem is that it's pointless and far less effective. Centralized servers work great. The only viable reason for P2P we have found over the last 20 years seems to be illicit activity. Everyone else is just fine with regular servers.

dboreham 7 hours ago

For the same reason that BitTorrent doesn't work: providing a service costs money, and when there's no way to collect money then you get no service.

globular-toast 16 hours ago

It's easier to control people with a centralised architecture.

slashink 18 hours ago

Latency.

  • jeroenhd 14 hours ago

    Twitch streamers seem to be fine with the 10-60 second latency Twitch adds, depending on how bad their network is performing. Requirements will differ per industry but I don't think latency is a killer necessarily.

    • slashink 9 hours ago

      Twitch does not add 10-60 seconds of latency. The average latency with default OBS settings is 3-6 seconds.

      Source: I worked on the Twitch video system for 6 years.

Am4TIfIsER0ppos a day ago

> Also i think it wouldnt have to be live, people would definitely not mind some amount of lag.

I work on low latency and live broadcast. The appropriate latency of any video stream is the entire duration of it. Nobody else seems to share this opinion though.

paulcole 10 hours ago

(Outside of niche circles like HN) Nobody uses computers anymore. Nobody is going to seed from their phone.

Plus instead of a million people all wanting to watch Spider-Man 2, those million people have infinite options of short videos or whatever to watch. The desire to watch A Specific Video isn’t what it used to be.

Times have changed and P2P as a common way of sharing stuff is dead to the average person.

cess11 15 hours ago

As I understand it, P2P requires information sharing, so as your distribution network grows this eventually becomes a performance bottleneck. You'll also need rather sophisticated mitigations against bad actors in the distribution network, like nodes that forward bad packets instead of distributing the good packets they received and got requests for.

You might want to look into the tradeoffs Discord decided to go with, https://discord.com/blog/how-discord-handles-two-and-half-mi....

Here's some boilerplate for rolling your own, https://blog.swmansion.com/building-a-globally-distributed-w....

In theory you could gain resilience from a P2P architecture but you're going to have to sacrifice some degree of live-ness, i.e. have rendering clients hold relatively large buffers, to handle jitters, network problems, hostile nodes and so on.
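One standard mitigation for the "nodes that forward bad packets" problem: the broadcaster publishes (and signs) a manifest of chunk hashes, and every receiver hash-checks each chunk before rendering or forwarding it, banning peers that repeatedly fail. A sketch under those assumptions (class and parameter names are illustrative):

```python
import hashlib
from collections import Counter

def build_manifest(chunks):
    """Broadcaster side: SHA-256 of every chunk, published ahead of the data."""
    return [hashlib.sha256(c).hexdigest() for c in chunks]

class Receiver:
    def __init__(self, manifest, ban_after=3):
        self.manifest = manifest
        self.strikes = Counter()
        self.ban_after = ban_after
        self.banned = set()

    def accept(self, peer, index, chunk):
        """Return True iff the chunk matches the manifest; strike/ban the peer otherwise."""
        if peer in self.banned:
            return False
        if hashlib.sha256(chunk).hexdigest() == self.manifest[index]:
            return True
        self.strikes[peer] += 1
        if self.strikes[peer] >= self.ban_after:
            self.banned.add(peer)
        return False

chunks = [b"chunk-0", b"chunk-1"]
r = Receiver(build_manifest(chunks), ban_after=2)
print(r.accept("p1", 0, b"chunk-0"))  # True
print(r.accept("p2", 1, b"garbage"))  # False (strike 1)
print(r.accept("p2", 1, b"garbage"))  # False (strike 2 -> banned)
```

The catch for live streaming is that the manifest itself must arrive continuously and be authenticated, which is where the buffering trade-off against live-ness comes back in.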

defdefred 17 hours ago

Peertube?

  • notpushkin 17 hours ago

    I don’t think it has live streaming?

Szpadel a day ago

I'd say the easiest way to build something like that would be some adaptation of the m3u format: instead of URLs to video, it could have URLs to torrents/magnets.

One issue I can imagine is that each part would discover peers independently, even though you'd expect most peers of the previous parts to also have the next ones.

A second idea would be to use IPFS instead of torrents. That would probably have a much easier time reusing peer discovery between parts, and it would also solve the question of when to stop seeding, since that's already built into the protocol.

I guess creating a distributed Twitch based on IPFS would be feasible, but I'm not sure how many people would want to install an IPFS node before they could use it. That's kind of a chicken-and-egg problem: you need a lot of people before this system starts to work really well, but to get interest it needs to perform really well so people would migrate from Twitch-like services.

Of course you can use public gateways; AFAIK Cloudflare has a public IPFS endpoint that could serve as a fallback.
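Such a playlist might look like this, a hypothetical M3U8-style file where each segment entry is a magnet link instead of an HTTP URL (the infohashes are placeholders, and this is not a real extension of the HLS spec):

```text
#EXTM3U
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:1042
#EXTINF:6.0,
magnet:?xt=urn:btih:<infohash-of-segment-1042>
#EXTINF:6.0,
magnet:?xt=urn:btih:<infohash-of-segment-1043>
```

The broadcaster would keep appending entries as segments are produced, and players would poll the playlist just as they do with ordinary live HLS.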

  • immibis a day ago

    I would think that the easiest way would be to not use torrents, because torrents have fixed top-level hashes. Instead, create a new protocol like BitTorrent but streaming.