Talk for Article "Will Alternet replace present day internet?"

Talk about this Article

  1. [ This comment is from a user you have muted ] (show)

    Miguel, can you tell me if anyone from Wikitribune checked that the patent had been submitted before publishing this article? I am not doubting you but I’m asking because I am checking up on the way Wikitribune is verifying articles before publishing. Thanks.

    1. [ This comment is from a user you have muted ] (show)

      You mean the paid staff, because I’m part of Wikitribune myself. They can’t, because a patent application remains confidential until its publication, which happens about 18 months after the filing date. We’re talking about mid-2019.

      1. [ This comment is from a user you have muted ] (show)

        I meant: has anyone checked it? Did anyone check anything before the story was first published? You say that you are part of Wikitribune. But what does that mean, can you explain? Surely anyone who has a user name and password is part of Wikitribune. Also, anyone can contribute a story. Are you saying you are recognized by Wikitribune as a staff member, whether paid or not? When I reference your name it just gives me all the articles you have written; it does not give me your status within Wikitribune – or am I missing something? You have obviously contributed several articles, I can see that. I’m just trying to figure out whether Wikitribune really does have processes in place to check stories.

        1. [ This comment is from a user you have muted ] (show)

          John, to be clear, Miguel is a member of the community, as are you. We do have a group of paid staff journalists and editors who maintain the site and, so far, provide the core of the content.

          Over time we naturally want those proportions to reverse so that the community is the greatest contributor. In this case there are things to learn about how the story was handled, chiefly that it is complicated to have a story written by someone involved in what they are writing about. In general we try to avoid that, and this case reinforces that belief.

          However, the sort of questions being raised, and the level of detail at which they are being asked, is exactly how the community should operate: as a fact-checking group and interrogators who help a story develop or be modified over time. It is an important characteristic of the site overall and we are learning as we go: hence “pilot”. In the future it would be better if some of the issues raised by this piece were resolved in DRAFT before it moves to the PUBLISHED state.

          It is also worth saying that what we are finding is that our articles have a longer lifetime than those on most sites and can be updated, improved, expanded and, if necessary, corrected long after they first appear. The TALK about this story is as important as the story itself.

          1. [ This comment is from a user you have muted ] (show)

            Hi Peter, my response to this comment has been moved to “Feedback on everything” following a request from Miguel. Unfortunately it would appear that he did not ask you to also move your comment, so my post is just hanging in its new location without any context. I would really appreciate it if you would give me some response to what I wrote, as I think it is a very important point. Many thanks.

  2. [ This comment is from a user you have muted ] (show)

    Definitely, a shared backbone that avoids all this middleman hegemony from for-profit industries would no doubt be supported by the many people abused by, and not served by, the current internet backbone overseers. You can bet there will be opposition from these groups. I’m all in on distributed shared mesh networks, and I’m ready to plug in any time the physical nodes are ready and viable. Still, this will end up in the courts, as the FCC will suddenly find its lost dentures to protect the Industrial ISP Complex (IIC), considering the government-for-sale/hire situation that currently exists in US politics.

    1. [ This comment is from a user you have muted ] (show)

      I’m an academic, not a subversive militant. I’m pro-government, pro-law and pro-order; I just criticize specific bad acts of governance, as is the duty of any citizen. I think it would be foolish to attempt to launch a project as ambitious as Alternet behind governments’ backs; it is among my plans to talk to legislators all around the world about the necessity of preparing for this technological change (a network like Alternet will replace the Internet at some point, be it mine or someone else’s).

      Some people believe institutions like the FCC are evil simply because they do what they are supposed to do: protect the existing order. A pension is an acquired right of an ordinary citizen just as a band allocation permit is an acquired right of a corporation. If it’s an act of injustice to deprive a citizen arbitrarily of a pension, it has to be an injustice to cancel a band allocation permit without proper justification and compensation.

      The issue is much deeper: institutions, like natural persons, also decay and disappear over time, and none of them is prepared for that natural process. They just try to perpetuate themselves even when huge changes happen all around them. Only when the whole situation becomes so absurd it can’t be put off any further do people decide to do something about it. Most politicians, more concerned about the present than the future, rely on improvisation rather than carefully weighed plans.

      Political power is just a reflection of effectiveness. Both governments and corporations are organized institutions; that is, their members surrender some of their individual freedom in favor of a common cause. This explains why governments and corporations are both proactive and effective when it comes to pursuing their interests. Ordinary citizens, on the other hand, are totally disorganized and only act reactively. No wonder they stand at the bottom of the political food chain.

  3. [ This comment is from a user you have muted ] (show)

    This is so very non-technical that it’s impossible for a person deeply steeped in networking technology to even figure out what is being proposed. This sounds like a puff piece that requires a link to a white paper where the details are discussed. I completely agree with the goals of this kind of network, but the engineering for such a thing needs to be done out in the open, where people who have already made most of the basic technology and security mistakes can review, advise and contribute. You are touting something that doesn’t exist – I can’t find out more about it. This reduces your credibility, sad to say.

    1. [ This comment is from a user you have muted ] (show)

      I’m not very concerned about my reputation… I can’t lose what I don’t have! But seriously, if this is your way of volunteering for the project, I can gladly take you in. A patent (which the article clearly says was presented just a few days ago) covers only the generic concepts of how something operates; from there it will be necessary to write a detailed specification, build prototypes, do field testing, etc. This will obviously require the expertise of many “persons deeply steeped in networking technology”. As someone suggested, a dedicated website is in order, and that’s exactly what I plan to do. Just be patient.

      1. [ This comment is from a user you have muted ] (show)

        Yes, I think it is, actually, me volunteering, because your philosophical vision is close to mine and you’ve thought about some of the required details. I hadn’t been intending to build such a thing myself, but I definitely feel motivated to help bring this into being. My not diving straight into getting started is a personality thing, combined with a subconscious expectation that most people will not be interested in my vision/requirements. I was once personality-typed with “realize concept” as my characteristic goal.

  4. [ This comment is from a user you have muted ] (show)

    As far as I’m aware, the sites that are funded by asking for donations are ones like Wikipedia, where the hosting costs are small compared to the other costs. So I don’t see this getting rid of such begging bowls.

    What could happen is getting rid of the ISPs or, more realistically, reducing ISP use to people who want bandwidth-hungry applications such as films, while those who just do email and browse text sites might be able to share via neighbours.

    You then hit a problem if the Internet continues to have major players of the order of Wikipedia. All the Alternet users who are in WiFi range of Wikipedia’s servers will find themselves very busy with traffic from more distant users wanting to transit via them to Wikipedia. Sooner or later all those users will drop from Alternet or find some option to limit their openness to it, so that it only uses ten or a hundred times as much of their IT resources as others give to them. The mesh might work in a situation where Internet usage was much more evenly distributed, but that seems unlikely. In the long run Moore’s law might be your saviour: if technology takes us to a point where the toys in breakfast cereal packets have the processing power to support multiple simultaneous film downloads then this might work, but by that stage people would be offering to sponsor the Internet…

    Your desire for the whole electromagnetic spectrum will upset a lot of people, including users of advertising-funded, subscription and ISP-free television, the emergency services, and users of mobile phones.

    There is some very dodgy stuff on the Internet: child porn, ISIS propaganda and malware, amongst other illegal material. Alternet will need mechanisms against this at least as effective as those the Internet not only has now but will have implemented by the time Alternet is ready. This could be a problem, not least because the mesh is designed to bypass censorship of any kind and assumes that all donated nodes are OK with being a conduit for anything, including material that would be considered grossly illegal if found on their machines.

    1. [ This comment is from a user you have muted ] (show)

      Judging by your answer, I wonder if the article is clear enough. Alternet is a unified but decentralized network; there will be no such thing as a “Wikipedia server”. Under Alternet, Wikipedia is implemented as a service. Both the code and the database are stored in the network itself, distributed among its nodes. Accessing Wikipedia means asking the network to order some random nodes to execute the code (held in a standardized public registry) and return you the results. That’s why I called them symmetric nodes: because they are undifferentiated (server and client at the same time). Also, Alternet resources don’t stay in the same place for long. As soon as you receive a resource from the network, your node becomes a new source for it. The next time someone requests the same resource, they might get it from your node instead of the one you got it from. Alternet will be much faster than the Internet, not because it represents a quantum leap in spectral efficiency, but simply because it works like a cache memory that keeps resources close to the nodes using them.
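      A minimal sketch in Python of that caching behaviour, assuming a hypothetical Node class and fetch call; the names are mine for illustration and are not taken from the patent:

      ```python
      # Illustrative sketch only: models "every node that receives a resource
      # becomes a new source for it". Names are hypothetical, not from the patent.
      import random


      class Node:
          def __init__(self, name):
              self.name = name
              self.store = {}  # resources this node currently holds and can serve

          def fetch(self, resource_id, network):
              if resource_id in self.store:
                  return self.store[resource_id]
              # No central server: ask any node that already holds the resource.
              sources = [n for n in network if resource_id in n.store]
              data = random.choice(sources).store[resource_id]
              # Having received it, this node becomes a new source for it.
              self.store[resource_id] = data
              return data


      network = [Node(f"n{i}") for i in range(5)]
      network[0].store["wikipedia/Trump"] = "article text"
      network[3].fetch("wikipedia/Trump", network)  # served by n0; n3 is now a source
      network[4].fetch("wikipedia/Trump", network)  # may come from n0 or n3
      ```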

      1. [ This comment is from a user you have muted ] (show)

        Even if Wikipedia also moved to a decentralised system there would still be bottlenecks. At the moment, if someone edits a Wikipedia article there is a master copy of that article on Wikipedia’s servers. There may be local caching to reduce bottlenecks on the Internet, but there is still one database through which the 5 million edits a month that maintain the English Wikipedia run. The juncture between that database and the Internet is a very busy place. If Alternet were to replace the Internet it would need a way to handle such areas of peak activity; meshing lots of home PC kit would not give you the capacity to handle such peaks.

        1. [ This comment is from a user you have muted ] (show)

          I think I understand what you’re saying. It’s the same old issue with threading in coding. As long as threads don’t depend on each other, they can be parallelized and executed on different cores simultaneously. But when they depend on each other, their execution has to be suspended until they receive what they are expecting from another thread, and in the end it’s like one big serial execution on a single core. If you have a slow core, then everything becomes slow. I agree that it’s impossible to avoid such bottlenecks, but they happen on the Internet as much as on Alternet. The advantage of Alternet over the Internet is that it’s decentralized from the start.

          Please read your comment again and you’ll notice you’re still carrying a prejudice from the Internet: that the Wikipedia service will have its own database. Alternet does not understand the concept of a server; the records of all services (meaning, sites) are stored in the same database. Yahoo (say) is going to store its records side by side with Wikipedia’s. Records are linked to each other not by sitting in a database of their own (everything is part of the same database, because there is only one), but by being flagged as belonging to the same service (think of file extensions: EXE is an executable, PDF is a document, etc.; just swap them for service names).

          In practice, this means that every article in Wikipedia is a single resource, separate from the rest. Trump’s may be stored in your node, Putin’s in mine, and so on. This way, 5 million edit requests do not all go to the same node but to many, many nodes. A hundred attempts to edit Trump’s article at the same time are definitely going to put stress on the node temporarily hosting it, but I don’t see how that can be avoided at all.
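          As a rough illustration of that single shared database: each record carries a service flag, and a resource lands on a node by hashing its identifier. The record layout and the hashing rule here are assumptions of mine, not Alternet’s actual format:

          ```python
          # Illustrative sketch: one logical database whose records are flagged
          # with a service name and spread across nodes by hashing the resource
          # identifier. Layout and hashing choice are assumptions, not Alternet's.
          import hashlib

          NODES = ["node-A", "node-B", "node-C", "node-D"]


          def home_node(resource_id):
              """Pick the node that temporarily hosts a resource."""
              digest = hashlib.sha256(resource_id.encode()).hexdigest()
              return NODES[int(digest, 16) % len(NODES)]


          records = [
              {"service": "wikipedia", "id": "wikipedia/Trump", "body": "..."},
              {"service": "wikipedia", "id": "wikipedia/Putin", "body": "..."},
              {"service": "yahoo", "id": "yahoo/frontpage", "body": "..."},
          ]

          for r in records:
              print(r["id"], "->", home_node(r["id"]))
          # Different articles of the same service land on different nodes,
          # so edit traffic is spread instead of hitting a single server.
          ```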

          1. [ This comment is from a user you have muted ] (show)

            The advantage of the Internet over Alternet is that sites with heavy traffic can scale up their capacity to handle it. You mentioned Trump as an example; I’ll give you Palin. One evening in 2008 she was named as John McCain’s running mate, and the Wikipedia article on her went ballistic, with the whole world wanting to know who she was. Editing on the article peaked at 25 edits per minute, and we have no way of knowing how many more edits were lost to edit conflicts. That figure is impressive, and it was only possible because each section of a Wikipedia article is separately editable. Wikipedia wasn’t fully able to handle every attempt to edit that article, but it was able to handle the spike in readership.

            That event, like the death of Michael Jackson, was an isolated extreme. But every week there are subjects in the public spotlight that get millions of views and flurries of edits, and it isn’t always obvious in advance which ones they will be. If a decentralised system had time to adapt, the readership could be accommodated by caching multiple copies around the world. But that either results in a lot of cache updating, or you slow the cache updating and sometimes show people out-of-date versions (see the sketch below).

            If you rely on a mesh of donated resources, then when spikes occur it will be hard to avoid having them sometimes hit the slower nodes.
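            The trade-off between update traffic and staleness can be seen with a simple time-to-live cache; this is a generic sketch, not a description of how Wikipedia or Alternet actually cache:

            ```python
            # Generic sketch of the cache-freshness trade-off, not a description
            # of Wikipedia's or Alternet's actual caching.
            import time


            class TTLCache:
                def __init__(self, ttl_seconds):
                    self.ttl = ttl_seconds
                    self.entries = {}  # key -> (value, time it was fetched)

                def get(self, key, fetch_master):
                    value, fetched_at = self.entries.get(key, (None, 0.0))
                    if value is None or time.time() - fetched_at > self.ttl:
                        value = fetch_master(key)  # costly trip to the master copy
                        self.entries[key] = (value, time.time())
                    return value  # may be up to `ttl` seconds out of date


            # A long TTL cuts traffic to the master copy during a readership spike;
            # a short TTL keeps readers current but multiplies the update fetches.
            cache = TTLCache(ttl_seconds=60)
            print(cache.get("Sarah Palin", lambda k: f"latest revision of {k}"))
            ```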

            1. [ This comment is from a user you have muted ] (show)

              When I was younger I was a game programmer. Needless to say, I never completed any game, but I did learn the secrets of the trade. I’m talking about the days of the first Pentium, when the Quake games were hugely popular. Back then there was no hardware acceleration for graphics and everything had to be done by the main CPU; there was only one core and its speed was rated in MHz, not GHz. It was thus imperative to pre-process the map’s geometry so that everything not visible from a particular position was excluded from rendering. To do it, I learned a technique called the octree, which organizes its elements into recursive nodes.

              Section editing is a variant of the same method: divide something big into smaller parts so they can be worked on separately. A Wikipedia article is currently structured as a sequence of sections, but it could be paragraphs or even individual sentences, and the granularity could be chosen dynamically in real time. Low demand? Sections. High demand? Sentences. This way, even if several people are working on the same article at the same time, their modifications aren’t likely to collide with each other.

              This is not a problem related to the properties of the network, as you say, but to the way web browsers are implemented. As long as they keep using simple text boxes and forms to communicate with web sites, it will be necessary to send large chunks of raw text just to fix a typo. A more sophisticated textbox element could give the text structure as a linked list. That structure could be loaded from the server or built automatically as you type (every time you add a period the current element is split, and adjacent elements are joined when a period is deleted), and only the changes made to a specific section would be transmitted back, rather than the whole text.

              It’s impossible to synchronize a distributed system fast enough to match a centralized one, but users would receive a page that is 99.99% identical (only the typo would differ). I call that good enough. If you absolutely require full synchronization, you can always wait.
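              A rough sketch in Python of sending only the changed chunk instead of the whole article; the sentence-splitting rule and the change format are illustrative choices of mine, not an existing browser or Wikipedia mechanism:

              ```python
              # Rough sketch: transmit only the chunk that changed, not the whole
              # text. Splitting rule and change format are illustrative only.
              import re


              def split_into_chunks(text):
                  """Split on sentence-ending periods; could just as well be sections."""
                  return [s.strip() for s in re.split(r"(?<=\.)\s+", text) if s.strip()]


              def changed_chunks(old_text, new_text):
                  """Return only the chunks that differ, with their positions."""
                  old, new = split_into_chunks(old_text), split_into_chunks(new_text)
                  return [
                      {"index": i, "text": n}
                      for i, (o, n) in enumerate(zip(old, new))
                      if o != n
                  ]


              before = "The sky is blue. Water is wet. Typoo here."
              after = "The sky is blue. Water is wet. Typo here."
              print(changed_chunks(before, after))
              # [{'index': 2, 'text': 'Typo here.'}]  -- only the fixed sentence is sent
              ```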

              1. [ This comment is from a user you have muted ] (show)

                What you’re suggesting sounds a lot like IPFS (https://ipfs.io/) but deployed over a mesh network rather than a TCP/IP-based network. You should check it out; IPFS has ways of dealing with the bottleneck and update problems you are discussing for implementing services.

                1. [ This comment is from a user you have muted ] (show)

                  Thank you for the heads-up. Yes, what Juan (Batiz, Mexican also?) Benet is doing looks very similar to what Alternet attempts to accomplish. I especially liked the public/private, dynamic/static resource identification; I hadn’t thought about it. It’s simply brilliant.

                  After careful examination, I begin to see that it’s both more developed (they seem to have a functional network already) and far more restricted in scope. If I understand it right, IPFS does not allow remote code execution. If that is true, it is at the stage web sites were at 20 years ago: folders containing static files, simply pulled from the network itself. I call that a good start (it covers a good part of Alternet’s storage component), but a complete Alternet network needs physical communication and remote code execution.

  5. [ This comment is from a user you have muted ] (show)

    Hello Miguel!

    Fascinating article, I’m excited just reading it!

    Where do you think the Alternet will be received best? As a competitor to hosting services or maybe as nodes in the Tor network?

    This might sound crude, but hear me out: the pornography industry has pioneered industry standards, from 3G to camcorders, 1080p HD to internet billing systems. Porn took these nascent ideas into the mainstream, and I wonder: how could that industry make use of Alternet?

  6. [ This comment is from a user you have muted ] (show)

    Might it be a good idea to link to an Alternet website or another source of information in the article somewhere? A quick google brings up a news website.

    There’s also no resource for ‘1A symmetric node’, which could really be helpful.

    Are there any other sources for more concrete technical information about the ideas behind the Alternet? Coming from the point of view of a software developer, some parts felt overly vague or hard to understand.

    I’m definitely intrigued to find out further details.

    1. [ This comment is from a user you have muted ] (show)

      I can understand it looks vague at first sight, but I invite you to give it a closer look. Since you’re a developer, I think it’s easy for me to explain that the Internet does not provide services on its own. Every device gets an IP and that’s pretty much it. Even proper assignment of IPs requires DHCP, and DHCP then tells you which DNS server to use. So when you type “google” in the browser, it isn’t “the Internet” but your ISP that tells you what “google” is. This means the Internet mostly operates in good faith: on the assumption that ISPs share information such as DNS addresses and don’t mess with it in the process. We all know that this isn’t true, especially in places like China.

      Alternet is a *single entity*; there are no “versions” of it. When you type “google” in Alternet, it means exactly the same thing to every node in the network; should a rebel node dare to change it, it will be spotted by the other nodes and ostracized immediately. Like the blockchain, Alternet solves the problem of securing a distributed network, but it does so in a different way. It’s a global network that will host, by itself, every service on Earth. Needless to say, that kind of network doesn’t exist today.

      This is a news story; it isn’t meant to describe in detail how Alternet manages to do that. That belongs to a separate web site, as you suggested. Launching it is in the queue, but please pay attention to the word “independent”. I have no budget, no assistants, nothing. I have to do everything myself and pay for any expenses out of my own pocket. It’s not surprising I progress at a crawl (I devised the first version of Alternet about 15 years ago; it was meant to track stolen cars and save insurance companies a lot of money).
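      A toy illustration in Python of the difference: instead of trusting a single resolver (as with ISP-provided DNS), ask several nodes and flag any node that disagrees with the majority. The majority-vote rule is just one possible way to spot a rebel node, chosen here for illustration; it is not taken from the patent:

      ```python
      # Toy illustration: resolve a name by asking several nodes and rejecting
      # any node that disagrees with the majority. The voting rule is an
      # illustrative assumption, not Alternet's actual mechanism.
      from collections import Counter


      def resolve(name, nodes):
          answers = {node_id: table.get(name) for node_id, table in nodes.items()}
          majority, _ = Counter(answers.values()).most_common(1)[0]
          rebels = [node_id for node_id, a in answers.items() if a != majority]
          return majority, rebels


      honest = {"google": "resource-abc123"}
      nodes = {
          "n1": honest,
          "n2": honest,
          "n3": {"google": "resource-evil999"},  # a tampering node
      }
      print(resolve("google", nodes))
      # ('resource-abc123', ['n3'])  -- n3 would be spotted and ostracized
      ```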

  7. [ This comment is from a user you have muted ] (show)

    “Traditionally, if you are to deploy an Internet service of any kind, you have to pay big bucks to the ISP or the hosting company just to keep it running (a huge obstacle we’re facing here at WikiTribune”.

    This is not actually true regarding Wikitribune. Hosting costs are pretty inexpensive these days.

    1. [ This comment is from a user you have muted ] (show)

      Oh yes, thank you for calling me on that, Jimmy. Now that I think about it, Wikitribune is a bad example (it’s mostly text-based). A better one would be attempting to launch an alternative to YouTube.

      1. [ This comment is from a user you have muted ] (show)

        Hi Miguel, you gave a detailed reply to my query, which I appreciated. However, as you will see, my query and the thread are now missing, apparently due to some technical problem with the WikiTribune site. Not sure if even this will be posted. Anyway, just to say I got your reply, and thank you.

  8. [ This comment is from a user you have muted ] (show)

    Hello Miguel, thanks for a fascinating article. From personal experience, I can confirm that defending a user-centric approach is always difficult in the Internet world, where ISPs and name vendors are in a position of unassailable strength. Beyond the theoretical arguments you put forward convincingly for an Alternet, could you please provide links to off-the-shelf solutions, if they exist, e.g. for actual equipment and nodes? Thanks.

  9. [ This comment is from a user you have muted ] (show)

    Miguel, I have published your story after another read and called it an “Essay” which is a category we are using for single point of view pieces from people with specific knowledge on a given subject. Thank you.

  10. [ This comment is from a user you have muted ] (show)

    Hello Charles. Yes, it’s me, and I’ve worked on the article to reflect that, as you suggested. I was just brainstorming. I think I’m going to use a word processor next time and just paste the article in when it’s ready. Anyway, I’m done now. Perhaps a bit of final cleanup and polishing by someone else is in order, as I’m not a native English speaker.

    1. [ This comment is from a user you have muted ] (show)

      Hey Miguel – please don’t use a word processor as WordPress does not like the formatting. You can save it as a private draft though and work at it as much as you’d like before submitting it for review. It’s just a hassle if you start somewhere else.

  11. [ This comment is from a user you have muted ] (show)

    Hello Miguel, thank you for this. It’s interesting, but I found that you were actually the person who put forward the patent proposal, and this “philosopher” is actually you. Correct? If you want to submit this as a piece for WikiTribune we have to be very transparent about who the writer is and what their vested interests are. I have made some changes, but if you are happy to rewrite, making your interests in Alternet clear, then we may be able to push ahead and publish. Regards, Charles
