Personal Server
The Personal Server was conceived in 2017 as an alternative to the current paradigm of using centralized walled gardens to host the content we create and own, simply to avoid the complication of running internet infrastructure.
Back in the 1990's, with the help of a very talented network engineer, I helped build and maintain a small network for my employer. We ran our own internal email, calendar and scheduling server, and our own webserver, over a 128K ISDN line, which was state of the art technology at the time and roughly the equivalent of a 2G cellular connection. Later in my career I designed and serviced both computer networks and telephone systems, and as a result of many interactions with network and telecom engineers over the years, began to understand how our communications infrastructure works. AT&T initially built the Long Lines system to be resistant to nuclear blasts and able to route around points of failure, and the fiber optic network that later replaced it was likewise designed to have no single point of failure. If a significant portion of the country were to go offline, for whatever reason, communications would continue.
If you understand how our infrastructure was initially designed, the trend of outsourcing everything to a handful of "Cloud" providers looks very short-sighted. "The Cloud" is simply "someone else's computer", and that computer is typically located a very long way from where you are, with a whole lot of potential points of failure in between. I avoided that trend long enough to see people who were fully invested in the IaaS model wake up and realize that there are many advantages to handling critical communications infrastructure the old-fashioned way.
Why should sending a message to your neighbor, or looking at real estate in your community, require communicating with a mail or web server halfway across the country? How many communities today have local emergency management agencies hosting their own website and communications locally? How many local internet service providers can operate without depending on "Cloud" providers (whether a subscription-based "monitoring" or "security" service of some kind, or DNS, or mail...) thousands of miles away? How prepared are we for regional interruptions in our communications?
In my opinion, a combination of C-suite people lacking a basic understanding of how things actually work and slick marketing campaigns advocating the outsourcing of critical services to "experts", rather than investing in their own equipment and training their own people, has created a potentially disastrous situation. We have become dependent on a very fragile and tangled web of complexity, and don't even realize that many of our communities simply cannot operate autonomously in the event of outages in other parts of the country, much less closer to home. That is NOT good.
This project is a result of understanding the above. It started while I was living in Canada and deploying a physical server with a provider in Vancouver. It became apparent that while paying to host your own server hardware close to your physical location can be significantly faster and less expensive than using IaaS and SaaS hosting providers on the other side of the country, it is not trivial to set up and maintain a server. Someone who doesn't have at least a basic understanding of networks and server administration simply cannot run their own infrastructure. So, people use various combinations of "Cloud" providers for their website, email and other critical services, which inevitably results in a less than optimal user experience.
However, the tools do exist to create a user-friendly system that would allow people to host their own data, in a location that they choose. The "Personal Server" concept is a combination of a low-cost hardware product with an intuitive GUI and a decentralized IaaS hosting model. Essentially, a user houses a small hardware device at their home or office, connected to their local computer or network, storing all of their publicly shared data (photos, articles, published papers...) and websites (blog, social feed...). This device would either be purchased, or alternatively the software could be installed on a user-provided computer.
The local device is then synchronized with a public-facing server, operated either by the user or by a hosting provider. The public-facing server would store copies of the user's data and make it available to the public internet. This decentralized structure represents the way the internet was intended to operate. It would also provide a source of income to hosting partners, who could manage the public-facing infrastructure for a fee; to users, who would have the option of monetizing their own content in the form of memberships or donations; and to developers, who could build plugins extending the built-in functionality, which users could optionally purchase.
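The relationship between the local device and its public-facing mirrors can be sketched in a few lines of Python. This is only an illustrative model of the data flow, not an implementation; the class names, the provider names, and the one-way `sync_from` operation are all assumptions made for the sake of the example:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalServer:
    """The device in the user's home or office: the authoritative copy."""
    content: dict = field(default_factory=dict)

    def publish(self, path: str, data: bytes) -> None:
        self.content[path] = data

@dataclass
class PublicMirror:
    """A public-facing server, run by the user or a hosting provider."""
    provider: str
    content: dict = field(default_factory=dict)

    def sync_from(self, source: PersonalServer) -> None:
        # One-way sync: the mirror only ever holds copies, so the
        # local device always remains the source of truth.
        self.content = dict(source.content)

# The same local data can be mirrored with providers in several regions.
home = PersonalServer()
home.publish("blog/first-post.html", b"<h1>Hello</h1>")

mirrors = [PublicMirror("provider-us"), PublicMirror("provider-eu")]
for mirror in mirrors:
    mirror.sync_from(home)

# Switching providers is just pointing a new mirror at the same device;
# no data migration out of a proprietary silo is ever required.
```

Because the mirror never holds anything the local device doesn't, dropping one provider and adding another is a resync rather than a migration, which is what makes the portability described below possible.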
Users could move between providers without losing data or having to migrate data between proprietary silos, as their data is primarily stored on their local device. And, users would be able to host content with multiple providers in different countries (site.com, site.eu, site.tw) if they chose. As such, compliance with local laws would happen at the provider level in each region, as opposed to centrally with one giant mega-corp. In addition, existing ISPs and hosting providers could license and deploy the server-side software and offer "Personal Server" hosting as one of their offerings, as they already have the infrastructure and expertise required.
And most importantly, we, as content creators, retain control and ownership of our intellectual property. By not depending on "free" platforms to host our data, we can choose who, if anyone, is allowed to monetize our data, rather than implicitly accepting terms we may not agree with. Mega-platforms would turn into aggregators of content that is available in many places, rather than centralized monopolies.
The Personal Server concept as presented here is a low-fidelity model of what is possible, and while there are still some details to clarify and resolve, it or something like it is very much needed. However, it's important to point out that this is not a new idea; in fact, much of the inspiration came from Ted Nelson's Project Xanadu. All of the required technology exists thanks to the folks who have spent decades developing FreeBSD and Erlang, and the additional software functionality required could be implemented with relatively minimal development. There is a better way...we simply need to use the tools available, and build it ourselves.