These are some ideas I’ve had. If they’re stupid, or great, or based on false premises or inaccurate information, please drop me a note at firstname.lastname@example.org.
Please either build them so I can use them, donate the money to me to build them and open source them under permissive licenses, or tell me of projects or circumstances that fulfill/dissolve the need for which the ideas were created.
Any business ideas published here are free for anyone to use, build, and improve upon, if you believe that they are viable.
This page last modified 13 November 2020.
I miss IRC a lot. I think realtime chat, ideally without eternal and google-indexed logging, with total strangers, has a place on the internet and in our society. Some of my best and oldest friends came from IRC. IRC is in decline because in the IRC paradigm, your session state is tied to your TCP connection, which has been well documented to be a nonstarter in our mobile-first world.
Not everyone wants to (or knows how to) run a ZNC bouncer somewhere, or wants to pay for a centralized service like IRCCloud (I’m a subscriber).
An IRC server should decouple the client’s state from the TCP connection and maintain a queue of messages for each client that can be delivered over several different transports: old-style IRC, HTTP (presumably with gRPC or, my favorite, JSON-RPC), WebSockets, or whatever the fad of the day is. (I vote HTTP.) Such a system might be able to give Slack or Discord a run for its money. With a standard protocol, there could be several competing mobile apps that all speak the new, server-holds-state-for-you IRC protocol.
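As a sketch of the core idea (all names here are invented for illustration, not any IRCv3 proposal): the server holds a per-session message queue, and any transport, whether a raw IRC socket, an HTTP poll, or a WebSocket, just attaches long enough to drain it.

```go
package main

import "sync"

// Message is a minimal chat message; a real protocol would carry
// timestamps, tags, and so on.
type Message struct {
	From, Target, Text string
}

// Session holds client state server-side, independent of any live
// connection. A client may attach and detach over any transport at will.
type Session struct {
	mu      sync.Mutex
	pending []Message
}

// Enqueue appends a message for later delivery, whether or not the
// client is currently connected.
func (s *Session) Enqueue(m Message) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.pending = append(s.pending, m)
}

// Drain returns and clears queued messages, e.g. when a client
// reconnects over TCP or polls over HTTP.
func (s *Session) Drain() []Message {
	s.mu.Lock()
	defer s.mu.Unlock()
	out := s.pending
	s.pending = nil
	return out
}
```

The point is only that the queue, not the socket, is the unit of client state; everything else (auth, channel membership, backlog limits) layers on top.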
I was going to write a proof-of-concept in Go, but today I learned that the IRCv3 Working Group is working on such things already, including the other idea I had, which was structured metadata applied to each chat message (which enables the use of cryptographic signatures, among other things). I can’t wait!
Browsers are a lot more powerful now, as evidenced by projects like React. I recently obtained a large-for-me 3840x2160 (4K) HDR display, and, not being much of a TV watcher, I decided it would be nice to use it to display some sort of status board (or pretty pictures) when not being used specifically to display media I am actively looking at. One can only look at TweetDeck for so long.
As Paul Graham says, “You make what you measure.” I want to track 5-10 personal metrics (examples being things like “percentage of monthly income being saved/invested”, “kilograms over goal weight”, “time spent reading trailing 7 days”, “time spent lifting weights trailing 7 days”, et c).
Geckoboard is too expensive (~$150/mo at time of writing) for personal use, and Dashing is somewhat out of date, being a mostly-server-rendered Rails app that I don’t want to deal with the effort of figuring out how to host (plus, it’s unmaintained (there’s a fork called Smashing)). OpenMCT is overkill for a simple personal-use thing, and seems complicated to configure.
An aside: Earthlapse, Naturescapes, and Cityscapes are lovely Apple TV apps from Jetson Creative that are entirely worth their few-bucks-each purchase prices. I run these on my display often, for hours at a time.
I envision a client-side React app, where the configuration for the dashboard is passed in as a URL argument. The app downloads the given URL (likely pointing to a Gist or something hosted on Netlify) and interprets the YAML/TOML to instantiate and arrange a set of widgets.
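A config file might look something like this (a hypothetical schema; the field names are invented for illustration, and the widget types are the ones mentioned below):

```yaml
# Hypothetical dashboard config, fetched and parsed by the client app.
title: Personal metrics
widgets:
  - type: singlefigure
    title: Kg over goal weight
    source: https://example.org/metrics/weight.json
  - type: sparkline
    title: Reading time, trailing 7 days
    source: https://example.org/metrics/reading.json
  - type: clock
    timezone: America/Los_Angeles
```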
Each widget would have a type and a data source URL specified. The URL simply returns JSON with a current value and optional advisory historic data. In the absence of historic data, the client keeps its own log/cache for the current session, optionally saved to localStorage (or just persisted for the run of the app). This keeps data sources very simple: no “dashboard server” is really required, and public data sources can be used.
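A data source really can be tiny. Here is a sketch in Go of what one endpoint might look like; the JSON field names (`value`, `history`) are my own invention, not a published schema, and the sample numbers are made up.

```go
package main

import (
	"encoding/json"
	"net/http"
)

// Metric is the hypothetical JSON shape a widget's data source returns:
// a current value plus optional advisory history.
type Metric struct {
	Value   float64   `json:"value"`
	History []float64 `json:"history,omitempty"`
}

// metricJSON marshals a sample reading; a real service would compute
// the value from wherever the underlying data actually lives.
func metricJSON() []byte {
	b, _ := json.Marshal(Metric{Value: 72.5, History: []float64{70, 71, 72.5}})
	return b
}

// metricHandler serves the JSON with a permissive CORS header so the
// statically-hosted dashboard app can fetch it cross-origin.
func metricHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	w.Header().Set("Access-Control-Allow-Origin", "*")
	w.Write(metricJSON())
}
```

Wiring it up is one `http.HandleFunc("/metrics/weight", metricHandler)` plus `http.ListenAndServe`, so free-tier hosting is plenty.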
A single hosted copy (e.g. on GitHub Pages, or Netlify) could serve the whole world, with individual users specifying their own dashboard config URL as an argument (fetched and parsed by the client app). Long cache timeouts for the app assets would minimize bandwidth usage.
Then, adding custom widgets only requires standing up a tiny web service somewhere with free/minimal hosting and adding the data source URL to your config file. You could even specify the widget type in the config file by a link to a GitHub repository, instead of a string name, to fetch new widget components and instantiate them.
Ideally it would look similar to Dashing or Geckoboard, both of which are pretty and simple. An MVP could have simply “singlefigure”, “sparkline”, “gauge”, “clock” widgets to get going. I’m sure people would want more soon, and there are lots of good JS projects for clientside rendering of various types of visual data now.
Separately: is there some easy way to disable CORS/same-origin restrictions (an “I know what I’m doing” switch) on a system, to allow such a client app (say it were running in a locally-downloaded extension instead of loaded from the web) to reuse your cookies for e.g. Google Analytics or Google Cloud, pulling down live metrics directly for rendering in the client? As far as I understand it, unless the services already support JSONP, this isn’t likely to work, which may be a nonstarter for entirely-clientside widgets.
Maybe I’ll just give up and run a copy of Smashing. I’d love to see a clientside, static app that can do this in a modular “serverless” way, though.
A web service that gives out a unique ID and never-expiring cookie to each visitor. The home page displays a 6-8 digit Base58 “display identifier” if that session is not claimed/configured.
Users who log in from a desktop or mobile computer with a keyboard can “claim” an identifier by typing it in, then specify a URL or set of URLs to be served to that visitor on every page load.
This way, someone with a keyboardless display system can set the full-screen browser’s homepage to “tv.example.com”, and when it boots it will display the “display identifier”. Log in on desktop, specify what URL it should actually go to, and then on future loads it either gets served a redirect, or a full-viewport iframe with the specified URL (or a carousel on a configurable timer to display a list of URLs in iframes). The ability to push a reload from the configuration page to any of the displays (triggering a re-load/re-render of the app it’s running) would also be cool.
Want to change it? You don’t need to touch the display; just log in to the app and change the target URL. If it was a redirect, just power-cycle the display; if an iframe, a timer on the main page polls the server and forces a reload whenever the URL it’s supposed to display isn’t the one currently loaded.
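The server side of that claim/resolve dance is a very small amount of state. A sketch in Go, with invented names:

```go
package main

import "sync"

// registry maps display identifiers to their configured target URLs.
type registry struct {
	mu      sync.Mutex
	targets map[string]string
}

func newRegistry() *registry {
	return &registry{targets: map[string]string{}}
}

// Claim binds a display identifier to a target URL. This is done from
// a keyboard-equipped device after logging in.
func (r *registry) Claim(id, url string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.targets[id] = url
}

// Resolve is what the keyboardless display effectively asks on every
// page load: it returns the configured URL, or "" if the identifier is
// unclaimed (in which case the page shows the identifier on screen for
// the user to type in elsewhere).
func (r *registry) Resolve(id string) string {
	r.mu.Lock()
	defer r.mu.Unlock()
	return r.targets[id]
}
```

A real version would persist this map and tie Claim to an authenticated account, but the core is just that lookup.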
This is basically a clone of what https://gbrd.tv (itself a redirect to https://app.geckoboard.com/tv) does, but abstracted up to be used with any set of pages/dashboards/image URLs you wish to display on a keyboardless system. You’d want to use this to control the dashboard URLs that your TV visits, so you never have to hook up a keyboard to the Chromebox running Kiosk again.
Update, 10 Mar 2020: I built it: tvid
Much like Flightradar24 and FlightAware are using a ton of crowdsourced Raspberry Pi data collectors to relay local 1090MHz ADS-B transmissions from planes to a central database and web interface, it would be really cool if there were a network of IP-networked motorized gimbal telescopes opportunistically uploading videos like it seems ExoAnalytic Solutions is doing privately with 300 telescopes. (Check out the video in that link, it’s super cool.)
I recently got a DJI Osmo Pocket, which was $349 and has a three-axis motorized gimbal and shoots 4K video. Add WiFi, a compass, GPS (to determine where to point for a given satellite), and some cheap telescope optics (most satellites aren’t that far away), and I bet you could build data collector stations like this for $1000-1500 (falling further over time).
Imagine a crowdsourced data service that shows you a realtime, 24/7 video feed of a given satellite or set of satellites in LEO. That’d be pretty neat, though I am not sure it would be very practically useful (any more so than decoding and snooping POCSAG is - but the cool factor remains).
A lot of people use their mobile phone’s native camera app as a sort of record-keeping device. It would be nice to be able to file photos into albums designated as “file” or “data” or “reference” albums, causing those photos to be hidden from the default stream view, whose interface is used and optimized for more standard photographs (e.g. people, scenery, et c). Seeing all my boarding passes, receipts, and such (perhaps I should have scanned them into Evernote Scannable in the first place instead of reaching for the system camera from my lock screen) in my main chronological photos view is not desirable, and it’s a lot of extra work to move them and then delete them.
A fair number of people run single-instance docker servers. I’d love to see a single container that you could run that takes :80 and :443 on the host, provides a simple web interface for management at admin.<domain>, lets you log in with G Suite auth (or some other provider), lets you define secret groups, VM groups, and VM hostnames/virtual hosts, and has a textarea into which you can paste a docker-compose file to bring up new things.
Functionality it would need to include to be useful: Ofelia-style cron jobs, some basic webhook receiver functionality (e.g. replicating Heroku’s “build and deploy on GitHub push”), et c. A really simple web GUI that speaks to your local docker socket and manages a one-host docker install.
It could probably be turned into a closed-source service business by letting users log in, copy a per-user SSH pubkey generated by the service, and install it into their /root/.ssh/authorized_keys file, letting the service talk to their docker daemon via SSH.
Once you know the list of which containers are supposed to be running on a machine, setting a timer on the service to do a basic remote health check would be relatively straightforward as well.
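That health check could be sketched in Go like so (the timeout and the notion of checking a public HTTP endpoint per container are my assumptions; a real version would read the expected container list from the service’s own records):

```go
package main

import (
	"net/http"
	"time"
)

// healthy classifies an HTTP status code: 2xx and 3xx count as up.
func healthy(status int) bool {
	return status >= 200 && status < 400
}

// check performs one basic remote health check against a service's
// public URL, returning false on connection failure or a bad status.
func check(url string) bool {
	client := &http.Client{Timeout: 5 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false
	}
	resp.Body.Close()
	return healthy(resp.StatusCode)
}
```

Run `check` on a timer for each expected container’s virtual host, and alert (or just surface red/green in the GUI) when one stops answering.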
An important feature would be load-balancing, and it could probably use Traefik for this internally, writing out an appropriate config. (Traefik solves a lot of these issues itself, but at the expense of needing read/write to a running Docker socket, effectively giving your webserver root on the host.)
If it were open-source, you could run it on a Raspberry Pi, keeping your secrets and everything in one place, parcelling them out to your docker host(s) only as required for the containers/services those specific hosts are running.
I’d love to see a self-hosted OAuth solution that supports modern authentication methods like U2F/WebAuthn. Right now it’s great that Google provides excellent mandatory-2FA accounts and auth for free (via Advanced Protection), but it’s still dependent upon a third party. It would be nice to self-host an OAuth service, and then your other self-hosted applications could defer to it for authentication (and free 2FA).
I’m thinking something small teams could use via Heroku for <$20 per month, or self-deploy on a self-managed Docker host as above, backed by a single SQLite file.
It’s possible that Authelia is fairly close to what I am describing.
Does self-hosted GitLab support login via a self-hosted OAuth endpoint?
How involved would it be to make a bridge from whatever type of authentication service required by UniFi access point management software to a self-hosted auth database such as this?
This would serve as a small subset of what Auth0 offers, but easily self-hosted (ideally without even requiring an external database).