The Year is 2050. Here’s a Brief History of Photography.

This post is a theorization by David Kai-Piper. It’s just for fun!

We need some disclaimers. Firstly, nothing here is in any way actual, well, not yet anyway. Secondly, it was written for a bit of fun, so take it as a lighthearted look at the inevitable stagnation of the camera world. Lastly, none of this is meant to make sense, and any coincidental similarity with real-world stuff is, erm, totally a coincidence.

We all know that, after 2017, camera development slowed massively on the hardware front. The top camera companies had understood the importance of building in technical obsolescence, rather than technical accomplishment, to attract customers. After all, it was cheaper and more effective to create new colors than new features.

After moving from the CCD to the CMOS sensor, most companies just repackaged old tech in retro bodies and called it innovation. Some companies, like DJI and Apple, correctly worked out that it was the software that mattered. It seems quite astonishing that, even in 2020, you had to use memory cards, as almost no cameras had proper WiFi or built-in storage. Back in 2021, there was not a single camera that could take, process, and post an image to Facebook or Instagram. The market showed that people wanted cameras with one-click operation and as many Snapchat filters as possible, but the camera companies didn't listen.

Things like built-in beauty editors became the key selling points, not megapixels. In 2022, a report by the White House said that 85.54% of photos were 'selfies'. When the 'selfie drone' craze hit the world in 2025, the streets became full of flying personal drones. As people switched from mobile phones to personal drones, simply walking down the street became a challenge. Oxford Street in London had a drone-on-drone crash every 60 seconds in 2026. This is what led to the 2027 drone laws, or SkyNet, as we know it.

Camera Developments

Even up until 2021, camera companies had mostly been making digital versions of old film-based cameras. They kept the same concepts rather than looking at new ways to capture images. This is why the cameras made today still have ISO dials. A holdover from film cameras, the ISO dial is limiting at best and irrelevant at worst. A few companies did attempt a seamless 'gain adjustment dial,' but that didn't catch on, as people kept trying to work out what the ASA equivalent would be for any given exposure.

Sensor tech got a significant boost around 2025, when Apple announced the 'Automated Photochromic Sync-Closure,' or APS-C, sensor. These sensors have individual photosites that use an AI protocol to close down any photosite that is close to overexposing, meaning you can never get an overexposed part of an image. The other advantage is that you can have different exposure times across the same sensor. This also did away with shutter speed and sync speed problems, and with the need for things like ND filters or graduated filters. Not only is the shutter global, it is individually controlled at the pixel level. Why companies still choose to fit ISO dials set to 100, 400, and so on remains a mystery.

The Sigmafilm corporation announced the 'Vierazoom' lens in 2049. This was one of the biggest 'game changers,' made possible as nanotechnology moved forward after 2034's unleashing of the organic nanotube. Sigmafilm took advantage of the APS-C sensor (in an alliance with Apple) and could put lenses ranging from 7mm to 300mm in front of each photosite. This meant that not only could every megapixel on the sensor be controlled individually, but every focal length was captured too. Thus ended the era of shooting with one single lens. The depth of field and crop were simply selected afterward by the viewer of the image, not the photographer, which was another landmark moment. Until then, photographers set the composition and framing, as they had to pre-select which lens they used. Interestingly, this is where the Netflix terms 'Pre-V' and 'Post-V' come from. Pre-V films are the ones you have to watch on older 'flat' screens, where you cannot pick which angle or person to follow. Everything is pre-set, you cannot engage with the characters either; it's just a single-screen, non-interactive story that you have to watch on a linear timeline.

Around 2023, people started attaching 'speedlights' to drones like the DJI Mavic 3. It was a fast and easy way of getting flash units to where they needed to be. This proved too limiting, though, as lifting the lighting modifiers needed was impossible. We had to wait until DJI unleashed the 'mk1' Orbie in 2040. It took around 5 of these 'cluster drones' to lift a ProFoto D12 light, but the updated Lightnet cluster (released in 2041) solved this problem. This was the update that let you link up as many Orbies as you needed, controlled from one central point to replicate any type of light source required (the newest Orbie carries a 5000-lumen, 150 CRI RGB and CMY hybrid LED). This is the technique still used today to light everything from Speedball stadiums to everyday street lighting.

Used correctly, this lighting technique can give an unlimited range of lighting effects. If you want a large light source, the Orbies simply link up and move apart, forming a net-like structure; if you want a hard light source, a few hand gestures, or the app, will move them closer together. It is the way they can auto-balance the color of the light that is truly amazing to see. Using the app, you can preset a color scene that the Orbies automatically create. These days, Hollywood can literally turn night into day by putting enough Orbies in the sky and replicating daylight, but with any color cast they want.

The humble tripod is something that has stood the test of time. A few companies tried to engineer a drone-based shooting platform (think Mavic 6 with an Arca-Swiss mount on top), but people liked carrying the old ones about, mostly because it made them look like a 'proper photographer.' One company, 3 Legged Thing (who are, randomly, still based out of a chicken shed in the UK and developed the 'lighter than air' tripod with helium-filled legs), even built a tripod that wirelessly recharges your camera when mounted. It was a result of the new 'Air-Maglock' system they had been working on, which uses magnets, rather than a physical locking system, to hold the camera in a levitated state. The idea was that the camera would always auto-level; the development of the round sensor put an end to that idea. But the electromagnetic field generated was found to be charging the cameras too. A happy accident from a chicken shed.

Commercial Photography

It’s not a thing anymore, mostly because the AI and CGI worlds have intertwined with 3D printing and manufacturing (Facebook and Amazon’s new partnership). The system is so good at understanding what you need that all adverts are custom-designed for the user, showcasing products custom-built to attract them. The age-old idea of creating one set of adverts to appeal to the masses was simply not the right approach. Once Facebook and Amazon merged databases, it became easier to target the adverts, design the products, and print them to order.

All the adverts you see are perfectly tailored to the user. It’s a pretty great system. They did it by having AI systems look at every image ever made and cross-reference them with the individual user’s personality and political awareness; the system then generates the correct image for any given task. In the rare event that new data is needed, the DJI-Apple alliance’s new drone heads out to scan and map the area using the now-standard Reliefography technique that Fujifilm pioneered in 2013 while documenting the Van Gogh paintings. This is quite rare, though, since Google Maps switched to Reliefography cameras and has mapped 99% of the public space on Earth. (Amazon and Google use the network of Home Hubs to map the insides of buildings.)

Digital Register

The selfie used to be seen as a narcissistic way to make yourself feel good. It was used to show other people how amazing your life was, even if it wasn’t. These days, as we now know, it’s a core pillar of how life works in any major town or city. In fact, a new Gov.com report said that 99.5% of all images taken by humans are now of themselves.

The selfie is so vital to daily life that the ‘Digital Register’ was created by merging Facebook and Twitter. Its main goal was to ensure that every possible image of you is available to you at any time. You can simply log into any of the 23.9 billion CCTV cameras and either live-stream your day or share images to your social media. You can even log into someone else’s Orbie or selfie stream for a small fee per day, though this is mostly for fans.

Since the 2049 ‘International Register Law’ passed, every human is on the ‘Digital Register,’ meaning you, and everyone else, are tracked. It makes it so much easier to tag and rate your interactions with anyone you meet on your daily travels. Life is just that much easier with everything, and everyone, linked. If you want a selfie while walking down a road, you can just pull up the camera feed that is tracking you and make it ‘live’ to your fans, or store the video for later use. This system also lets you see who is around you, your common friends, and their social rank. Spot a hot guy or girl? You can just track them home or send them a DM, as you have all their info via the Register.

It’s this social rank that lets you know which coffee bars you can use and which gyms will accept you. Some shops even give discounts to people with higher social ranks. It’s amazing to think that all this is now available due to the humble camera.