I’ve discovered a lot of research via Two Minute Papers on YT. Entertaining and easy to understand.
Personally, I much prefer “software research” from engineers working in the industry. I’m sceptical of software research being done at universities.
I feel much of the knowledge and experience in the industry is simply lost because it isn't widely documented and studied. There need to be detailed histories of major software development projects from industry, in book form, for people to learn from, in the same way as there are histories of military campaigns and railway projects.
It's not widely done, and we end up with mere "Software Archeology", where we have artefacts like code, but the entire human process of why and how it reached that form is left unknown.
People (and companies) need to be incentivised somehow to write and share.
This is actually one of the things I struggle with the most. Even small-scale apps are mysterious to me; I have no idea how to build, deploy, and maintain an app correctly.
For context, I work for a startup as a solo dev and I manage 5 projects ranging from mobile to fullstack apps.
I am completely overwhelmed because there basically is no clear answer to any of this. Some go all in on cloud or managed infra, but I am not rich, so I'd much prefer cheap deployment/hosting. The next issue I don't know how to fix is scaling the app. I find myself getting stuck a ton while building as well, because there are WAY too many attack vectors in web development that I don't know how to properly protect against.
It doesn't help not having any kind of mentor either. Thus far the apps I've built run fine, considering I only have like 30-40 users max. But my boss wants to scale, and I think it'll doom the projects.
I wish there were actual standards making the web safe without requiring third-party infra for auth, for example. It should be way easier for people to build web apps in a secure way.
It got to the point where I don't want to build web stuff anymore, because rather than building features I have to think about attack vectors, scaling issues, dependency issues, and legacy code. It makes me regret joining the industry, tbh.
You can load test! Make another copy of the app in a copy of your prod environment and then hit it until it breaks! (There's a minimal sketch below.)
Also look at the 12 factor app [0] and the DORA capabilities [1].
[0] https://12factor.net/
[1] https://dora.dev/capabilities/
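To make that concrete, here's a minimal load-test sketch in Python using only the standard library. The URL, request count, and concurrency are placeholders you'd adjust for your own setup; point it at a staging copy, never at prod:

```python
# Minimal load-test sketch: hammers a staging copy of the app and
# reports error rate and latency percentiles. STAGING_URL is a
# hypothetical placeholder -- point it at your own staging deploy.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

STAGING_URL = "https://staging.example.com/health"  # placeholder
TOTAL_REQUESTS = 1000
CONCURRENCY = 50

def hit(_):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(STAGING_URL, timeout=10) as resp:
            ok = resp.status == 200
    except Exception:
        ok = False
    return ok, time.monotonic() - start

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(hit, range(TOTAL_REQUESTS)))

latencies = sorted(duration for _, duration in results)
errors = sum(1 for ok, _ in results if not ok)
print(f"errors: {errors}/{TOTAL_REQUESTS}")
print(f"p50: {latencies[len(latencies) // 2] * 1000:.0f} ms")
print(f"p95: {latencies[int(len(latencies) * 0.95)] * 1000:.0f} ms")
```

Ratchet CONCURRENCY up between runs until errors or latency spike; that's roughly your ceiling. Purpose-built tools like ab, wrk, k6, or Locust do this better, but the idea is the same.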
Scientists come from all walks of life, you know? Some might've even been at FAANG at some point :)
There is an enormous gulf between research in general and the people who should be reading it from a professional point of view. Science communication is really broken, and what makes the trade press, or the press generally, is largely about whether a paper's authors manage to write a good press release and effectively write the article themselves.
We need more New Scientist-type magazines that do decent round-ups of scientific findings for various fields and do a good job of sifting through the thousands of papers a month to find the highest-impact ones. The pipeline from research to use in the professions can be drastically improved. At the moment you end up with a hosepipe of abstracts, and it takes a lot of time to review that daily.
Do you have any good recommendations about recent software development research?
To this day I still share this 15-year-old article from Microsoft Research with my coworkers:
https://www.microsoft.com/en-us/research/blog/exploding-soft...
Thanks, I read the paper testing the Mythical Man-Month theory.
The conclusions seem to be: fewer people is better; only one "organisation" or group should contribute to the code; ownership should sit at the lowest level possible, not with a high-up manager; and knowledge retention is important, so manage departures effectively and make sure the design rationale is well documented.
Scientific research doesn't happen for its own sake. Every effort needs to be a part of the pipeline of demand and supply. Otherwise it's just a tune that you sing in the shower.
> Every effort needs to be a part of the pipeline of demand and supply
It's almost unthinkable how much technology and how many innovations we would never have gotten if this were actually true in practice. So many inventions happened because two people happened to be in the same place for no particular reason, or because someone noticed something strange/interesting and started going down the rabbit-hole for curiosity's sake, with demand/supply having absolutely zero bearing on it.
I've got to be honest, it's slightly strange to see something like that stated here, of all places, where most of us dive into rabbit-holes for fun and no profit all the time, and supply/demand is probably the least interesting part of the puzzle when it comes to understanding and solving problems.
A lot of people have already mentioned cases where this is neither true nor desirable e.g. high-energy and condensed matter physics, astrophysics, any branch of pure mathematics etc.
But, more importantly, who dictates what needs to happen? If you so desire, you should absolutely sing a tune in the shower, write a poem for yourself, calculate an effect and throw the piece of paper away, write code and delete it. The satisfaction is in exercising your creative urges, learning a new skill, and exploring your curiosity, even if no one else sees it or uses it.
I have had the privilege of working with some of the best physicists on the planet. Every single one of them has exposed only part of their work to the world. The rest might not be remarkable in terms of novelty but was crucial to them. They had to do it irrespective of "impact" or "importance". The dead-ends weren't dead to them.
Philosophically, as far as I know, we all get one shot at living. If I can help it, I am going to only spend a fraction of my time choosing to be "part of the pipeline of demand and supply". The rest will be wanderings.
The Fourier transform existed for the sake of existing for ~200 years before it turned out to be something the entirety of our communications infrastructure could be built on top of.
I agree 100% in spirit. Electrical transmission lines were not understood when Fourier did his work. Maxwell wasn't even born yet. And the math ultimately unleashed by Fourier transforms goes way beyond applications.
In cold hard dates, though, Fourier was already using his series to solve heat transfer problems in 1822.
I don't agree with the bizarre idea that every bit of research should have a clear utility. I'm just being careful about dates. And I think FTs were, in a sense, invented with a view towards solving differential equations for physics. Just not electrical ones.
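For anyone curious what that 1822 work looked like, here's the textbook form of the problem (modern notation, not Fourier's original): heat diffusing along a rod of length L with its ends held at zero temperature, solved by a sine series.

```latex
% 1D heat equation on a rod, ends held at zero temperature:
%   u_t = alpha * u_xx,  u(0,t) = u(L,t) = 0,  u(x,0) = f(x)
\[
  \frac{\partial u}{\partial t} = \alpha \frac{\partial^2 u}{\partial x^2}
\]
% Fourier's insight: expand the initial profile in sines, then each
% mode decays independently at its own exponential rate.
\[
  u(x,t) = \sum_{n=1}^{\infty} b_n \,
           \sin\!\left(\frac{n\pi x}{L}\right)
           e^{-\alpha \left(\frac{n\pi}{L}\right)^{2} t},
  \qquad
  b_n = \frac{2}{L} \int_0^L f(x) \sin\!\left(\frac{n\pi x}{L}\right) dx
\]
```

The same expand-into-sinusoids move is what later made the transform version so useful for signals.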
This is only partly true. MRI technology came out of people hunting for aliens in space. The paths science and discovery take are rarely as linear as the funders would like them to be.
Indeed, not to mention that the fundamental science you do now may become a product later, sometimes 50 years later.
The transistor was 1947, but a lot of the basic science was from the 1890s-1920s.
Still, transistors, right - what did they ever do for us? (Apologies to the Monty Python team.)
There are always edge cases. But the bulk follows the gravity flow. Even poetry, these days, should find a buyer.
If you've ever wondered why progress in fundamental physics seems to have slowed down, look no further!
Maybe you've got cause and effect backwards? You're describing how we fund most research, so of course this is what we get.
> Even poetry, these days, should find a buyer.
Why?
Sometimes the edge cases are all there is.
You are describing applied research. But fundamental research seeks to expand knowledge itself, and unsurprisingly delivers a lot of unplanned value.
Applied research consists of taking theoretical findings and applying them to a specific application. As such, applied research requires fundamental research.
The whole purpose of life is singing tunes in the shower, arguably.