• 0 Posts
  • 125 Comments
Joined 1 year ago
Cake day: July 18th, 2023


  • Theoretical biologist here. I consider viruses to define the lower edge of what I’d consider “alive.” I similarly consider prions to be “not alive,” but to define a position towards the upper limit of complex, self-reproducing chemistry. There’s some research going on here to better understand how replication reactions (maybe encased in a lipid bubble to keep the reaction free from the environment) may lead to increasing complexity and proto-cells. That’s not what prions are, but the idea is that a property like replication is necessary but not sufficient and to build from what we know regarding the environment and possible chemicals.

    I consider a virus to be alive because they rise to the level of complexity and adaptive dynamics I feel should be associated with living systems. I’ll paint with a broad brush here, but they have genes, a division between genotype and phenotype, the populations evolve as part of an ecosystem with all of the associated dynamics of adaptation and speciation, and they have relatively complex structures consisting of multiple distinct elements. “Alive,” to me, shouldn’t be approached as a binary concept - I’m not sure what it conceptually adds to the discussion. Instead, I think it should be approached as a gradient of properties any one of which may be more or less present. I feel the same about intelligence, theory of mind, and animal communication.

    The thing to remember when thinking about questions like this is that when science (or history or literature…) is taught as a beginner’s subject (primary and secondary school), it’s often approached in a highly simplified manner - simplified to the point of inaccuracy sometimes. Many instructors will take the approach of having students memorize lists for regurgitation on exams - the seven properties of life, a gene is a length of dna that encodes for a protein, the definition of a species, and so on. I don’t really like that approach, and to be honest I was never any good at it myself.


  • I’m going to hazard a guess that it’s a combination of falling budgets and an over-reliance on autocorrect. If it’s like other industries, they’re trying to get more articles out with fewer people.

    I know that I often have an atrocious number of typos - but some are entirely the fault of autocorrect either changing a correct word to something else or correcting a typo to a word that makes no sense in the context of the sentence. I’m hoping that the next generation will improve this.

    If anything, a now/not typo at least indicates that it was written by a human. LLM errors generally don’t involve that sort of thing.



  • No, computer engineering tends to focus more on hardware. When I was doing that kind of thing in college, computer engineering did things like chip design and logic boards and so on. I had courses on DSP and VLSI, multiple assembly languages, RISC vs CISC systems, and so on. At my university, it was considered a subspecialization of electrical engineering, with the first two years of undergraduate study being identical.

    When I switched over to CS, I was doing things like numerical analysis and software systems architecture.

    Both majors used math, but CE (as an EE major) required students to go through (iirc) calculus 5, and I think that CS majors could stop at calc 3 but would end up having to do different kinds of math after that.


  • I’m a theoretical biologist.

    The best book I read on multilevel selection theory was actually written by a professor of philosophy. The author broke down the individual concepts, as they do, so that anyone reading it knew exactly what each technical term referred to. Biology is my favorite subject because there’s so much that we’re still figuring out and it’s just ridiculously complex.

    I might have had a similar hot take as an undergrad, when everyone has an ego based on their major - and I was even a computer engineering student for a while, and engineers tend to be even worse about that sort of thing.









  • There’s a tsunami of layoffs in the gaming industry, and in tech in general. A lot of the time it’s entirely unjustified, as the eliminated positions are often quickly put back on the market. It’s usually not the top engineering talent (the most expensive) being cut, because they’re harder to replace - it’s often lower-tier jobs and support teams, where the cost of re-filling a position (sourcing, interviewing, hiring, training) would far exceed any salary savings. It’s like the tech management version of a Michael Scott vasectomy.

    I wish someone would compile a list of companies acting in extraordinary bad faith so I could consider that when making purchase decisions.




  • I don’t see a future utopian (or non-utopian) society a thousand years from now feeling at all compelled by a legal agreement made between two independent parties a millennium earlier. The law firms that set up the contracts will be long gone, and the legal framework that established them will have evolved, if not been replaced completely. I mean, compare where we are now with where “we” were in 1024, and then consider how much more quickly things change today. Any money will be more meaningless than 11th-century money, but with no collector’s value, since it’s just numbers in a database that probably won’t even exist in a thousand years.

    I think we can legitimately view having your body/head frozen in the hopes of being woken up as a tech version of the Catholic last rites.


  • Cool! There’s probably a small factor differentiating the two, but it’s not that noticeable.

    I did a research project looking at (iirc) kinase cascades, using a molecule-by-molecule simulation of cascading signals in hypothetical signaling networks. We varied the number of phosphorylations required for activation at each tier, and showed how the different topologies/rules governed the relationship between input and output signals, and their noise tolerance (chemical networks can be quite noisy). It was very abstract in that we weren’t reconstructing known networks, but rather using sandbox physics to explore the idea.
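
    For anyone curious, the general idea can be sketched in a few lines of Python. This is a toy illustration, not the actual model from that project - the function name, parameters, and update rule here are all made up - but it shows how a per-molecule phosphorylation threshold at each tier shapes the input/output relationship under noise:

    ```python
    import random

    def simulate_cascade(thresholds, input_signal, n_molecules=200,
                         steps=50, noise=0.05, rng=None):
        """Toy molecule-by-molecule cascade: each tier's molecules must
        accumulate `threshold` phosphorylations (driven by the active
        fraction of the tier above) before they count as active."""
        rng = rng or random.Random(0)
        # phosphorylation count for each molecule, at each tier
        counts = [[0] * n_molecules for _ in thresholds]
        for _ in range(steps):
            upstream = input_signal  # active fraction driving tier 0
            for tier, thr in enumerate(thresholds):
                active = 0
                for i in range(n_molecules):
                    # per-molecule phosphorylation chance: upstream drive + noise
                    p = max(0.0, min(1.0, upstream + rng.gauss(0, noise)))
                    if rng.random() < p:
                        counts[tier][i] += 1
                    if counts[tier][i] >= thr:
                        active += 1
                upstream = active / n_molecules  # feeds the next tier down
        # final active fraction at each tier
        return [sum(c >= thr for c in counts[tier]) / n_molecules
                for tier, thr in enumerate(thresholds)]
    ```

    Even in this crude version you can see the effect we were exploring: a high threshold acts like a noise filter, so a weak/noisy input barely activates a high-threshold tier, while a strong sustained input still drives it to full activation.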