There's a funny change I've noticed with social gatherings, especially of a nerdier bent. It used to be that knowing things was a valuable contribution to a conversation. Let's say someone mentions colourblindness. "Well", you would say, "did you know that the Japanese didn't have a word for green until after the American occupation?". Everyone would say "wow, that's amazing", and then someone else would reply "and did you know that there is a tribe in the Amazon with no words for colour at all!". And so on and so on.
I used to enjoy those types of conversations, until at some point, seemingly overnight, they became incredibly boring. Other people I've talked to have made similar observations, even though none of us have gotten sick of learning new things. I was thinking about why that is, and to me the obvious culprit is the internet. Of course, the internet has always been a fairly effective mechanism for dispensing facts, but these days between Wikipedia, TED, infographics, feel-smart-about-stuff-every-day blogs and the non-stop bombardment of social media factbites, it's safe to say that acquiring facts is no longer a challenge.
The impact of this is twofold: firstly, it reduces the value of knowledge as a signal of intelligence. People were once described as well-read, having a breadth of knowledge about interesting topics. But these days anyone can have a breadth of knowledge, you just regurgitate whatever you saw on /r/TodayILearned today. Of course, when everyone else also has access to the same feeds of trivia, and they know how easy it is, it stops seeming impressive.
Secondly, in terms of pure utility, someone telling you a fact is just inefficient now. I could learn ten facts, probably more interesting ones too, in the time it takes someone to struggle through one explanation – and from the comfort of my own home! If my goal is to know things, I'm better off going straight to the source than getting someone else to read the internet to me, slowly and with more mistakes.
So, is there even any point in sharing information anymore? Sure, but the value isn't in transmitting the information, it's in choosing what information to transmit and in what context. The flipside of having this endless repository of facts is that it's actually quite hard to tell what you do need to know from what you don't. A computer science course teaches you something that the Wikipedia category can't, which is how to arrange that information relative to all the other information. And a person can select information tailored to your needs, and guide you through the connections between information you have and information you might want.
Someday, perhaps, even that will be done better by computers, and at last there will be no point in sharing anything we've learned. However, even then I think there will be a point to conversation, not about the facts or the knowledge we've acquired, but their consequences. Knowing things allows us to make better decisions and generate new knowledge from what we have already. That's not going to stop being useful even when knowledge is easy and commoditised.
So maybe facts are dead, but I say good riddance. Long after they pass, understanding will live on.
One thing that people seem to get upset about a lot is the watering down of culture. Instead of being real gamers, the kids these days just play Candy Farm on their phones and have never even been to a LAN. Instead of real Chinese food, we get westernised crap like dim sims and chop suey. Ireland has a beautiful and rich history, but all we've taken away from it is Paddy O'Mc'Donergherty's Irish Pub. Outrageous! Where is the respect for Real Culture?
Of course, there is an argument to be made that there's no such thing as real culture, it's subjective and evolves over time and who are we to etc etc. But to me it's obviously true that culture can be deep or shallow, and that you lose something valuable by making it shallow. Turning the entire history of American Indians into facepaint and feathers is undoubtedly losing a lot of meaning! Much like you can have deep or shallow ideas depending on how complex and meaningful they are, I believe you can have deep or shallow versions of culture. After all, isn't culture just a cluster of ideas?
Regardless, although I think culture can be deep, and deep culture is more interesting, I would not agree that shallow culture is necessarily bad. Yes, of course, if you're steeped in the traditional culture and cuisine of India, you're going to think butter chicken is an abomination. But, much like the oversimplified film adaptation of a much-loved book, its shallowness can be a virtue. It would be a mistake to think that everyone who watched the movie would otherwise have read the book, and thus a mistake to think of the adaptation as a loss. More people have been exposed to the story, even if it wasn't the best rendition.
That only holds true if you think it's better that someone learn a shallow culture than no culture at all. Maybe it would be better to be absolutist about these things; if they want to learn about our culture, they should learn it properly! And that's an answer that makes a kind of sense, especially if it's more important to you that the deep culture remain intact than that more people are exposed to it. What good is it having people like a watered-down version of your culture?
Well, there is one way that a shallow culture really makes a difference, and that is familiarity. A funny thing happens when we are confronted by people who are very different from us: we get scared. Perhaps it's a historical relic from a more tribal time, where people very different to you were more likely to be dangerous. Whatever the cause, even today, different cultures in close proximity have a difficult time getting along. At the core of that, I believe, is an essential strangeness that comes with an unfamiliar culture.
Think about various western cultural stereotypes: England has tea and royals, America has guns and freedom, France has cheese and wine, Germany has sausages and ruthless efficiency. These shallow impressions, simplistic and sometimes outright inaccurate as they are, are comforting in their own way. I at least have something that feels relatable. Now, tell me, what's an Egyptian stereotype? Or Turkish? Or Yemeni? If you're from the anglosphere, as I am, you probably have no idea. It's not that I have a shallow understanding of these cultures, it's that I have none at all.
So when people from those countries try to integrate, I believe the lack of available shallow culture makes things much more difficult. Without those facile hooks into someone's context, you just feel like you have absolutely nothing in common. And when people have nothing in common on a large enough scale, it's bound to cause conflict. And yet commodifying culture is a thing that many people feel honour-bound to fight against. It seems like such a waste when it could help reduce that tension.
Obviously, there is such a thing as negative stereotyping, and I'm not advocating for more of that. Deliberately misrepresenting someone's culture to make it seem worse is both cruel and dishonest. But not all reductions have to be bad ones, and I think if people were more willing to embrace and participate in simplifications of their own culture, the results would be pretty favourable. And, of course, that simplification doesn't have to replace the real culture, just act as a more accessible starting point. Not everyone will follow that all the way to deep culture, but that's okay too.
I think, in many ways, the casual gaming market has contributed to the normalisation of video games. They used to be a fringe market for basement nerds, but you'd have trouble finding an 8-12 year old today who hasn't played Minecraft. Similarly, it used to be really weird to make friends on the internet and even meet up with them in real life sometimes. These days, socialising is what most people use it for. Does Minecraft compare to Quake? Does Facebook compare to the Newsgroups, MUDs or IRC of old? I feel like they don't, that by being more accessible they have lost something of the essential differentness that made them so interesting in the first place.
But at least it means that I can talk about doing those things without people assuming I come from a different planet. And, who knows, maybe some of them will get curious and start to learn about the deep culture that the shallow culture came from.
I've been thinking recently about boards of directors and how useful they can be. A lot of a board's responsibilities are fairly procedural: budget approval, compensation, mergers & acquisitions, auditing, and so on. However, they also have an important role in guidance, oversight and setting the direction of the organisation. How useful that is depends on how much you trust that board, of course, but a well-functioning board can be an important resource for making and evaluating good long-term decisions.
So if it works for companies, why not for people? You could approach some friends with a demonstrated capacity for good advice and ask them to form your personal board. Unlike an actual board of directors, there'd be no legal liability or boring procedural work. There'd also, hopefully, be no capacity for firing you, though obviously if you don't ever listen to them they might resign. Their role would be purely to offer oversight and advice.
I think, if implemented correctly, this could be a really useful idea. It's important to keep evaluating what your goals are and whether your actions are supporting them, and having a group of people meet regularly for that purpose seems like a great way to do that and stay accountable to it. As well, it'd be useful to have a structured way to approach big decisions or changes, acting as a sounding board (heh) and challenging your ideas.
I'm sure it'd be fairly confronting at first. Most people, me included, aren't used to having their personal decisions and actions scrutinised, or having to justify them. But, why not? If you are confident that you're making those decisions in a way that furthers your goals, then there's nothing to worry about. And, if you're not, maybe it's better to find out and correct it sooner rather than later.
Well, things have escalated pretty seriously since last week. I committed to 3 prototypes under 3 hours each. The good news is I did 3 prototypes; the bad news is that they may have taken a touch more than I was hoping they would.
This is a little thing I made to mess around with AST tricks. I've always been a little bit annoyed that when people don't design their functions right I have to add the overhead of another function to wrap theirs. Well, not any more! It can do neat things like swap arguments around and replace them with constant values that you supply.
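To give a flavour of the trick, here's a minimal sketch in Python. Everything here (the `adapt` name, the `order`/`consts` parameters) is illustrative rather than the prototype's actual interface; the idea is just to compile a tiny adapter from generated source, so the call shape matches what the badly-designed function expects without a generic argument-shuffling wrapper in the middle.

```python
import ast

def adapt(func, order, consts=None):
    # Hypothetical sketch, not the prototype's real interface.
    # order[i] is the position in func's signature that the adapter's
    # i-th argument should land in; consts maps positions in func's
    # signature to fixed values.
    consts = consts or {}
    params = [f"a{i}" for i in range(len(order))]
    args = []
    for pos in range(len(order) + len(consts)):
        if pos in consts:
            args.append(repr(consts[pos]))         # pin to a constant
        else:
            args.append(params[order.index(pos)])  # reroute an argument
    src = (f"def adapter({', '.join(params)}):\n"
           f"    return _func({', '.join(args)})")
    # Parse and compile the adapter once, up front, rather than paying
    # for a general-purpose wrapper on every call.
    namespace = {"_func": func}
    exec(compile(ast.parse(src), "<adapter>", "exec"), namespace)
    return namespace["adapter"]
```

So `adapt(divide, [1, 0])` would give you `divide` with its arguments swapped, and `adapt(divide, [0], consts={1: 2})` would pin its second argument to 2.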
I'm really pleased with this one. It's an iterative scribbling tool based on the way I tend to draw in notebooks when I'm bored. I spent a lot of time messing around with all the little shapes and rules to get a thing that was fun to use. The upshot was that I really enjoyed making it, but I may have gotten a little bit carried away with the time...
This is one I'd been wanting to get started on for a while. It's a system for doing simple, repetitive code exercises for practice or learning. The problem is that there are a few tricky things involved in generating random but plausible code and dealing with all the AST data structures. Luckily, Metaknight had prepared me a little, because I'd already spent a bit of time messing around with the parser and generator and that crazy AST tomfoolery.
Time: 10 hours.
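As a rough illustration of the generation side, here's the kind of thing I mean, shrunk down to random arithmetic expressions. None of these names come from the actual prototype, and it assumes Python 3.9+ for `ast.unparse`; real exercises obviously involve much hairier structures than `BinOp`s.

```python
import ast
import random

def random_expr(depth=2, rng=random):
    # Build a small random arithmetic AST: constants at the leaves,
    # binary operators everywhere else.
    if depth == 0:
        return ast.Constant(rng.randint(1, 9))
    op = rng.choice([ast.Add(), ast.Sub(), ast.Mult()])
    return ast.BinOp(left=random_expr(depth - 1, rng), op=op,
                     right=random_expr(depth - 1, rng))

def make_exercise(rng=random):
    # Render the AST as source for the learner, and evaluate the same
    # AST to get the expected answer for automatic checking.
    tree = ast.Expression(body=random_expr(2, rng))
    ast.fix_missing_locations(tree)
    source = ast.unparse(tree.body)
    answer = eval(compile(tree, "<exercise>", "eval"))
    return source, answer
```

Deriving the answer from the same AST as the question is what makes this sort of drill checkable without writing every exercise by hand.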
So, uh, as you can see the amount of time the prototypes are taking seems to be increasing, not decreasing. I'm not convinced that just trying to cut down is going to do it. I need a new approach, one that encourages me to start with as little as I can get away with and build up from there. My last post was about this problem and figuring out what tradeoffs I'd need to make. Based on that I've come up with a new plan that I'm happy with...
One continuous hour! Like a hackathon, or an episode of 24 complete with countdown clock. I think what's missing is an obsessive focus on time. Instead of trying to make something and hope it takes less than the time, I'm going to write code for exactly as long as the clock takes, and whatever's done after the hour is done.
I've been thinking a bit about the different ways I work on ideas, and particularly the prototypes I've been spending a lot of my time on recently. I've been trying to figure out why it's so tricky to keep them under control; they always seem to want to turn into huge projects, or be so small and inconsequential that I lose interest. In fact, I'm beginning to think that this is one of those triangle constraint type situations, where I can only get favourable results on two sides by sacrificing the third. The sides, in this case, are how interesting it is, how complete it is, and how long it takes.
How long an idea takes is probably the easiest one to reason about, and it's the one I've been paying the most attention to. The time isn't just important because I could be spending it on other things, it also has a qualitative impact on how much and when I can do that kind of work. Something like this writing is low-impact enough that I feel comfortable committing to it even when it's not my main focus; I can fit it in around other things. So it'd be nice to get the time for these down to that point. Unfortunately, that means sacrifices in the other two areas.
How interesting an idea is has a lot to do with what exploring it can do for you, rather than how useful the result is. You get a lot of value from exploring the more out-there ideas, usually because you're learning and discovering new things. That's usually opposed to being practical, because the most practical ideas tend to be incremental refinements of existing ones. Practicality, in turn, has a very positive impact on time, because you can really optimise your environment for that particular kind of idea. If I was making nothing but mobile games, I could spend a little bit of time improving my toolchain and get a big boost out of it. More interesting ideas take longer because you don't know what you're doing when you start them.
The dimension of completeness is the other side of that. You could just do the bare minimum necessary to learn something or do something with an idea, but in many cases you want to flesh it out. The extreme of this is the product mindset, where the idea is important, sure, but it's the polish on the idea and how well the idea is developed that count much more than the idea itself. It's the difference between a Wright Brothers plane and a 747, or the original internet vs AOL. Of course, completeness takes a lot of time, and even more so with an idea that is more interesting. If the idea has a lot of unexplored and interesting facets, each one of them is another thing that has to be explored and refined if you're trying to make a nicely packaged product out of it.
From this triangular perspective, it's fairly easy to see why the prototypes have been difficult. I'm often trying to take an interesting idea and make something relatively product-ish out of it, and then being surprised that it takes way longer than I expected. Although I've managed to scale down the time taken, that often seems to come at a cost of interestingness; I shoot for less ambitious ideas so that I can still make something out of them in time. But my new conclusion is that I need to leave the interestingness where it is, and start cutting down on completeness.