If I had to pick one genre to sum up the 21st century, it would be cyberpunk.
Not because we'll all be melancholy cyborgs in the future. Every generation we live out the same story--we're not getting any better or any worse, I mean. Material gains ebb and flow, technology improves, but on the inside we're still telling stories about right and wrong and looking for something to worship and struggling against death. Heavy stuff, right?
I'm not picking cyberpunk based on advances in AI or robotics or building infrastructure or pollution or non-pollution or nuclear warheads. I'm not picking based on a grimdark or neonbright look at the future. I'm picking cyberpunk because it's a genre that does its best to blend the past and the future, and it does it remarkably well.
Cyberpunk is an atmosphere: shots of synthwave, a candy-colored spectrum cutting through a stormy night, cityscapes full of crowded loneliness.
Color Theory's 2018 album actually sums it up quite nicely: "The Majesty of Our Broken Past."
See, cyberpunk is a nostalgic look at the future.
What distinguishes human beings from every other living thing in the Milky Way?
Our ability to remember the past and anticipate the future.
We can even feel something about pasts and futures that never were. Being both in and out of time; a Godlike characteristic. Explains why our word is important--what's the point of a vow if you don't remember it, or if you can't even value tomorrow?
Lately, people like to say man is just an animal. An animal has no concept of marriage or investments, hobbies or glory days. When people get sick of money, or the lack of it, they usually start lumping all other obligations and responsibilities in with it. That's why marriages and vows and the draft and nostalgia all get a bad rap: they're tied up with the same thing that makes money such a bother. They matter in the grand long narrative of things. Of course, material gains ebb and flow, but certain things remain. Right? Or do we experience pangs of nostalgia because it feels like good things have an expiration date?
Well, that's up to you. Cyberpunk encompasses any worldview; it's all hypothetical, after all.
What I like about cyberpunk, regardless, is how it can't help but confirm that man is uniquely man. Stories wrapped in cyberpunk cloth are about man's inevitable forwardness; you can't have the cyberpunk genre and make it about neo-future cavemen or the loss of the power grid. Cyberpunk isn't the apocalyptic or survival genre (sure, it can be dystopian, but that's still distinctive). Cyberpunk: this is about how technology is uniquely human, how cities are uniquely human, how loneliness in a crowd is uniquely human. It needs electricity and machine language and things that are much bigger than us in order to point out what makes us, again, uniquely human.
It even questions what makes us human. In this way, it's also this past and current century's most spiritual genre. One of the more common threads in cyberpunk literature is: at what point are we no longer human? Is it our bodies--is that all we are? What happens when we break those, augment those, replace kneecaps and hands and skin? What happens to our souls? Do androids dream of electric sheep?
Cyberpunk 2077 is an adaptation of the pen-and-paper Cyberpunk 2020 (you gotta admit, even writing down 2020 these days still feels like role-playing the future rather than living in the actual current year). There's something very evocative about 2020. But, of course, we're living in 2020 and we've still got clean air and bake sales, so Mike Pondsmith's vision is still just that: a role-playing game about the majesty of our broken future.
Well, CD Projekt Red's game fast-forwards us to 2077 and gives us Keanu Reeves.
Now, I'm the kind of person who spent $10 on Black Desert Online mostly so I could mess around with the character creator. So, already, a AAA release with extensive character creation has convinced me to shell out money. But, more than that, I haven't played a good and original RPG with a fully customizable PC since Dragon Age: Inquisition (and "good" is stretching the word--but I'll save my thoughts on BioWare for a later date).
You can tell from the screenshot below that Cyberpunk 2077 also allows your character to have a past. (I always think of it as "the character"--I never could do the whole "I chose this in Mass Effect" or "When I took down the Reapers I felt..." Dunno, that always felt wrong. The PC to me is someone I authored, not me, yeah?)
I wrote a few entries back about how little patience I have for the "blank slate" character trend. Is it harder to meaningfully integrate backstories into a big RPG like this? Certainly. But a character without a past, or at least an implied one, doesn't drive nearly as much empathy or momentum.
And here's the other thing. Most custom-PC RPGs are lying to you. RPG. Role Playing Game. Custom Player Character. Oh, except your player character will always use force (and maybe some charm) to smooth things over.
There are very few ways that RPGs can appeal to the mass market or come up with enough content without resorting to "Clear Bad Guy Base" or "Hit 12 Bears Over the Head And Take Their Pelts." Now, before we go further, I'm not one of those anti-violence or games-cause-violence advocates. My very first video game experience was my dad beating the crap out of me on Ready2Rumble. Nobody really wants a bloodless story. Bloodless would imply that nothing is at stake, that there is no force or passion or momentum in your story or your game.
And don't get me wrong. People play these games for the cool factor. It's like being the hero in your own action movie. Thankfully, 2077 has got you covered in way more cool factors than just kill la kill: hacking, racing, boxing, martial arts, stealth, etc., etc.
But here's what I like about Cyberpunk 2077 right off the bat: the game can be completed without taking a life.
That tells me this game is going to have real options, which is what I really care about in a role-playing game. I want a somewhat-finite story with enough sandbox elements to let me bend and shape the narrative around my character and their experiences, so that when I remember it, I'll remember how the game and I told the story together. And, also, so I have replay value if I ever get around to playing it a second time.
Here's another thing in Cyberpunk 2077: you're allowed to fail quests. Failure is just a different ending, not a fake ending.
Often, there's no incentive to fail in a game. And you might be thinking, well, why should there be an incentive to fail? I think of it from a narrative standpoint: in a story, we are always rooting for the hero to win, but we instinctively know that in order to win he has to fail. Are you interested in a walking simulator of Frodo and Sam heading to Mt. Doom, or did you get caught up in the war, the old loyalties, the deaths of countless good characters? What would Lord of the Rings be without the spiritual death and renewal of Frodo at the end? Not Lord of the Rings, anyway.
Besides, who hasn't rooted for the underdog?
Video games have almost always played out as power fantasies. I'm not saying Cyberpunk 2077 won't be that, and I'm not saying power fantasies are a bad thing. But I love seeing different stories told in my favorite medium. Failure often doesn't get any interactivity, and I'd love to see how failure plays out in 2077.
Also, there's going to be a motorbike from Akira.
But you know, more than anything, I'd be impressed with an RPG which could give us the equivalent of Blade Runner's "Tears in Rain" scene.
I think that's what we really want when we say we want games to be "cinematographic". It's not just about producing a power fantasy, or a dynamic world, or a custom character, though all of these are exciting and unique to the medium. The creation of moments is infinitely more difficult, more fragile, more tender.
It's the moments of tenderness in cyberpunk that get me the most: all that soft fleshiness beneath the wires and plating, the soul music still present in synth-wave, the tears in rain.
If You Just Wanted The Gist Of Things, Here It Is
Alright, there's a lot to love in cyberpunk and Cyberpunk 2077, so let's try to sum up everything before this turns into a book:
10 Other Beauties To Release This Year
13 Sentinels: Aegis Rim - Hand-painted nostalgia-laden mech/school simulator
Humankind: 4X strategy with unlimited combinations of cultural influences
Yes, Your Grace: a Kickstarter-funded management-RPG about being The King
Eastward: a dual-character action-RPG w/ puzzles, dungeons, quirky locals, and stunning art direction
Crusader Kings III: Medieval Dynasty & Game of Thrones Simulator, Upgraded
Across the Grooves: an interactive graphic novel with an emphasis on music, set in a magic-realism universe
The Last Night: your cyberpunk fix, but in pixels
Rune Factory 5: The Comeback Kid of Cute Fantasy Farming Life Simulators
Haven: A Non-Cheesy Video Game Romance In Space
Persona 5 Royal: Ok, Ok, You Still Can't Play As A Female Joker, But There's A Ton of New Stuff
Spiritfarer: A Light & Lively Take On Being Death's Ferryman
In short, cheers all around for 2020.
Thanks for stopping by. Glad you were here!
In Excelsis Deo.
Hey, glad you're here.
We're almost through with the 5 practicum assignments I attached to the first module in the Game Dev Ultralearning project!
I came across this tutorial to make a simple platformer in Visual Studio. It was the perfect extension of Projects #1 and #2. Each tutorial, chosen at random, has somehow gone back and filled in the blanks and trouble spots from previous tutorials.
Now, like the screensaver tutorial, I'm still basically rewriting code verbatim. The difference here is that many of the concepts covered in this tutorial felt more like review. Before my formal Ultralearning project, my self-study was sporadic at best. However, I'm glad that I covered the following, even if I wasn't sure how they'd all come together in the end:
A. MonoGame Course
C. C# Fundamentals for Absolute Beginners
It's gotten to that early stage in learning this subject where I'm able to "read" code almost like it were a book. A book written in a foreign language, but a foreign language I'm finally just the teeniest bit comfortable with.
In addition, rewriting all the code from scratch significantly improves literacy and understanding of the code at hand. This tutorial's style of explanation--presenting chunks of code at a time instead of line-by-line AND providing comments side-by-side--was way more effective than previous tutorials that broke everything up to a halting pace.
Here is the code in its entirety:
Pretty much the only thing I added was displaying the score in the MessageBox after the player was done.
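For what it's worth, here's a minimal sketch of that addition. Since the tutorial's code isn't reproduced here, the names are my own guesses: assume the form tracks the score in an int field called 'score' and runs the game loop off a 'gameTimer'.

```csharp
// Hypothetical names: `score` and `gameTimer` stand in for whatever
// the tutorial actually calls its score field and game-loop timer.
private void EndGame()
{
    gameTimer.Stop(); // halt the game loop first

    // Show the final score in a standard WinForms message box.
    MessageBox.Show("You made it! Final score: " + score, "Platformer");
}
```

Nothing fancy--MessageBox.Show just takes the message text (and optionally a title), so string-concatenating the score in is the whole trick.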
I was going to do something more from-scratch, but I only found solutions that either wrote to the console or to the MessageBox itself, instead of to the UI of the platformer. I KNOW there's a way; in fact, there's an oceanful of tutorials on simple score counters. But between this and other projects I grew stale on this practicum and wanted to spend my creativity on the next assignment. I kept my notes, seen below, but the important bit is when we move on to studying C# a bit more in-depth.
So three down, two to go, with the intention of practicing more from-scratch coding.
The important first step: thinking this through and writing it down on paper before looking up how someone else would do it or simply banging away at the keyboard.
Looking back over the game code, we have the following already set up for us:
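The screenshot of that code isn't included here, but the relevant piece is a score counter: an int field that ticks up whenever the player touches a coin. This is a hypothetical sketch of what that block typically looks like in a WinForms platformer, not the tutorial's exact code (the names `score`, `player`, and the timer event are my assumptions):

```csharp
// Hypothetical reconstruction -- not the tutorial's exact code.
// The form keeps an int field and bumps it on each coin pickup
// inside the game-loop timer event.
int score = 0;

private void MainGameTimerEvent(object sender, EventArgs e)
{
    foreach (Control x in this.Controls)
    {
        // Any PictureBox tagged "coin" that the player touches is collected.
        if (x is PictureBox && (string)x.Tag == "coin"
            && player.Bounds.IntersectsWith(x.Bounds))
        {
            x.Visible = false; // hide the collected coin
            score++;           // this is the value we now need to display
        }
    }
}
```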
Right now, we have no way of displaying this score to the player.
In the screenshot at the top of this blog, I have a label named 'Score' and an empty text box. I haven't hard-coded anything with them yet, so they are just visual displays on the form and nothing more.
My first thought is--how do I code a line that displays the score on the screen and dynamically increases every time the score increases within our block of code above?
At first, I thought I should focus on the display first, then the increase. Then I realized these should be one and the same. As long as I could get it to display "score," then whatever "score" was at the time is what would be displayed.
So, how to call the int 'score'? Would a textbox work? It sounds simple, even to me, but for this fact: I don't have any methods or events memorized, and I'm still fuzzy on the scope of variables.
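Spoiler from future me, and an assumption rather than anything out of the tutorial: a textbox does work, as long as it sits on the same form as the score field. The form's event handlers can see both, so scope isn't the obstacle I feared--you just reassign the Text property whenever the score changes:

```csharp
// Hypothetical names: `txtScore` is the empty text box from the screenshot,
// `score` is the int field from the game code. Both live on the same form,
// so both are in scope inside any of the form's methods.
private void UpdateScoreDisplay()
{
    txtScore.Text = score.ToString(); // convert the int to text for display
}
```

Calling UpdateScoreDisplay() right after every score++ (or once per tick of the game timer) keeps the display in sync with the variable.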
Now, this exercise brought me into Overstudying Mode. Which is a good thing, especially here at the beginning. I've been meaning to create an Anki deck to help me memorize C#'s keywords, methods, and events. This won't substitute for actually coding with these concepts, but I figured having a better handle on what the language can do would be helpful and at least save me from googling every little thing.
So, I decided to make an Anki deck before just looking up how to code a simple score display.
Now, I want you to appreciate something:
These right here are ALL of C#'s keywords.
I counted (and confirmed elsewhere to be safe): there are only 104 keywords in the C# language.
Remember how in Blog #8 I quoted Andy Harris as saying programming languages are easier than human languages? I KNOW you had to memorize more than 104 words in your high school Spanish class.
What's even better is that out of these 104 keywords, there are only 10-20 that any one programmer uses frequently.
So let's start with those.
Looking at the code for the platformer game, Visual Studio already highlights any keywords in blue.
C# Literacy Detour
Here are the 8 most frequently used C# keywords:
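The original list isn't reproduced here, so take this as my own guess at the everyday set rather than the exact eight from my notes. Keywords like public, static, class, void, int, new, if, and return show up in almost every C# file you'll read:

```csharp
// A tiny example built almost entirely from everyday keywords
// (public, static, class, void, int, if, return).
public static class ScoreRules
{
    // Clamp a score so it never goes below zero.
    public static int Clamp(int score)
    {
        if (score < 0)
        {
            return 0;
        }
        return score;
    }
}
```

Once you can read a block like that without squinting, the other ninety-odd keywords are mostly vocabulary for special occasions.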
In Excelsis Deo.
There are three types of relationships an author has with their reader: a relationship built on trust, a relationship built on indulgence, or a relationship which ultimately feels one-sided (one wherein the reader is simply confused about whether the author actually cares about them or the story at all).
The quality of the relationship depends on one skill: knowing how to set up and take down a scene.
When I first started writing, I was made to believe that a paragraph dedicated to a character's description was just a girlish or adolescent urge to play dress-up. I was warned against doing this and told to let the reader's imagination take over instead. I was quoted the worst three words you'll ever hear at your next writer's workshop: "sHow, DON't teLl." Leave things to the reader's imagination--you'll establish trust and rapport that way, they said.
An imagination is a sandbox. I could choose to give my imagination supreme freedom by never picking up a book and simply imagining a "story" from beginning to end, skipping to the parts I like best, swapping in characters from my real life to play their puppet roles in my fantasies.
These are not stories, however; these are daydreams. And they are ultimately dissatisfying, being made for light flights of fancy and not actual food-for-the-soul. Daydreams are also, as a rule, exhausting. They have as much depth as a Calgon commercial: a 20-second blip that fulfills a momentary desire or advertises a perceived need, and is lost when the next fleeting desire arises or something of actual substance takes its place. Not having any limits or structure makes daydreams only marginally better than the dreams we have at night, which are rarely cohesive (when they are, we're mostly desperate to know why--did their sudden clarity mean something? Always, always, we are looking for meaning).
Imagination by itself can conduct very little meaning.
One opens a book so they can do more than dream.
Recently, however, it has become very difficult for me to get a first impression of any of the major players, be they World Setting, Atmosphere, Tone, Theme, or Protagonist. Especially in science fiction and fantasy, there's a tendency to whiz-bang a bunch of colorful character names and snappy dialog in the first impression. In literary prose, it's making sure the assonance and alliteration are just right, and the author always appears eager to courageously judge the status quo; in fact, a good bit of this has to do with the lack of courage to be normal. Oh, except when it comes to the protagonist. The protagonist is Joe Schmoe, remember? The protagonist is just a stand-in for the reader.
Well, yes. That's always been the case.
However, nobody's ever mistaken Frodo for Odysseus, or Elizabeth Bennet for Anne Elliot, or Superman for Batman.
We identify with any hero placed in the terrifying role of protagonist, because we know the terrifying role of protagonist ourselves. It won't matter if the character is a hero or a heroine, the first-born or the youngest of seven, a redhead or a balding monk--not when it comes to a reader being able to recognize the basic story of humanity in the protagonist.
This does not mean we don't care about the particulars. For example, when we think of Mr. Holmes, we think of two things almost immediately: his deductive reasoning and his costume. Scarcely will you see one without the other; when you don't have the deerstalker, you get an exaggeration of the first trait, which is why BBC's modern Sherlock is even more condescending in his displays of intelligence. Forget the fact that the deerstalker isn't even canonical--it's the fact that, based on the description of Holmes, we can surmise he would be the type of person to wear a deerstalker.
Just like Sherlock, or other everyday people for that matter, a protagonist is made of more than one defining aspect. While it's their actions we'll be most keenly aware of and interested in, readers are inherently less interested in the actions of a blank slate.
Here’s the other thing: we humans like looking at other humans. It’s immensely comforting, when we allow ourselves to be free to do so. And we don’t often get to do it with a level of depth that actually garners trust, curiosity, reciprocity, etc. In a culture obsessed with crime reports and exaggerated definitions of Freudian terms, we treat human personalities like circuses and peepshows: oh poor tragic Diana, oh boorish orangey Trump, oh sick twisted hunky Ted Bundy. No wonder more and more people are depressed and anxious; we can’t just look at a human being anymore and not see some angle of entertainment or neurosis.
Well, fiction has always been another word for balm. One remedy for the inability to see other beings is describing other fictional beings that we can more easily feel through and intuit from. Intuition: that's where reader participation comes in. It’s in making assumptions and developing a feel for a character based on the 20% we’re given; don’t worry, we’ll make up for the remaining 80%.
Character description makes me warm up to that character, thus to that world, and thus to that particular story.
It’s one reason why I have a much easier time following stories from, say, the Victorian and Edwardian eras. They were less cautious about detail, giving us an intimate view of a person’s face, clothing, psyche—because they knew even these verbose descriptions were just a scratch at the surface of a personality.
Case in point, here are two examples from G.K. Chesterton’s detective short story, “The Salad of Colonel Cray”:
The introduction of this character is also helped by Chesterton's fine syntax; the 'solidified into a figure that was, indeed, rather unusually solid' at once reveals something about Major Putnam's physique, while Chesterton's sense of humor about it adds lightness to his character and gives the reader a physical description that is also entertaining to read. Notice how he goes on to describe Putnam's physical features and even clothing, but not every single thing about him. The main character, the eyes through whom we are seeing Major Putnam, is viewing this man for the first time as well. This being a detective story, any details of personality or quirk or color are also important. But almost every story is a detective story, answering a question with an uncertain outcome: a romance explores the "how" and "if" of two people getting together, a drama explores the mystery of the human personality, etc. So this is not to say that this level of detail is only appropriate in a detective story.
Further, I included the short exchange between our protagonist, Father Brown, and the Major, because dialogue too reveals the character, and I was absolutely in love with the line 'with his good-humoured gooseberry eyes.'
The character’s movement (’solidified into something unusually solid’, 'came out of his house in a hurry’) adds to his character description while leaving the story with a sense of momentum, the imperfect details (bald-headed, apoplectic, puzzled, a halo by no means appropriate) combined with more positive details (good-humored, innocent grin) create intrigue and contrast. It already hints at the character’s personality, but that’s all: it hints. We know as much as the main character knows, and that’s important.
Now, on to a more psychological portrait--this one expanding on the main character, Father Brown. In general, the main character will be given a more in-depth, inner-world treatment than other characters. In a romance, where there are technically two main characters, this will also extend to the love interest. But even so, the main protagonist, whatever the genre, is the most important to understand: they are at once the lens, the cinematography, the tone; they are what colors the story, no matter how distant the author chooses to keep them from us.
Ah, Chesterton slays me.
He’s using the situation at hand (Father Brown is performing some early morning duties as a priest when he hears what he thinks, but isn’t sure, to be a gunshot) to not only move the story forward, but reveal important bits of information and hint at future characters (the Major named Putnam, the Maltese cook), as well as draw a fascinating portrait of a man who cannot be summed up as simply “a nosy priest” or “a fastidious clergyman” or “an old man.” And Chesterton does all of this in one paragraph.
What's more, Father Brown is the protagonist of a series of short stories. Chesterton probably wrote these stories with the intention that a reader's first introduction to Father Brown could very well be the second or twelfth or thirty-third entry in the series. So he highlights an aspect of Father Brown's character; in fact, summarizes the core of his belief system so succinctly, that both the first-time and the returning reader are rewarded and enlightened and very little ground gets retread.
Notice too how the character description starts with an inner-world description and, by the end of the paragraph, has moved this analysis into an action. Pick up any book on the craft of writing and you'll find that one of the cardinal rules is that a scene starts in one mood and ends in another; we go from inaction to action, from triumph to loss, from question to answer. These scenes can be as long as a chapter or as short as a sentence, or somewhere in-between, like this here paragraph.
Illustrating Brown's character also justifies the action he takes next. Father Brown would never stray from his duties on a lark; but his is a "free mind" (the reader is invited to ponder over what this means) and has already started working on the curiosity at hand. So the two men combine, and Father Brown logically goes to investigate because it is true to both halves of himself: 1) gunshots being serious, it is his serious duty as a priest to investigate, and 2) being a free thinker, his curiosity will not rest till he gets to the bottom of this mystery. Chesterton's "detour" is at once reflective and active.
Chesterton could have simply written: 'Father Brown went in at the garden gate, making for the front door.'
Perhaps that's a choice Hemingway would have made. Perhaps that's a choice you would have made.
But Chesterton is telling us this in two voices: his, and the protagonist's. Father Brown really is two men: he's revealed in Father Brown's voice, and he's revealed in Chesterton's voice. Like a matryoshka, our characters and our stories hide themselves in layers, informed by the author, who in turn informs the character's personality, which informs their actions, which informs the story, which informs the theme, which informs something too deep for words--all of it woven back into the cyclical nature of the story by audience interpretation, as readers interact with every level of the story, sometimes erroneously putting too much emphasis on any one of these factors. Stories, the good ones, are as much full-bodied orchestras as they are living tapestries: why unravel the author's sexuality or the historic context or the theme or the style, as though one string made up the whole tapestry or one note chiefly made up a symphony?
If stories are genuine, if they are true, they will, going back to that orchestra analogy, possess this multi-tonal quality: the voice of the character and the author may not always be as one, but they will resonate to the same key. The characters will not be author stand-ins (here is another cardinal sin of bad character descriptions, usually taking the face of a Mary Sue/Gary Stu), but the story will not be bereft of an author either. You can't have one without the other.
The examples above aren't meant to be taken as the Golden Standard. This is the Golden Standard for G.K. Chesterton. Part of the hard work of the author is finding, maintaining, and polishing their own Golden Standard.
Before we move on to the opposite of the Golden Standard, let's look at a more spare way of introducing characters. Having dropped Hemingway's name often enough in this entry already, let's use him as our example. Is he the antithesis of what we've been exploring here, or does the argument hold water no matter the authorial voice?
Before, we said that the skill needed to establish a quality relationship with the reader is the skill of being able to set up and take down a scene.
That sounds more like something for plays than for ho-hum prose, but it's all connected.
Look at the examples of stagecraft below. All are from different productions of the same play, Tennessee Williams' The Glass Menagerie:
Our narrator Tom introduces us to his past, and thus to the play's real narrative. In the first, his homelessness is apparent and his recollection of his family home is piecemeal and dreamy. In the second, Tom's a more clean-cut outsider and his family's apartment is more fully-furnished flashback than it is a mindscape.
Different interpretations of the main playing space--the entire narrative takes place in the family living room. They range from a realistic set-up, to a slightly realistic set-up dominated by colors representing the characters' emotions, to the almost purely abstract, taking the title of the play to an extreme.
The pivotal scene. Laura and her would-be suitor Jim have a heart-to-heart about Laura's glass menagerie. In the first, all is warm and romantic. In the second, the contrast of red and blue suggests all sorts of complicated feelings--and the picture in the background is obviously not real, but a symbol of--what? In the last, it's a 50/50 mix of both approaches: a fully realized set-up, warm in the front, with the blue uncertainty of the world dominating the space behind them and weighing on their shoulders, threatening to overtake the fragile peace. Appropriately, this is the scene where Laura's dreams are shattered, and no matter how the scenes are dressed, every iteration of this play will have two things: warm candlelight and a cold collection of glass.
While plays have the distinct advantage of being visual media, the thing about setting up and taking down scenes applies across the board. The reason I bring them up is that, in this case, the stagecraft is an apt analogy for the authorial voice. As long as there is a solid narrative and a few anchors, and as long as the scenes are set up and then taken down, what is conveyed in-between (by pale strokes or bold, fully-realized details) is all a matter of set design. Important set design, to be sure--each of the productions above, for example, brings out different aspects of Williams' story in different dosages. Some productions will be more palatable to some than to others, but even if you don't like the play itself, you can't deny that the story, scene by scene, simply works.
Let's talk about anchors.
If I were taking a pleasure cruise around the Adriatic Sea, you know what I would vastly appreciate? An anchor. Yes, I want to go and sail and see, but if I don't have the security of an anchor, I'll always be vaguely anxious on the journey. The anchor allows the journey. Stories need anchors, too.
In the case of The Glass Menagerie, the very title is the anchor and must, by necessity, be featured as a prop in the play. But there are others, too, notably Laura's candles. No matter how abstract, these two anchors will be present in every iteration of Williams' play.
In the Father Brown stories, Father Brown himself is the anchor. But so is the landscape, and so is the tone.
If prose and protagonist are stripped to bare essentials, a reader still needs an anchor.
Now, here's the part where we finally get to Hemingway.
Here are two things that a more Spartan voice like Hemingway's does:
Here's an example of him establishing all of these things in one paragraph from his short story, "Indian Camp":
The smoke. That's what I remember after this paragraph. I can smell it. It anchors me to the spot. This woman is more fully realized, at the moment, than the main characters (the young boy Nick, and the relatives we know so little about, the boy's father and his Uncle George). All I can think of is that the men went out of range to smoke, not to keep the smoke out of range, but to keep out of range of the noise. Her husband obviously would have joined them, if he hadn't cut his foot.
Hemingway doesn't tell us how to feel about this, but he does, at the same time. "The room smelled very bad."
Illumination and Silence: Words Never Spoken Are The Ones Worth Hearing
There's a fine line between well-decorated and kitschy.
If you over-explain a thing, or oversell a thing, it always rings false. It doesn't matter how down-to-earth a person claims to be (in fact, the more down-to-earth, the better a person is at seeing through someone's bravado); a human being is an intuitive creature. First impressions are disproportionately powerful for a reason--once someone draws a conclusion about a person or a character or a work in general, the impression tends to stick. If there's an impression of salesman slickness, it'll be hard to wash out.
For example, there are two types of writing, one in ads and one in fiction, that are really one and the same and always make me cringe: the mouth-watering sell.
Those descriptions on menus that try to make you salivate over the watercress and gouda burger or, alternatively, the blackened chicken Cajun gumbo? I always feel a little less dignified after having read them, like I was being seduced by twelve different greasy contenders. I mean, I know what I want. I want a Reuben, thanks, you don't have to sell me a sexier Reuben, I like Reuben just the way he is.
It's worse with character introductions that come from the same vein: you know, the broody guy with rock-hard abs or the ingenue with pliant breasts and supple...something or another. This pretty much screams--have a visceral reaction! Purely physical descriptions that are obviously aimed at one thing--titillation. The reader as a Peeping Tom.
Don't get me wrong. This isn't a high-brow crack at the poor plebeian public; The Washington Post already has a corner on that market. Sexuality, the senses, blood and poetry and guts and good looks: all of that belongs in fiction, because all of that is human.
But selling me a character like you'd sell me a hamburger simply feels undignified to me. Dignity isn't about class, it's about common ground. Don't treat me like a salivating customer, treat me like a partner in this telling. The Oral Tradition of Yesteryear didn't rely on one storyteller alone, but the input of those around the fire. In that case, let my imagination take over. Give me the particulars and I'll give you my particulars. Tell me our man's got a broken nose and I'll imagine which way it's tipped; tell me our lady's got dark hair and I'll give her the reddish sheen in the sun.
Since we've touched on framing scenes already, and touched on the topic of titillation, I want to point out that there's still absolutely nothing wrong with innuendo.
Innuendo, like authorial voice, is all about the quality of light: the colors chosen, the amount, the intensity. Innuendo as illumination. And like authorial voice, innuendo is just as much about the silence kept.
In fact, innuendo is most powerful because of how it's framed (the scene is set up and taken down, even if it's nothing but a fade-to-black) and because the words not spoken are the ones worth hearing.
If you've ever listened to Bruce Springsteen's "I'm On Fire," you've probably listened to it at least fifteen times in a row. Here's a song at once intensely private, like it says it all, and yet it's all too short. Or is it? The very length of the song supports its message: it's the question that's immortal, the answer that's finite. That's not to say that stories with happy and/or definitive endings are sub-par, but that the details are not what is ultimately satisfying. The innuendo of "I'm On Fire" is in the same category as the ending of all classic stories, "And They Lived Happily Ever After." How? Doing what? Does it matter?
The innuendo is even more potent in the official music video:
Nothing happens, and that's what's so moving.
More than that, though. It's about wanting to connect with somebody. And you feel that, even without seeing the Woman's face or knowing the characters' names.
A story works if it connects with the reader. Even if it's a repulsive story; if the story talks of inhumanity but treats the reader as human, it's all right. Even if it's a sad story, and especially if it's a joyful story (which can't help being equal parts sadness and happiness).
Floating Heads and Purple Prose
Of course a reader doesn’t want or need to know everything the character is feeling, or thinking, or wearing. It’s descriptions that attempt to box characters in to a clothing style or cliche that are the real reason for the overbearing rule of “Don’t Reveal the Protagonist’s [Hair Color, Etc.] or Else.”
The problem with sparse or hit-the-ground-running writing arises from characters taking actions before the author has fully realized at least one other aspect of their story: the atmosphere, or the world setting, or the character's relationship to the world or the other characters.
But regardless of style, character descriptions should serve to place the character and the reader in the world, not floating around with a bunch of talking heads.
At the very least, the Floating Head Syndrome is a sure way to get me to not care.
You know, the Floating Head Syndrome: a cold open with two characters talking and dropping names and events like we're supposed to care. This works on an economic level; I mean, I know about these types of stories because they've gotten published. They were accepted out of a sea of submissions and people wanted to pay for them. They work for somebody. There's less concern here with what works at the moment, however, than with what works in the grand scheme of things. We can spend our whole life studying the tenets of story, but they're older than we are; they have lifetimes to teach us, and we can only give them one lifetime. Sounds lofty? Of course. But stories are universal--I mean that literally. This Universe is a Story; science can't explain why it doesn't have an ending. Stories are as common as air and earth, so I'm not trying to speak in an Ivory Tower kind of way about it. It's as much a craft as carpentry or auto mechanics or the Japanese tea ceremony. It takes lofty expectations and elbow grease.
The opposite problem of the Floating Head Syndrome is the Purple Prose Problem.
Now that we've borrowed from plays and music, I say we agree to be democratic beggars and also borrow from film. If innuendo is an example of good cinematography, then Purple Prose is an example of bad cinematography.
As usual, Wikipedia sums it up pretty darn well:
"In literary criticism, purple prose is prose text that is so extravagant, ornate, or flowery as to break the flow and draw excessive attention to itself. Purple prose is characterized by the excessive use of adjectives, adverbs, and metaphors. When it is limited to certain passages, they may be termed purple patches or purple passages, standing out from the rest of the work.... Purple prose is criticized for desaturating the meaning in an author's text by overusing melodramatic and fanciful descriptions."
I like that--the desaturating. Or, oversaturating, as in the example above. In this case, the subject matter and the composition are just fine--but all the "extra touches" simply drown the piece.
In general, people are more afraid of writing purple prose than they are of writing what's known as beige prose or blue prose. Beige prose is taking Spartan prose to an extreme; going for boring even when your characters find themselves in dazzling situations. Blue prose is what most screenplays are troubled with nowadays: the character with the vocabulary of a 14-year-old who just discovered the thrill of four-lettered words.
The result of purple, beige, and blue prose is all the same: an overwhelming blandness.
And none of that garners the relationship we started talking about way at the beginning: trust between the author and the reader.
In the end, though, I discovered that the words themselves are just anchors for the real story: the Story that is told in sighs too deep for words.
In Excelsis Deo.
Making A Screen Saver in Visual Studio C#
Last week, I "completed" this screensaver tutorial. Can't say I completely understand everything that was going on (plus I couldn't get preview mode to work), BUT, it still demystified the following things for me:
Gotta admit, it was a thrill going back over some of the code he had written and suddenly going "A-ha!"
I still couldn't have come up with the code itself, but my ability to read and comprehend is getting better, bit by bit.
Riddle Me This
We talked about programming sub-skills a few blogs back, and we came to a conclusion: puzzles. Mental gymnastics, learning how to learn, puzzles, riddles--yes, good. Good for budding programmers.
So I started doing Sudoku puzzles. But going beyond what I used to do, actually looking into two things:
1) Mental techniques for improving my form/efficiency (I turn off the clock on my Sudoku app, but I started caring about the time it took me to complete a puzzle, inasmuch as I wanted to improve each game and not just coast through each game for entertainment purposes only)
2) Focus (being able to focus on just this puzzle from start to finish, no interruptions)
And there's a third thing hidden in there:
3) Problem solving
Mental techniques, focus, and problem solving: all of them sub-skills for programming.
I couldn't believe it took me an hour to finish my first Sudoku puzzle. It was only on moderate difficulty.
But the second time, it only took me twenty minutes.
And then I couldn't believe the improvement. I even noticed that I was feeling much less foggy within a week of starting the exercise.
The idea of completing Sudoku puzzles actually came from V. Anton Spraul's book, Think Like A Programmer. In it, he also mentions a gem by the name of Sam Loyd.
I mean, just check out this guy's dedicated web site.
Sam Loyd was an early 20th century puzzle pioneer, and his puzzles are just beautiful to look at. I went ahead and ordered a few of his collected riddles and puzzles, figuring they'd make great keepsakes and handy tools for sharpening my rusty problem-solving skills.
So, I guess the lesson from this practicum is: strengthen your mind, not just through programming. Especially when actually being able to understand programming, let alone program yourself, can be slow going.
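As a small bridge between the puzzles and the programming, here's a Python sketch of my own (not from Spraul's book): a function that checks whether one Sudoku unit--a row, column, or box--is valid.

```python
def unit_is_valid(cells):
    """A Sudoku row, column, or box is valid when the digits 1-9 each appear exactly once."""
    return sorted(cells) == list(range(1, 10))

solved_row = [5, 3, 4, 6, 7, 8, 9, 1, 2]
broken_row = [5, 3, 4, 6, 7, 8, 9, 1, 1]   # two 1s, no 2

print(unit_is_valid(solved_row))   # True
print(unit_is_valid(broken_row))   # False
```

Same skill either way: spotting what's missing and what's doubled up.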
Bonus: Things I Tried To Make Learning More Automatic and Organized
1. Defining My Environment
The computer I use to write short stories, browse the internet, check emails, and play video games is the same computer I use to work on my projects, take tutorials, write my blog, and watch lessons.
Oh, and it's also in the room where I read and sleep.
Needless to say, some days the temptation to noise and distraction is overwhelming.
So I tried two things.
I made sure all my clothes remained in my closet, at least 80% of the time.
I lit a candle every time I had trouble focusing on just coding.
Keeping my room an average level of clean kept me from anxiously nitpicking it or anxiously ignoring its faults. And lighting that candle served as a signal to my brain that it was time for one specific thing.
2. Getting Up At The Same Time Every Day
I really struggled with brain fog between Blog #5 and Blog #6. So I did what I knew I should have been doing all along: regulated my sleep schedule.
This is easier said than done. I should say I wake up at relatively the same time every day. And that this one is a work in progress.
But I decided that this task was important, even if it felt only tangentially connected to my more passionate goal, which was to wrap my head around the concepts we've been exploring. I realized, however, that I was chasing stimulation instead of results.
And it's hard to admit, sometimes, that results are a product of time, and not strictly of productivity, or what passes for productivity.
So, I've made consistency a priority, even if it's slow going.
3. 30g of Protein within the first 30 Minutes of the Day
Sometimes I really don't want to do this one (and, um, sometimes I just don't do it), but I know its results first-hand. I started this habit way back in high school and the results spoke for themselves: I lost weight, gained energy, and had a habit I could rely on.
So I retuned this habit recently, taking Timothy Ferriss' advice to eat 30g of protein within the first 30 minutes of waking up. This is one of Ferriss' MEDs (minimum efficient dosages), or the least you can do for the most results. In this case, eating 30g of protein within the first 30 minutes of waking up is a two-fold no-brainer:
A. It regulates fat like nothing else. Without changing anything else in their diet or exercise routine, obese practitioners who put this habit into daily usage saw a monthly increase in weight loss (Ferriss' own dad went from losing 5 pounds a month to 18+ pounds a month from this ONE thing alone--he didn't regulate any other part of his diet and he didn't start hitting the gym).
B. It regulates mood. Some days I'd be fine skipping breakfast--could even feel heroic. But the compound interest would result in a few inefficient, foggy days about a week later--it almost always works like that, doesn't it? The results of our decisions can feel so delayed it's hard to say what caused the sudden lag.
4. Putting A Win At The Beginning of the Week
Ray Bradbury once told struggling writers to aim at writing 52 short stories a year, one for every week. I mean, you can't write 52 bad short stories in a row.
Realizing that the first week of January had yet to pass, I thought--why not? Some would be prompts, some would be flash fiction, some would be just for me, some would definitely be aimed at contests and publishing.
Then I added one more caveat: I'd make sure I got the story done at the beginning of the week. Monday or Tuesday, using Joyce Carol Oates' advice of just writing the rough draft in one complete gulp. "You can edit for weeks afterward." Well, hopefully not, but as weird as it sounds:
We're going for quantity over quality this time around.
And I'm putting this goal at the beginning of the week so I have psychological goodness running through the rest of my week. When I'm struggling with making progress or staying focused or skipping breakfast or some other misstep, I can think--"Yeah, but I finished that thing."
And I finished that thing today, y'all.
It feels good.
Bonus Bonus: A Good Read
Great interview from game designer Chris Avellone.
In Excelsis Deo.
This week I completed Computer Science 101: Master the Theory Behind Programming. Here's my notes!
As a disclaimer, I don't claim to be an expert that can teach you these things, which is why I make sure to link plenty of articles from people who are experts. But using the Feynman Technique, I'll fancy myself a teacher trying to explain these topics to a student. Your patience is appreciated.
Now in the grand scheme of things, this is furiously brief. Check out the course in the link if you're so inclined, or pick up a book (Computational Fairy Tales, The CS Detective, Computer Science Made Easy--just to name a few), or find some free YouTube vids to go further in depth!
Here's our Table O' Contents:
I. The Binary System: How Do It Know?
II. Time Complexity: What We Talk About When We Talk About Runtime
III. Or What's An Algorithm For?
IV. Math: A Way of Thinking About...Everything
V. Who's Afraid of the Big-O Notation?
VI. The ABCs of How To Begin Thinking Like A Programmer
VII. Congratulations, You Made It To The Summary!
But for now, let's hit the ground running.
I. The Binary System: How Do It Know?
This is how computers think:
This is just a short blurb about the binary number system. It's not essential to know this inside-out, but for the purposes of understanding the logic of computers, it's pretty helpful.
"At the fundamental level, the computer only knows zero and one."
Let's break that down.
So, our computer runs on--what? Electricity.
Go over to your nearest light switch. Or just look at it, that's fine.
We switch it on. To a computer, that's the equivalent of a "1."
We switch it off. To a computer, that's the equivalent of a "0."
So, way down deep, underneath all those fancy user interface windows, all that complicated looking code, all the complicated code underneath that code, way way down deep, the computer is really just looking for a series of on and off switches. It's just looking at what's a zero and what's a one.
And that's basically all you need to know here. When we're programming, we're just using human-made languages that work with these on-off switches (also known as transistors). Or you can think of it as passing along the torch to a team of interpreters. Either way, we don't need to concern ourselves with the nitty-gritty, but it's good to know a little of what's going on underneath the hood.
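To make the switch metaphor concrete, here's a tiny Python sketch (my own illustration, not from the course) that reads a row of eight on/off switches as a number:

```python
# Each bit is one on/off switch. A row of eight switches is a byte.
switches = [1, 0, 0, 1, 0, 1, 1, 0]  # on, off, off, on, off, on, on, off

# Read the switches as a binary number, most significant bit first.
value = 0
for bit in switches:
    value = value * 2 + bit

print(value)       # 150
print(bin(value))  # '0b10010110' -- Python's built-in view of the same switches
```

Every character you're reading right now is, way down deep, a row of switches just like that one.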
So now it's time to switch back to a human mode of thinking. Even though it might not sound very human at the first read. In fact, it's going to sound wonderfully science-fictiony.
We're going to talk about time complexity.
II. Time Complexity:
What We Talk About When We Talk About Runtime
What is time complexity?
Well, first of all, what is run-time?
We could break down a software program into two stages, just to simplify things: it compiles and it runs.
First, our program compiles: the human-readable code gets translated into instructions the machine can execute, and the systems get loaded into place. You'll especially see the loading half of this on complex programs, such as highly modular computer games, where you not only load the "vanilla" systems but can also add fan-created modifications to the queue (Skyrim and The Sims come to mind).
In the second half, our program runs. Our program's runtime is simply what happens between when a program opens and when it closes.
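To make "runtime" tangible before we formalize it, here's a quick Python sketch (my illustration, not the course's) that times a simple workload with the standard library. Note that this measures one run on one machine--which is exactly the limitation time complexity gets around:

```python
import time

def count_to(n):
    """A deliberately simple workload: sum the numbers 0 through n-1."""
    total = 0
    for i in range(n):
        total += i
    return total

start = time.perf_counter()
count_to(1_000_000)
elapsed = time.perf_counter() - start
print(f"ran in {elapsed:.4f} seconds")  # wall-clock time for this one run
```

Run it on a faster computer and the number changes; the algorithm doesn't.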
But we've all encountered long load screens and lag before. The underlying code of a program needs to be organized in such a way that it's not tasking the computer to run too many things at once and take longer than it needs to.
There's a way computer scientists can talk to each other and analyze each other's programs; how they can figure out how long and complex a program really is; and how to make it more efficient.
It's called time complexity.
And time complexity is actually about algorithms.
An algorithm is a set of rules a computer follows in order to make calculations and solve problems. When a program is running, it is essentially running through multiple algorithms that control the logic of the program.
Time complexity "is a standard way of analyzing and comparing different algorithms. It allows computer scientists to figure out how to improve what's already out there" (Kurt Anderson's words, not mine).
So, when we talk about runtime, we talk about time complexity, and when we talk about time complexity, we're using the standard of computer scientists around the globe: a method of analysis known as Big-O Notation.
But first, let's open up algorithms a little bit more. After all, they're the bread and butter of computer science (and thus the bread and butter of game development and all other related fields).
III. Or What's An Algorithm For?
To reiterate, an algorithm is a set of rules a computer follows in order to make calculations and solve problems. It's like a recipe, giving a computer a series of steps to follow to complete a computational task. Wikipedia makes it even more succinct:
logic + control = algorithm
And guess what?
We encounter algorithms all day long.
You know plenty of them already.
Here's some examples you picked up in school: long division, the quadratic formula, alphabetizing a list of names.
Here's some examples from everyday life: recipes, furniture assembly instructions, your morning routine.
Here's some examples of technology using algorithms in the background: search engine rankings, GPS route-finding, spam filters.
Here's some examples from video games: enemy pathfinding, damage calculations, procedural map generation.
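Here's one classic algorithm--Euclid's method for the greatest common divisor--sketched in Python (my illustration, not from the course) to make the logic + control formula concrete:

```python
def gcd(a, b):
    # Logic: any number that divides both a and b also divides a % b.
    # Control: repeat until the remainder runs out.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12
```

Two lines of logic, one loop of control: that's the whole recipe.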
And here's the one we're actually going to focus on. We've mentioned it before: Big-O Notation.
But before we talk about Big-O Notation, let's talk about its predecessor, n-Notation. Using caterpillars.
n-Notation looks like this:
f(n)
and reads like this:
n is a function of n
and if you translated that further, that means:
output is a function of input
Here's an illustration of that, using something less abstract than "n":
When a caterpillar enters a cocoon state, it follows the logic of a caterpillar (in this case, its "cocoon function"), and the logical output is a moth. However, this moth isn't a completely new, unrelated factor. The moth is a natural part and extension of the caterpillar. And it's controlled. Remember, algorithms = logic + control. Even though a caterpillar munches on leaves all day long, no amount of logic will turn it into a leaf. In the end, the output must be a function of the caterpillar itself. You could go even further and say this particular caterpillar couldn't even become a butterfly if it wanted to--because this is a specific caterpillar that will turn into a specific moth. Another caterpillar might become the Monarch butterfly, but that would be a different function. Different caterpillar, different function.
Even though we’re dealing with abstract things like ‘n’ or big numbers or what-have-you, this is the basic logic.
So, caterpillar is a function of caterpillar.
So, n is a function of n.
Here's Kurt Anderson's bold note: "n-notation is NOT an analysis of how long an algorithm will take to run, but an analysis of how the algorithm will scale with more and more input data."
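Anderson's note can be seen in miniature with a toy linear search (a sketch of my own): double the input data and the worst-case step count doubles right along with it.

```python
def linear_search_steps(items, target):
    """Count how many comparisons a simple front-to-back search makes."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

# Worst case: the target is the last element, so steps == n.
for n in (10, 100, 1000):
    data = list(range(n))
    print(n, linear_search_steps(data, n - 1))  # 10 10, then 100 100, then 1000 1000
```

The question isn't "how long did this run take?" but "what happens when the pile of data grows?"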
And guess what? We're still not moving on to Big-O Notation.
Nope, in the spirit of Mr. Miyagi, we're going to build up the suspense--but it's all much needed stuff, don't worry.
Continue not to worry, even as I unceremoniously drop our next topic:
IV. Math: A Way of Thinking About...Everything
Yes, computer science is built on math. More accurately, it takes mathematical concepts to explain the reasoning behind computers and their programs.
I'll say this: learning this kind of math has been much more fun and a lot less airy than high school math. We're not talking about train time tables or abstract textbook questions--this has real-world applications, and, even if it takes a couple passes, I found it...amusing.
In fact, when you decide to study math on its own, for its own sake, for your sake (and not your academic system's requirements), it's a whole new world. No less difficult, but now it's a welcome challenge rather than a chore. That's been my experience.
Math, and computer science, are really about thinking.
Computational reasoning wasn't invented by computers; it was invented by other human beings.
Computational reasoning is logical reasoning.
And we often don't give math credit for its sexier attributes: it's explained hypothetical futures, ended wars, illustrated the shape of space, helped construct magnificent cathedrals, enabled rockets, and organized the entire sum of human knowledge.
And today we only have to cover logarithmic functions.
I'll link more in-depth explanations here and here (this last link goes ahead and explains Big-O Notation as well), but I'll quickly try to explain what a logarithmic function is and why it's important by illustrating its opposite:
So, if you've graduated high school, you have at least a cursory knowledge of what an exponential function is.
4^n can be read as 4 to the power of n.
Which means if n = 2, then 4^2 means 4*4, and therefore 4^2 = 16.
And while it's not the same equation, just to get a picture, the graph of y = x^3 looks like this:
That's an exponential function. We're increasing exponentially.
As stated before, an exponential function is the opposite of a logarithmic function.
Let's do the logarithmic version of our exponential function, 4^2 = 16. It would look like this:
log₄16 = ?
log(4) of 16 = ?
We can convert this to the exponential:
4^? = 16
4 to the power of what is equal to 16?
Well, 4 to the power of 2!
So another way to refer to logarithmic functions is to call them inverse functions. They are the inverse of exponential functions. Just look at the graphed comparisons between these two types:
They are inverse. They are opposites.
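That inverse relationship can be checked directly in Python (a sketch of mine using the standard math module):

```python
import math

print(4 ** 2)            # 16: the exponential direction
print(math.log(16, 4))   # ~2.0: the logarithm asks "4 to the power of WHAT is 16?"

# The two functions undo each other (use isclose because of floating point):
x = 7
print(math.isclose(math.log(4 ** x, 4), x))  # True
```

Exponentials blow up; logarithms wind back down. Keep that picture handy for the next section.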
V. Who's Afraid of the Big-O Notation?
Now, the reason why n-Notation and logarithmic functions matter is because they lay the groundwork for our first important algorithm: Big-O Notation.
Q. What does it do?
I like Jeremy Kubica's definition from his book, Computational Fairytales:
"Big-O notation is a method of specifying the worst-case performance of an algorithm as the size of the problem grows. Big-O notation is important for understanding how algorithms scale and for comparing the relative performance of two algorithms."
Q. Why is it called "Big-O Notation", and how is it related to n-Notation?
Look at the chart below:
Big-O Notation is referring to the second notation listed there, the bigger O. (Yes, we can ignore the Greek letters for now).
This chart also notes the relationship that Big-O has to n-Notation. The n-Notation is included right there in the parentheses. Big-O is a function of n. In this case, it's a function that tells us how many operations this algorithm will need to perform to complete the problem at hand.
Q. What does it look like?
The above is a screen capture from this website, which is worth overstudying. Scrolling down the page, you're going to see all sorts of terms, like array and heapsort and insertion, that we're going to see over and over again in our study of computer science and game development.
But we don't need to eat the whole elephant at once.
The last thing we need to think about in regards to Big-O Notation is perhaps the most important.
Q. Why should I care?
We've already established that computer scientists need a way of analyzing an algorithm's efficiency. If we throw, say, 100 elements at it, is it going to take 100 steps for that algorithm to run?
Well, if the algorithm takes one step per element, then yes. And the faster an algorithm's step count grows as n grows, the less desirable that algorithm becomes.
We've already established that Big-O Notation is a way of measuring 'n.' N, in this case, is just our stand-in variable for however many steps a particular algorithm will go through. N steps for N elements. Written in Big-O-ese, that's O(n).
There are two ways to look at Big-O notation: from a mathematical POV and from a computer science POV. We've already tapped the surface of what the mathematical background is (with sizeable gaps, I know, but check out this video here for a better take on the subject), but all we need to know is why computer scientists care about Big-O Notation:
We use it to judge the worst possible outcome so we can devise the best possible solution.
If it's good enough for the pros, it's good enough for us.
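To see "worst-case as the problem grows" in action, here's a sketch of my own counting comparisons in a binary search--an algorithm whose Big-O is O(log n). A million elements take only about twenty steps:

```python
import math

def binary_search_steps(sorted_items, target):
    """Count comparisons while halving the search range each step."""
    lo, hi, steps = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return steps
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

# Worst case (searching for the last element) grows like log2(n) + 1:
for n in (16, 1024, 1_048_576):
    worst = binary_search_steps(list(range(n)), n - 1)
    print(n, worst, round(math.log2(n)) + 1)
```

Compare that with the linear search from earlier: a million elements, a million steps. That's the difference Big-O is built to express.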
VI. The ABCs of How To Begin Thinking Like A Programmer
Andy Harris gave a great speech a few years ago about how to think like a programmer. Or, in his case, how to teach a beginner about programming. The speech is an hour long, so here's the breakdown:
A. Coding is not about the programming language.
B. You can learn one or two dozen languages, but the same basic concepts apply across the board.
C. It's that way of thinking about and combining these concepts to solve problems that constitutes coding.
D. According to Harris, there are only about eight concepts in all of programming:
E. Declaring a variable with appropriate name, type, and value.
F. This is actually an algorithm, too. Remember, an algorithm is a recipe. Here's what the above is saying to the computer: "Create a variable called Name, of type Type, that starts with the value InitialValue."
G. Output. Telling the user stuff. print ("Hello World") and the computer will print "Hello World" to the console.
H. Input. Asking the user for input before you give them output.
I. Convert to integer. Take something that's not an integer and make it into one (an integer being any whole number: positive, negative, or zero). Perhaps taking a float number (3.14) and converting it.
J. For loop. (This and the while loop are best saved for an in-depth discussion at a later date).
K. While loop. (Ditto).
L. Debugging. A.K.A. Why is this broken?
M. So the #1 skill we are after is so obvious it hurts: problem solving.
N. In Harris' own words, "The secret isn't code, it's algorithms and data."
O. Further, "If you're lost in coding, it probably means you shouldn't be coding yet." (I.E., learn how to think first)
P. Comments aren't there to explain the code to other programmers; code is there to explain the comments.
Q. We're all beginners.
R. Best Practice: Write an algorithm in plain English before you start coding.
S. Failure is WONDERFUL.
T. Begin debugging now.
U. Of course, the best way to debug is to not have bugs.
V. "Bad implementation can be Googled, bad algorithms cannot."
W. "Don't start with the solution; start by truly understanding the problem."
X. "Nothing is messier than game code...you don't write it to be maintained, you write it to be fast."
Y. It will all happen in good time.
Z. And finally, his recommendation for what to learn after you're comfortable thinking computationally:
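Harris' core concepts can be tied together in one toy program. Here's a Python sketch of my own (not Harris') touching variables, conversion to integer, output, and both kinds of loop; real input() is stubbed out with a string so the example stays self-contained:

```python
# Variable: a name, an (inferred) type, and an initial value.
player_name = "Nick"     # str
score_text = "42"        # str -- pretend this came from input()

# Convert to integer: input() always hands back text.
score = int(score_text)

# Output: telling the user stuff.
print(f"{player_name} starts with {score} points")

# For loop: repeat a known number of times.
for round_number in range(1, 4):
    score += 10
    print(f"round {round_number}: {score} points")

# While loop: repeat until a condition changes.
while score > 50:
    score -= 25
print(f"final score: {score}")
```

And the eighth concept, debugging, kicks in the moment the final score isn't what you expected.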
VII. Congratulations, You Made It To The Summary!
Happy New Year Everyone!
In Excelsis Deo.
K.W. writes novels, short stories, the occasional ode, game scripts, and (with actual evidence!), this here blog.