Revising the creative process

The first step to resolving a conceptual (combining two or more things to make something new) problem with relevancy begins with understanding that conceptual thinking is not opportunistic. It is episodic.

There is no top-down, left, or right convention. We deal with an existing set of realities, and by doing so we’re dealing with concepts that have already been resolved by other expressions of creativity—considered perfect in most regards solely because of prevalence. They are correct and work because they’ve been experienced and utilized, establishing form & function and creating a unique, relative perception or utility for other people.

To suggest there is a formula or process suggests everything else in the world is innately wrong. The information we’ve received through observation, statistics, and opinions must be considered factual in order to conceptualize something with relevancy. If not, principally speaking, the conceptual process will end before it begins, and relevancy will be eliminated.

The creative process does not see the tree before the forest. It sees the forest and systematically works back in relation to the whole on a broader experiential plane. To scrutinize the height of the tree and its foliage is destructive and impulsive. As it is, the forest should be considered perfect. It has no immediate relative meaning to us; therefore there are no problems or logistical mistakes. And if you assume this is a top-down process, then I urge the objective eye to look closer. Once you’ve broken the tree down, you’ll begin to see the leaves, then the veins, then the texture of the leaves, and so on, until you are so completely removed from the first and most obvious plane (common relevance) that it would be near impossible to conceptualize a relatable experience for those of us still waiting by the tree line. The objective/solution is too far away, and unrelated to the common surveyors. The common surveyor is now lost in the forest, and the only thing they care about is getting out.

Once interested in the forest, the prospect may focus in on a path. Offering an experience related to a new concept is equivalent to offering choice. Choice denotes value for the surveyor thereafter. Math whizzes refer to this as infinite regression. If your idea or concept isn’t based on existing constructs, you’re likely to lose your surveyors’ attention or interest.

Again, to suggest there is a formula is to suggest everything else is wrong. We know that mother nature is not wrong. A problem may exist relative to us, but that does not make it wrong, or mean that it is not an observable truth or fact. A concept that is built on an existing set of concepts must be treated as fact. And we must assume that the initial parameters of these facts are not going to change. They are perfect truths. In agreeing to this, we must realize that if the parameters within the facts do change, the concept must be reevaluated entirely. A truth or fact is personal context, and if context fails, there is no perceivable relevancy. Information cannot be introduced into the concept without expecting the relevancy of the concept to change as well.

Information arranged and organized through symbols, colors, pictures, and words creates content. Content presented in relationship to existing content, either paralleling or contrasting it, creates context. Context creates relevancy.

Removing the term creativity from this explanation and replacing it with solution, we might explain this process as: an existing system with a series of subsystems that, based on their individual relationships, continue to create additional sets of subsystems with each reaction thereafter.

The value of creating

Emulation, as a pseudo-neurological phenomenon, has propelled learning further than most give it credit for. I have no scientific proof, but I’m going to pontificate with a capital ‘p’ for the sake of your attention, and juxtapose the word over more prescriptive words for the sake of my point —
We learn through emulation — copying the footsteps of a dance teaches us that dance; taking a photograph is an emulation of a past experience; swimming with flippers represents our ability to emulate fish fins after first observing fish. This is how I think most people learn, generally speaking. We’re visual and tactile, auditory, and lastly, conceptual.

As an aspiring comic book artist, I copied my favorite page spreads. Eventually it was time to stop copying others’ work and start creating from memory — from “scratch,” as I would say back then.
But I learned through emulation, an uninhibited ability to express myself through another person’s visual language, copying pictures in order to see what other artists saw. Eventually I matured, developing my own visual language, refining and honing it in a way that represented my specific ability. Writers similarly read their favorite books and authors.

If I had not had the freedom to copy, to learn through emulation, I would not have developed a personal technique; skill; tacit behaviors that define my work against others’ work, creating both an identity and a perceived value.

“I quote others only in order the better to express myself”
– Michel de Montaigne

In my opinion we’ve eclipsed sagacious and empirical learning — in the past suppressed through religion, but near impossible to remove from the arts — and supplanted it with simulated and emulated emotions. We’re no longer influenced and raised by “the people in our neighborhood,” if even in our own country. We’re not experiencing much of anything face-to-face for extended durations; quality time with each other often seems non-existent.

Our ability to read a stranger’s body language as friendly or aggressive has diminished. Has body language itself diminished while our attentions are focused elsewhere? Our ability to recognize the look of attraction versus curiosity is blunted; we stare at each other like dolts. And so our verbal skills suffer, because we’ve forgotten, and in some cases neglected, to share the basics: stand straight, eyes forward, speak clearly and confidently.

Everything is entertainment and content designed to project the world in an exaggerated fashion — exciting — unreal — never going to happen in our lifetime — and it’s unhealthy. Our memories suffer with nothing significant to mark the day — the week — our lives. Content has supplanted memory, emotion, and experience for many.

More so than at any other time in history, people are xeroxing each other’s knowledge and emotions — mostly emotions and actions they’ve learned from actors, actors paid to dramatize life. We don’t talk to each other in one-liners and expect a laugh track with each punchline, and we’re not nearly as frenetic as actors appear onscreen.

But that’s what we’re developing — people xeroxing uninformed emotions learned from actors and serialized content from streaming media — for hours straight — days in a row — desensitized in their empathic senses, and resolute in learning passively through confabulatory and idiosyncratic identities. We’ve become mostly irreverent, indifferent, unaware of how others are actually living.

And when I say others, I mean our true next-door neighbors, colleagues, friends — maybe our own partners or children. Life doesn’t have pacing — it just happens. Everyone needs a moment to remember, or improvise, their lines — we should embrace the moments that mark our daily life more significantly. Content doesn’t constitute experience. Watching life doesn’t equate to living life.

Our voices are being drowned out by dramatic presentation and representation. Our imaginations are stifled by 24 images per second. And our ability to shape the world around us is a distant memory, illuminated by HD-quality pictures.

Until — ?

To not look at another drawing and draw from memory took practice, propensity and finally, prosperity. It took a shitload of months and years. And people crapped on my portfolio ALL the time. Someone eventually gave me a desk —

I found my “face,” as they used to say when referring to a new visual “style”. Vigorous encouragement from friends who genuinely took interest in my “thing” kept me focused. Their interest taught me to respect my craft. And mentors taught me to understand it through impositions.

Technique is a perfect mistake. Failure is okay. It’s the only way to discover your technique or signature — ?

Mass exploitation and self-exploitation is not art, or an art form. The commercial promise collapsed years ago under the moniker of “user-generated content”. Our culture is a simulation: viewed through a historical lens, mother nature’s force once squeezed us into individual diamonds, polished and strong; now we are soft clay, constantly affected by the forces around us, shaped too easily by emulated emotions, too —

We have reached peak media.
We prosper more through participation and emulation than by passively watching. Our lives are marked more lucidly, making our creativity more original and more informed with each demonstration.

Can religion coexist with science?

It’s contingent on accepting fact over fiction. A large part of society truly believes they do not need to be concerned with the world as it is — today — here — now.

We have another part of society that believes their religion and god(s) afford them a higher role in society, over those of us that do not subscribe to their religions.

We have another part of society that uses their religion and god(s) as political platform for constituting social order through fear and oppression.

We have another part of society that thinks, since their religion and god(s) are benign (from a contemporary standpoint, due to years of amendments removing all traces of historical violence), they can supersede fact with fiction by remaining passive.

We have another part of society that uses religion and god(s) for capital gain through commercialization of hope and fear.

As our knowledge grows, and we get better at sharing it with each other in order to separate fact from fiction, we should discover something more useful—ideally—

Why do we avoid blaming—the blame—religion, politricks?

It is a manifestation of people unable to deal with both real events requiring action, and real emotions that deal with reactions.

It is, in my opinion, a mass-psychosis induced by a generation* brought up entirely on media screens, unfettered by the real—unabated processing and absorption of information—as spectators only. Life-experience is traded out for video snippets, sound-bites, and hashtags. None of which provide insight; wisdom; sagacity.

Conviviality vs. truth; because on TV, everyone has a clever quip—an obvious insight—an amicable solution—nobody gets hurt when the studio lights go dark.

Conversely, there are many people working for the media that incite riot and innuendo as professional pretenders. They work for agendas, maximizing our attention on narrowed and limited ideas.

We’ve unsuccessfully moved from being noise junkies to noise producers—trading our value centers for contrived idealisms. We are coming dangerously close to losing our ability to provide well-thought and informed ideas—

Let alone long-term memory retention of the things we’ve created. Their respective failures and successes are instantly magnified, then reduced to nothing, tossed out, and overwritten. As a rapid-prototyping creative democracy approaches, we forget before we’ve learned. The most precious of commodities (the exchange, implementation, and advancement of ideas through fair and ubiquitous streams of information) can potentially hurt us when we can’t objectively review and participate in the abstract world.

As the micro-centralized, neo-feeling thought-idealists collect, where, or how, do we continue to provide unlimited access and information through a synchronized operating system (the internet) without censoring people and creating the same depreciated socio-economic walls that exist offline?

Youth is conflated with vulnerability, amplified by predators still politricking the system. But this time around, we’ve given our attention to social networks that censor our thoughts like authoritarian aristocrats who have no intention of working with us, but re-education is fine.

If we can’t learn to cooperate, we’ll find ourselves under the thumb of those who are comfortable and happy to control our lives for us. If we continue to mis-categorize our language and emotions—if we can’t address issues as they are—we can’t provide tangible solutions.

We must learn to contend with it every day. Most importantly, we must all be more informed about the ideas we’re exposed to. The You; We; Me and I in media will always lead to Us.

*Young and old

Does constraint positively influence creativity?

It’s a myth. Or, perhaps, a misnomer at worst. Creativity through conceptual conflict is more appropriate. Does X fit with A? If not, how can I make it fit—*

With this said, the only constraint or conflict that exists unilaterally for everyone is people who approve or disapprove of the creative expression. Individuals who posit that they’re experts in knowing what creativity is and how it applies to the problem—ugh. You know the types. They see something new but can’t process its relative association. So they find deprecated statistics, conflate correlations from the old thing with the new thing, and then systematically attenuate the new to be more like the old (they’re being creative, too!). Some do it deliberately—others, as a result of their education and personal experience.

We all demonstrate creative thinking, daily, hourly, minute-by-minute. The commercial and industrial world loves to tout this as a selling point to garner new business from prospects. In a sense, we’ve allowed pundits and self-appointed arbiters to propagate their narrowed field of perception amongst us all.

Creativity is the combining of two or more elements in order to synthesize something new.

We do not judge this new thing, whether it’s a good or bad expression of creativity, but we accept that this expression has resulted in a familiar likeness with this new thing. By accepting it, the technically capable begin accentuating the expression into something emotionally and mentally approachable.

With this said, many people and companies presume that there is a process: something repeatable, learnable on a mass scale, predicated on the industrial model, meant to consistently produce and express creativity with identical success each time.

Unfortunately “creative process**” is an oxymoron. Creativity is idiosyncratic, rarely pragmatic, and often a result of informed ignorance (another oxymoron). Technique is making a perfect mistake—an unintended discovery, while having honed the technical skills to refine and shape the mistake (or discovery) into something familiar. Technique is what defines the expression of creativity into something useful; acceptable; beautiful, ugly, useless, abstract—

We should also be clear and isolate creativity from art, or art forms. You do not need to be an artist to express creativity. But artists (in all fields/art-forms) are generally more receptive to external input and thinking, translating it through an art form and yielding a form of creative expression (writing, music, design, art, sculpting, architecture, fashion, industrial design). Show me anything in the modern world that wasn’t created by someone (nature excluded), regardless of their job title or function, artistic ability or skill. The world around us has been shaped and refined through creative human expression, for better or worse.

Implying that constraints or limitations provoke creative thinking is a self-induced illusion, and another creative attempt at expressing—expression. It generally depends on the individual and their ability to receive, interpret, translate, and express their creativity within the external world.

How badly do you want to express something, and how hard are you willing to work for others to understand it?

Many people incubate, many rapidly iterate, many work methodically—linearly. But under no circumstances does everyone work in the exact same way. That would be the antithesis of creativity, and would limit the diversity of thought and the expressions of culture and humanity as a whole. We’d all be a bunch of self-replicating, unaware bots.

Many people can and do work in teams, but rarely do you ever see one head and many wrists. If you do, it’s because many of the concepts and creative forms have already been decided or expressed in such a way that others are simply copying a defined technique, not engaging in a creative process.

*This implies that creativity is a result of a specific problem or task, but creativity is not dependent on the need to purposefully solve or complete a task

**I forget where, or from whom, this term originated—

New UI trends

I’ve been utilizing a lot of these types of models within the past six months, as I’m sure a lot of other designers are. Here are a few of the models that have been introduced to the masses with little resistance and a low learning curve.

Dashboards : Top-tier navigational or user-initiated functions located in a fixed or anchored position within the environment (memory retention)

Less iconography : Text-based navigation elements, while icons are reserved for second- or third-tier subsets of functions located within the main content (progression and regression within the environment forces elements to become smaller or larger based on the depth of the interactive experience)

Interstitials and pagination of content : While subtle kinetic elements that introduce new information within interactive experiences are still popular, we’re seeing more content presented in a linear format, animated in an obvious and recognizable way. This has led to more open-space design, because more content can be served up with less clutter for the user

Device independence : A lot of designs are neutral in style, development, and function because companies have to rebuild and distribute apps across many different platforms. So you want a design that doesn’t require a lot of development, production, and redesign for multiple devices. You want consistency and familiarity across every device

Bandwidth : Clean and subtle designs don’t take up a lot of bandwidth

Pale Blue Dot

Carl Sagan:

“Look again at that dot. That’s here. That’s home. That’s us. On it everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives. The aggregate of our joy and suffering, thousands of confident religions, ideologies, and economic doctrines, every hunter and forager, every hero and coward, every creator and destroyer of civilization, every king and peasant, every young couple in love, every mother and father, hopeful child, inventor and explorer, every teacher of morals, every corrupt politician, every ‘superstar,’ every ‘supreme leader,’ every saint and sinner in the history of our species lived there — on a mote of dust suspended in a sunbeam.

The Earth is a very small stage in a vast cosmic arena. Think of the rivers of blood spilled by all those generals and emperors so that, in glory and triumph, they could become the momentary masters of a fraction of a dot. Think of the endless cruelties visited by the inhabitants of one corner of this pixel on the scarcely distinguishable inhabitants of some other corner, how frequent their misunderstandings, how eager they are to kill one another, how fervent their hatreds.

Our posturings, our imagined self-importance, the delusion that we have some privileged position in the Universe, are challenged by this point of pale light. Our planet is a lonely speck in the great enveloping cosmic dark. In our obscurity, in all this vastness, there is no hint that help will come from elsewhere to save us from ourselves.

The Earth is the only world known so far to harbor life. There is nowhere else, at least in the near future, to which our species could migrate. Visit, yes. Settle, not yet. Like it or not, for the moment the Earth is where we make our stand.

It has been said that astronomy is a humbling and character-building experience. There is perhaps no better demonstration of the folly of human conceits than this distant image of our tiny world. To me, it underscores our responsibility to deal more kindly with one another, and to preserve and cherish the pale blue dot, the only home we’ve ever known.”

Originality is done

I hear this a lot, expressed in many different ways: It’s been done before; I’ve seen this, but a little different; This reminds me of that one thing; So-and-so is already doing something similar.

This is a lonely sandbox to play in, isn’t it? I think this is oxymoronic. Ideas beget ideas. Perhaps the real point of interest is that we’re exposed to more ideas and communication than ever before in history. It’s not that lots of ideas and notions haven’t been thought of in previous years.

Today we have direct access to a lot of content. And while ideas borrow from ideas, there will always be originality. So it may appear that everything has been done before (the lonely sandbox), but questioning the propensity for originality is analogous to suggesting that a wad of clay will always end up the same shaped ceramic.

Turn off all the content long enough to think about what it is you’re thinking about. Spend time on iterative design, thinking and exploration. It’s okay to be a little like something else, too. Technique is making a perfect mistake. It has nothing to do with talent.

U and I

User Interface: Communicate, Direct, Alert
User Experience: Express, Emote, Experience

These two terms mean more than their current contextual usage implies. These words relate to everything we see, experience, buy, drive–things we use. We’re always excited and inspired by the new, but find it callous, cold, and unusable without familiarity from the old.

The icon on an iPhone shows a traditional phone receiver as an indicator to make a call from a device that looks nothing like its predecessor. Yet it does the same thing a standard landline phone does. Their basic purpose and function remain the same, and their ergonomic functions are identical, too, even though they look nothing alike. Intentional or not, we’ve slowly accepted progressions like this throughout our lives, in more areas than just the phone.

There are several layers to user interface design and user experience. There will always be a compromise in designing for the few versus the many, especially considering the rate of adoption and the learning required for adaptive or just generally new things.

A year or two ago, I posted a definition for Skeuomorph. Over the years, through graphic design, writing, user interface design, art (anything that requires an expression of thought to relate to a narrowed perspective), I’ve learned that there are no rules. There is only familiarity with what people know to be a norm, and what people are willing to accept as a new form of graphic design, writing, user interface design, art. In my industry, clients often cling to the old while I strive for the new; ideally, we arrive at a median that satisfies the people who will be exposed to the graphic design, writing, user interface design, art–things.

As I continue to reiterate and innovate, as best as my abilities allow, I’ve come across another term that has been around since the 1930s: Umwelt. The term is usually translated as “self-centered world”. In a sense, it means: to make meaning(s) possible for you. This is a very holistic approach to how we as people communicate, navigate, and express ourselves using things we didn’t necessarily create ourselves or even need initially. Familiarity often breeds contempt. Learning something new, or discovering a thing that may have been previously inexpressible or hard to understand, means a lot to us as people. It’s like a leap forward. The epiphany, cognizance, awareness–we feel like our IQ jumped 10 points.

Design or create with purpose, even if its end purpose wasn’t your initial intention. Everything we create is utilized, seen, or felt by another human being. Skeuomorph and Umwelt–I’ll be working harder than ever to express these two words in my day-to-day activities.

Utility + Message = Experience ≠ Branding

Someone posed the question: Is Consumer Generated Advertising the next big thing? And by someone, I mean someone on a participant network. I was a little confused by this question for a few reasons.

The fact that someone has asked this question in a quasi-public forum, while having it answered by industry professionals and lurkers alike, is testimony enough. This survey is community-driven and, in effect, user-generated. It has been contributed to and advanced upon by many, without provocation or payment. I think the term Consumer Generated Advertising (CGA) is over-defined, or not implicit enough. Every brand should expect that all users will participate in brand development. And as technology for synchronized communication, with ubiquitous access to information on home and mobile devices, grows closer, we’ll see more participation. A brand that enables and contributes to its prospects’ voice(s) will see a greater return on its marketing. Public relations, as a term, has taken center stage from the industry’s perspective as a way to market itself. But it has no more value than any other channel in the branding arena.

Utility + Message = Experience ≠ Branding.

Let’s also consider something else: this approach to marketing is not always the best approach for every brand. CGA isn’t a push; it’s an unexpected result. People creating their own how-to’s with products and services are doing it because they can, not because they were asked to endorse something. This type of CGA is, in itself, nothing new. Every day we help others out, especially with recommendations and demonstrations of our intimate understanding of a topic. If a brand provides a product, service, or idea that merits discussion, it will happen with or without public relations.

Remove those mental barriers.
Mitigate crisis with a conversation.