De-Coder’s Ring

Consumable Security and Technology

Category: software development

Technology: The first class citizen

I’ve spent the last few days at Capital One’s Software Engineering conference. How cool is that? Hundreds of tech folks gathering for a few days to discuss areas of technology: modern stacks, processes, and new paradigms.

For me, that meant about a half dozen talks on machine learning, the Go programming language, and encryption. The speakers were excellent, and if I play my cards right, I’ll get a few of them on here as guest bloggers!

What topics would you want to hear about?

Security: Code and Passwords

Developers’ jobs aren’t easy. Constant deadlines, integrating new technologies, dealing with ‘Ted’ in the cube next to you who shouldn’t be eating those onion rings… you get it: lots of issues. #notsnowflakes

stressed out developer


Now we’re forced to live in this modern world of DevOps. No longer can we rely on system administrators to maintain systems. No longer can we rely on release engineers to package and ship our code. Now we own it all.

Some of us adapt. Don’t get me wrong, it’s not an easy task. Most of us don’t have enough Linux-fu, or the ingrained processes to maintain a large Elasticsearch cluster… but we cope. We learn new skills and grow in breadth of knowledge… then that breadth gets deeper. Holy cow, we’re valuable now!

Unfortunately, security still is not a top-tier concern for most software engineers. We have web exploits to worry about. We have SQL injection to worry about. Stack overflows, kernel panics, all kinds of neat stuff… any of which can be the beginning of a piece of vulnerable software.

The one that continues to kill me (and I have a feeling it was behind a major breach in the US this week) has to do with account and environment credentials. There are so many scenarios that require an application to know about credentials:

  1. Database connectivity
  2. External API/Service
  3. Mail servers

…and tons more. How do we deal with it?

There are a few anti-patterns. Bad things; don’t do these:

  1. Hard code the credentials in your code
  2. Use a configuration file, check it into source control
  3. Use environment variables in your public facing website to connect to your super secret database

Those are all dumb. Don’t do any of them.

What can we do?

Separation of connectivity. Your web application shouldn’t call your database directly, especially if that database holds customer data, personally identifiable information, or healthcare info. That’d be dumb. Connect your web application to an API layer, but still follow the ‘other’ advice below.

Supply the passwords at runtime

Use a password vault/key management system to supply passwords to an application. Build that into your application framework so your code doesn’t have to be aware of where the password came from. A password vault is a high-security system that allows an authorized application to make a secure request for private information. For instance, your vault could store the ‘production customer database’ entry, which could include the host name, port, username, and password of the database.
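As a sketch of what supplying passwords at runtime can look like, here’s a tiny shell function modeled on a HashiCorp Vault style KV endpoint. The address, token handling, and secret name are all made up for illustration, not a real deployment:

```shell
#!/bin/bash
# Hypothetical sketch: fetch a database password from a vault at runtime.
# VAULT_ADDR, VAULT_TOKEN, and the secret path are illustrative assumptions.

get_db_password() {
  # Ask the vault's HTTP API for the secret, then pull the password
  # field out of the JSON response.
  curl -s -H "X-Vault-Token: ${VAULT_TOKEN:-}" \
    "${VAULT_ADDR:-}/v1/secret/data/prod-customer-db" |
    python3 -c 'import sys, json; print(json.load(sys.stdin)["data"]["data"]["password"])'
}

# Usage: DB_PASS="$(get_db_password)" -- the password lives only in memory,
# never in code or source control.
```

The point is that the application asks for the credential when it needs it; nothing sensitive ever lands in your repo or on disk.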

Different environments get different credentials

This one is pretty obvious, but sometimes even the best of us don’t follow it to a T… umm, no, not me… others… yeah, others. Just like with your websites, always use different passwords for everything. Don’t reuse credentials between a QA environment and a production environment.
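One way to enforce this is to key every secret lookup by environment, so QA code physically cannot ask for production credentials. A minimal sketch (the path layout and the APP_ENV variable are assumptions, not a prescribed scheme):

```shell
#!/bin/bash
# Hypothetical sketch: scope vault secret paths by environment so
# QA and production can never share a credential.

secret_path_for() {
  local env="$1"    # e.g. "qa" or "production"
  echo "secret/data/${env}/customer-db"
}

# Usage: look up "$(secret_path_for "$APP_ENV")" in your vault;
# each environment's path holds its own, different credentials.
```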

Provision as much as you can in configuration

Putting configuration items, or items that MAY become configurable, in code is a bad move. You’re gonna have a bad time.

You're going to have a bad time


Always use configuration files. In the example above, the configuration file would tell your application where to find the password vault: not the passwords, and not even the database configuration.
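For instance, a minimal config file in this setup might hold nothing but the vault’s location and the name of the entry to ask for (the keys and values here are made-up placeholders):

```json
{
  "vault_addr": "https://vault.internal.example.com:8200",
  "vault_secret_path": "secret/data/production/customer-db"
}
```

Leak this file and an attacker still has nothing usable without an authorized identity to present to the vault.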

Act like your data is exploited

This point goes kind of against the other development tips. When building applications, always remember there’s a chance your database ends up on the internet. No one wants to think about it, but look at Equifax. Look at Deloitte. Look at Aetna. Target. Etc. They got owned, and you very well may too. Don’t live in fear, but do live in paranoia!


Neo4J Tutorial – Published Video!

You may have noticed that my stream-of-thought posts on Neo4j have dried up. It’s pained me, cause, you know, drawing the balls is fun.

Today, I get to announce a published video tutorial on Neo4J by Packt Publishing!

We developed an in depth course, covering a bunch of graph database and Neo4J problems, ranging from:

  • Installation
  • What is a graph database?
  • Comparing to a relational database
  • Using Cypher Query Language (Cypher, CQL)
  • Looking at various functions in Neo4j
  • Query profiling

It was a ton of work, over a few months, but, the support from Packt was great.  I’m really looking forward to getting feedback on the course!

Packt Publishing – Learning Neo4j Graphs and Cypher





Simple Tip: Provision an Elasticsearch Node Automatically!

I built out a new Elasticsearch 5.4 cluster today.

Typically, it’s a tedious task. I haven’t invested in any infrastructure automation technology because, well, there aren’t enough hours in the day. Then I remembered a trick a few of us came up with at a bank I used to work for: a shell script in AWS S3 that gets downloaded by an EC2 user-data init script, and bam, off to the races!

I won’t give away any real tricks here, since my boss would kick me… again. But since this process was used heavily by me and my team previously, I don’t mind sharing the approach.

We didn’t use it specifically for Elasticsearch, but, you can get the gist of how to use it in other applications.

First step: upload the script to AWS S3. Here, I’ll use an example bucket of “” – that’s my bucket, don’t try to own it. For reals.

Let’s call the script “”

The provision file can look something like this:
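I can’t share the real script, but a hypothetical sketch of such a provision file might look like the following. Everything here – the Elasticsearch version, package choices, cluster name, and config values – is made up for illustration; adjust for your distro and cluster:

```shell
#!/bin/bash
# provision-es.sh - hypothetical provisioning sketch, not the real script.
set -e

ES_VERSION="5.4.0"

# Derive a unique node name from the EC2 instance id, pulled from the
# instance metadata service, so no two nodes share a name.
es_node_name() {
  local instance_id
  instance_id=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
  echo "es-${instance_id}"
}

# Install Java and Elasticsearch from the official artifacts repo
# (Debian/Ubuntu flavored; adjust for your distro).
install_es() {
  apt-get update -y
  apt-get install -y openjdk-8-jre-headless
  curl -sLO "https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-${ES_VERSION}.deb"
  dpkg -i "elasticsearch-${ES_VERSION}.deb"
}

# Append the minimal per-node config: cluster name plus the unique node name.
write_es_config() {
  cat >> /etc/elasticsearch/elasticsearch.yml <<EOF
cluster.name: my-es-cluster
node.name: $(es_node_name)
EOF
}

# On the fresh instance (as root): install_es && write_es_config,
# then start the elasticsearch service.
```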

You’ll see reference to an that’s super simple:

I made up a security group, etc. Configure the security group to whatever you’re using for your ES cluster, and change the bucket to your own bucket.

Each ES host needs a unique name (beats me what will happen to Elasticsearch if you have multiple nodes with the same name… they’re geniuses… it’s probably fine, but you can test it, not me). Alternatively, try using your instance ID as your node name!

Then your user init data looks super stupid and simple:
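As a sketch (the bucket, script name, and log path below are placeholders, not the real ones), the user data only has to pull the provision script down from S3 and run it. Write it to a local file and attach it at launch:

```shell
#!/bin/bash
# Write a hypothetical EC2 user-data script to a local file; the bucket,
# script name, and log path are placeholders.
cat > userdata.sh <<'EOF'
#!/bin/bash
# Runs once at first boot, as root: grab the provision script and run it.
aws s3 cp s3://my-provision-bucket/provision-es.sh /tmp/provision-es.sh
bash /tmp/provision-es.sh > /var/log/provision-es.log 2>&1
EOF

# Attach userdata.sh as the instance's user data when launching the node.
```

Note this assumes the instance’s IAM role is allowed to read from that bucket; without it, the `aws s3 cp` will fail at boot.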

Add user data

Once you complete your EC2 creation, you can verify the output in:



Ship it!

Every time I forget the mantra “F(orget) It, Ship It”, things don’t go well. Analysis paralysis. Developing toward a stale goal.

Historically, projects get bogged down for ages making sure everything is “perfect”.

Face it, it’s never perfect.  Ever.

This applies to software, companies, features, church activities, and anything else that’s new and untried. Analysis and rework are the killers of new ideas.

I build products for people to use.  I know the data that my products use.  I know some of the pain points I’m trying to solve for customers (current and future!).  It’s SO easy to say “oh dang, let’s just add this one more XYZ widget before we call MVP”.  It’s easier to add new features than it is to declare a product “good enough”.


Yes, I did, and will again.  Nothing is ever perfect, and “good enough” is not a declaration on the quality/reliability/security of a new piece of software code.  It’s ‘good enough’ for someone to use.  This is why we strive for a minimally viable product, or MVP.

Counter that with the bad-attitude version of “good enough”: a statement of laziness, of not having professional quality standards, and of not giving a crap about what happens once something leaves your desk. That is NOT what I’m advocating.

Draw a line in the sand

Before you build, define your target. Define your MVP. Define what is ‘good enough’ for your customer. It can’t suck. It has to add value. It has to be easy (enough) to use. It can’t be ugly, but it doesn’t have to be a work of art. Ever see the first Google home page, or the first version of Splunk? Compare them to the current interfaces. That’s ‘good enough’ at work.




© 2018 De-Coder’s Ring
