PASS Summit 2015: The Recap

Another PASS Summit has ended, and again the days and evenings just flew by.  I’m not intending this to be a long-winded post, just some notes and a few highlights.  We’ll see how I do.

First, a huge thank you to those of you who attended my sessions on Wednesday.  I had wonderful crowds and great questions – it is such an honor to present at Summit, and I so enjoyed both sessions.  If you are looking for the decks and the scripts: I have sent updated slide decks to PASS (I admit, I tweak them until the very end).  When they are uploaded you can pull them from the Summit site:

Kicking and Screaming: Replacing Profiler with Extended Events

Statistics and Query Plans

As for the demos, they can be found on the SQLskills Resources page, under the PASS Summit 2015 section.  The link is also listed in the slide decks.

A few weeks ago I had mentioned that I was supporting Argenis Without Borders 2.0 by wearing a costume at Summit.  The cool news first: Argenis and Kirsten raised $25,000.  Yes I have the comma in the right place: TWENTY-FIVE THOUSAND DOLLARS.  Ah-mazing.  As for my costume?  I went as Perry.  You may know that I take a stuffed Perry with me when I travel and take pictures of him to send to my kids, so why not dress up as him?  It was great, and huge thanks to Jes (Wonder Woman in the photo below) for the help.  Sadly, you can’t see my orange shoes in these pictures.

Argenis (as Ted) and me


Wonder Woman and Perry, supporting Argenis Without Borders 2.0


With my sessions finished on Wednesday, I had time on Thursday and Friday to catch up with friends, meet some new people, and sit in on some sessions.  There are some fantastic features coming in SQL Server 2016, and I’ll be making time over the next few months to start working with a few of them, as well as getting better acquainted with recently-released features that are starting to mature.  I am not at MVP Summit this week, which is a bummer on multiple levels.  However, I have been gone for three out of four weeks in October, and being away from my family for another week would have been a bigger bummer.  Life is about choices and balance :)

On a final note, I saw on Twitter that David Maxwell was the winner of Speaker Idol, meaning that he will present at the 2016 PASS Summit (to be held October 26-29, 2016).  David has worked hard over the last couple years – presenting at user groups and SQLSaturdays – steadily working to improve and hone his skills and style.  He is a great example of a community member that decided to step up and start speaking, and now he’ll have a spot at Summit next year.  Congratulations David!  And to those of you who have ever sat in a session and thought, “I could do that,” or “I would like to try that,” then let me ask you, “What are you waiting for?”  There are many people in the community who are willing to help new speakers (including me), but you need to take initiative and reach out to them when you’re ready.  Who knows, you could be speaking at Summit, or another conference, down the road.

For those of you that were at Summit, I hope you catch up on sleep and email in the next couple days, and I hope to see you again next year!

 

PASS Summit 2015: Women in Technology Luncheon

It’s Thursday at the PASS Summit, so that means it’s time for the Women in Technology Luncheon.  As in years past (I’ve lost count of how many), the luncheon is sponsored by SQL Sentry.  The SQL Sentry team is here at Summit in full force, and I have Allen White at the blogger’s table with me.  While I’m at it, let me give a shout out to a few members of the SQL Sentry team who have been supportive of not just this event, but of myself and some fellow colleagues.  These gentlemen have provided feedback, suggestions, and good old-fashioned support whenever asked or needed.  Thank you Aaron Bertrand, Kevin Kline, Nick Harshbarger, and Greg Gonzalez for all you do for me, my colleagues, and this community.

For those of you at home, you can watch the luncheon live on PASSTV.  Finally, if you want more rapid-fire commentary from the luncheon (as I’ll refresh this post every 5-10 minutes), I recommend following Mark Vaillancourt on Twitter (@markvsql).

Today’s luncheon features guest speaker Angie Chang, VP of Strategic Partnerships at Hackbright Academy, and we start with PASS Board VP of Marketing Denise McInerney welcoming us to today’s lunch (it’s the 13th one).

Angie starts by talking about her path from undergrad to her position today.  She started the Girl Geek Dinner chapter in San Francisco, and Hackbright sought her out to help celebrate its first graduating class.  Hackbright has graduated around 300 women over the past 3 years, and a few of those women now hold technical management positions.  Hackbright was started by some women who attended a coding camp.  The group started with an experiment of 12 women, teaching them to code in 10 weeks.  Since then they have grown the classes and the curriculum has evolved.  Right now they teach Python, along with some Java and Angular – students are taught not just the language, but also how to ask questions.  Each engineering fellow has three mentors.  There are 100 software engineers who mentor those students for one hour a week.  This mentorship helps enhance the experience, and the students also get to visit other technical companies (e.g. Twitter, Dropbox).

Hackbright uses pair programming.  The community aspect is important – particularly because it’s an all-women environment.  The environment is very casual.  The students at Hackbright are very diverse and come from a variety of backgrounds.  Hackbright has a high rate of job placement.  Angie highlights some graduates of Hackbright who have been promoted to engineering management positions within their companies.  SurveyMonkey has hired the most “Hackbrights” of any company, and one of those engineers is a manager there now.

Hackbright works with partner companies by inviting them to career day events and the Hackbright graduation.  Facebook sponsors a scholarship once a quarter, and Denise’s company, Intuit, also provides a scholarship.  Girl Geek Dinner started in London in about 2006, and Angie was working at a startup at that time.  Angie started up the Girl Geek Dinner in Mountain View, sponsored by Google – they had 400 people in 5 days.  They are booked into 2017 for dinners, with 2-3 per month.

Denise shifts to talking about the pipeline problem.  One Hackbright instructor, Rachel Thomas, wrote a post, If you think this is a pipeline issue then you haven’t been paying attention.  The article has suggestions for how to improve the pipeline – it’s not about getting women in, it’s about retaining them.  Denise asks Angie if she feels retention will be an issue for those graduating from Hackbright, and Angie states that they create a good network for each graduating engineer – their classmates at Hackbright, their mentors, etc. – which gives each person a set of resources to turn to when they’re struggling.

If you have questions you can come up to the microphone or use the #passwit hashtag on Twitter.

There’s a documentary from Technovation called Codegirl that will stream on YouTube from November 1-5 – check out the trailer.

Want to see if you have any unconscious biases?  Check out these tests on Harvard’s site.

PASS Summit 2015: Day 2

8:20 AM

We’re off and running with Adam Jorgensen, PASS EVP of Finance.  Adam takes this opportunity to provide an update about the financial status of PASS, as this satisfies the requirements of the by-laws.  The largest source of revenue is the PASS Summit (not a surprise), bringing in just over 7 million dollars (of the 8 million generated in the 2015 fiscal year).  The finances continue to trend upward, which is great.  Finances support the community through events all year long.  This year, 78% of every dollar taken in goes back to a community program.  PASS is in great financial health, having increased reserves to 1.14 million dollars.  Starting this year, portfolio-level budget summaries will be published to make the process more transparent to the community.  Last year, the goals for 2015 were to focus on support for SQLSaturdays and Chapters, among others.  PASS Summit will be in Seattle through 2019.  The SQLSaturday website was relaunched this past year to help better support those events.  This year, goals include the BA Community Portfolio, refocused investments in community portfolios, a global growth program, the sales portfolio, and technology investments (including a re-design of sqlpass.org – ELS: this makes Jes happy).  Adam wraps up by thanking Amy Lewis, outgoing board member.

8:33 AM

Adam finishes up and EVP Denise McInerney comes on stage.  Denise takes a minute to thank Bill Graziano, who is the outgoing Immediate Past President.  Bill has been a member of the board for 10 years.  ELS: I’m personally a big fan of Bill; I worked with him on the NomCom.

Denise moves on to the PASSion Award.  There were 71 Outstanding Volunteers this past year.  This year’s PASSion Award goes to Lance Harra.  He runs the Kansas City SQLSaturday and was an integral part of the program committee.  If you are interested in becoming a part of the SQL Server leadership team, stop by the Community Zone this week.  There are always ways to get involved with PASS.

There are over 150,000 members of PASS.  There are 3000 people from over 95 countries tuning in live.  Yesterday PASS introduced foundation sessions, which were offered by Microsoft (four of them yesterday).  Over the years PASS has grown its offerings to meet its members’ needs – virtual chapters, 24 Hours of PASS, SQLSaturday, user groups, and more.

Today is the Women in Technology lunch (11:45), sponsored by SQL Sentry, and the keynote speaker is Angie Chang.  It will be live streamed on PASSTV.  The Board Q&A is today at 3:30.  Tonight is the Community Appreciation Party at the EMP Museum at 7 PM.

PASS Summit next year is scheduled for October 25 – October 28 – early bird pricing is available!

Today’s keynote speakers are Dr. David DeWitt and Dr. Rimma Nehme. (ELS: TWO OF MY FAVORITES!!!)  They are both at the Microsoft Jim Gray Systems Lab in Madison.  Data Management for the Internet of Things.

8:45 AM

Dr. Nehme takes the stage.  She mentions that it’s harder to present a keynote together than individually.  She will start, Dr. DeWitt will come in, then Dr. Nehme will wrap up (dessert!).  The what, why, and how of IOT.

Disclaimer: not announcing a product.  Goal is to inform, educate, and inspire (and entertain a bit).

Wants to begin with a new reality.  Things around us have a voice that can communicate to us.  IOT is a collection of devices and services that work together to do something useful.  Basic formula: take a basic object, add controller, sensor and actuator, add the internet, and then you get the internet of things.

Take the sensors and actuators, add connectivity and big data analytics, and then you can provide new services and optimization.  The target is to create value (make money).  What does that typically look like?  Collect data from sensors, aggregate it, analyze it, then act on it.  This is a continuous loop.  There are 2 types of IOT that people agree upon.  On one side you have a consumer internet of things – things that are wearable or related to us as humans (phone, watch, etc.) – and on the other, things that are industrial (cars, factories, etc.).

Consumer IOT: Fitbit, Nest, Lumo.  What can they reveal about us?  Health info, house information, driving habits.  You can analyze that information and make predictions.  The Industrial Internet of Things can be connected, and then significant value can be realized, particularly in industry.  It is still in its infancy.  There are four types of IOT capabilities: Monitoring, Control, Optimization, Autonomy.  Dr. Nehme draws an analogy to human development: we are in the “terrible twos” of IOT development.  Why IOT?  We are at the peak of the hype right now (based on Gartner).  There is a growth of “things” connected to the internet.  In 2003 there were about 500 million devices connected to the internet; by 2010 there were 12.5 billion.  Around 2008, the number of things connected to the internet exceeded the number of people.  In 2015, we’re at 25 billion things connected to the internet.  The value to customers is huge.  The power of 1% – if you can improve fuel savings by 1% in an industry like aviation, health care, or power generation, that’s extremely significant.

Why is this happening now?  More mobile devices, better sensors and actuators, and better analytics.

For the How of IOT, Dr. DeWitt comes on stage.  Dr. DeWitt is going to talk about the services available.  There are a lot of challenges – a large number (and variety) of sensors.  There are A LOT of devices sending data.  Sensors are frequently dirty, and it’s hard to distinguish between dirty readings and anomalies.  And then there is just the volume of data that’s being sent into the cloud.  One of the biggest challenges is device security: how do you prevent devices from overwhelming the cloud infrastructure, or someone from impersonating a device?  And then there’s cloud-to-device messaging.  Sometimes the device is not online and may miss a message, so a persistent queue and reliable delivery are needed.  How do you deploy this and get the IOT set up?  We’re not going to tackle that today.

There are differences between consumer and industrial IOT.  In consumer IOT you have to worry about battery and power failure; devices are more cost-sensitive and might be anything from a simple embedded device to a powerful sensor; and consumers rely on wireless (industrial devices tend to have unlimited power, be full-fledged and wired, and vary based on needed functionality).  The rest of the talk will focus on industrial.  Note: one size fits none.

Today’s IOT: Just Do It Yourself.  The state of the art is still rather primitive.  What are the ingredients that go into IOT?  In the basic block diagram, out in the field you have devices with a sensor and actuator (e.g. sensing temperature and humidity in a Nest thermostat).  Up in the cloud, you have an event/data aggregator.  Device to Cloud (D2C) is how the data gets from the device up to the cloud.  You can feed this data into an application, into event/data storage, or into a real-time processing engine, and that *can* use a device controller and send commands back to the device (C2D = Cloud to Device).  Azure IOT services exist.  Two main components: Azure IoT Hub and Azure Event Hubs.  The data management is done through Azure Stream Analytics, DocumentDB, SQL Azure and SQL-DW, Azure HDInsight, and Azure Machine Learning, and then you use Power BI and Excel to visualize the data.

Azure IoT Hub (an Azure PaaS service) is the cornerstone of IOT.  It receives events and routes them.  It is scalable to millions of devices, and it provides per-device instance authentication.  It can send commands back to the devices.  Within the hub, every device has its own send endpoint, to which the sensors will send events.  On the output side is a set of partitions, into which data gets routed.  The number of partitions is set when the service is created in the cloud.  A hash function routes each event to a partition.  Event consumers then “pull” events from the Receive Endpoint.  There is a C2D Send Endpoint that can send messages out, which then get routed to a message queue that guarantees delivery out to the device’s actuator.
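To make the routing concrete, here is a tiny sketch of the idea – a hash of the device ID deterministically picking one of a fixed set of partitions.  This is a toy illustration of the concept, not the actual IoT Hub implementation; the function and device names are made up.

```python
import hashlib

def route_to_partition(device_id: str, partition_count: int) -> int:
    """Deterministically map a device ID to one of N partitions,
    mimicking the hash-based routing described in the keynote."""
    digest = hashlib.sha256(device_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % partition_count

# The same device always lands in the same partition, while
# different devices spread across the available partitions.
partitions = {}
for dev in ("boiler-001", "boiler-002", "thermostat-17"):
    partitions.setdefault(route_to_partition(dev, 4), []).append(dev)
```

Because the mapping depends only on the device ID, a consumer reading one partition sees all events from the same device in order.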

One thing you can do with events is pull them out of the IOT hub and send them to an event consumer such as SQL Azure (which doesn’t have a nice sexy symbol like SQL Server), into HDFS, into Azure Storage, or into DocDB (these are examples).  Analyzing the events, then, can be done via SQL Server, or you can use SQL-DW and Polybase, Hadoop over HDFS (or Hive/Storm), or DocDB.  All of these are great options for storing events.  A neat thing to do with IOT data is LEARN from it (e.g. predict when the boiler might explode).

Options for a real-time query engine include Azure Stream Analytics or Apache Storm on HDInsight.  What’s a real-time query engine?  A traditional RDBMS has data on disk: you send in a query, you get data back.  In Dr. DeWitt’s mind, real-time streaming is taking a sequence of events and some queries that operate over those events – for example, a query that finds the IDs of boilers that are about ready to explode based on PSI.  As the query processes stream events, it continually produces results.  You can have multiple queries operating over the same set of events, or over different streams.  Dr. DeWitt encourages us to learn about stream analytics.
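As a rough sketch of what such a standing query does – events flow through, nothing is stored, and matching results are emitted as they arrive – here is a minimal Python illustration.  The event shape and the 300 PSI threshold are invented for the example; a real engine like Azure Stream Analytics expresses this in a SQL-like language instead.

```python
from typing import Iterable, Iterator

PSI_LIMIT = 300  # hypothetical "about to explode" threshold

def boilers_over_limit(events: Iterable[dict], limit: int = PSI_LIMIT) -> Iterator[str]:
    """A standing query over a stream of sensor events: each event
    flows through once, and boiler IDs exceeding the limit are
    emitted immediately - no events are stored."""
    for event in events:
        if event["psi"] > limit:
            yield event["boiler_id"]

stream = [
    {"boiler_id": "B1", "psi": 120},
    {"boiler_id": "B2", "psi": 340},
    {"boiler_id": "B1", "psi": 310},
]
alerts = list(boilers_over_limit(stream))  # -> ["B2", "B1"]
```

In a real deployment `stream` would be an unbounded sequence arriving from the hub, and the generator would run indefinitely.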

There is no data stored; the queries are just continually running, data flows through the query, and it outputs results.  When you see something important, what do you do?  Send a message to the IOT hub to perform an action (e.g. open a pressure release valve).  Field gateway – a Raspberry Pi, running Windows 10, with WiFi – that’s a field gateway.  There are two primary use cases: when a sensor/device cannot itself connect to the internet, or for complex objects (e.g. smart cars) with multiple sensors/actuators.  There are two flavors: opaque (only the field gateway has an identity in the IOT hub) and transparent (each device is registered in the IOT hub).  Field gateways are real computers, with memory and processors.

How do you manage IOT metadata?  Per-device metadata is not stored in a database system at the present time, so there is no query support.

Device security is super critical for an IOT deployment.  Devices must have unique identities, and must PULL to obtain C2D commands (no open ports, to reduce attacks).  Main takeaway: it is PUSH to the cloud – all the IOT events get pushed up into the cloud.  It was a good first effort.  But what are the problems with pushing everything to the cloud?  Not enough bandwidth, required connectivity, latency, data deluge (from boring sensor readings), storage constraints (storing EVERY event), and speed.  Main point: it wastes network bandwidth, computational resources, storage capacity, and processing on NON-INTERESTING events.

Go back to the boiler example… running the same query over and over wastes bandwidth by sending the reading every second, and centralizing all data from multiple systems might overload the system.  Here is their insight: exploit the capability of the field gateway.  It can do local processing and control.  You have the boiler with its sensor and actuator.  Then you have a field gateway, and on it you run a streaming database system, install the boiler gateway control program, and run the data through it.  If you run the streaming engine there, you can run any number of queries, and might send only the average pressure reading over 60 seconds of data up to the IOT hub.  This is a better approach – it reduces what gets pushed up to the cloud, and what needs to be stored.
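The “average over 60 seconds” idea is simple windowed aggregation at the edge: instead of pushing sixty per-second readings up to the cloud, the gateway sends one number per window.  A minimal sketch, assuming one reading per second and fixed, non-overlapping windows (the function name is made up):

```python
def averages_per_window(readings, window=60):
    """Collapse per-second pressure readings into one average per
    fixed window - the field gateway then uploads only these
    averages instead of every raw reading."""
    out = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        out.append(sum(chunk) / len(chunk))
    return out

# Two minutes of readings become just two uploaded values.
uploaded = averages_per_window([100.0] * 60 + [104.0] * 60)  # -> [100.0, 104.0]
```

The same gateway can still raise an immediate alert locally when a single reading crosses a danger threshold, so summarizing for the cloud doesn’t cost you reaction time.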

How can we do better?  Dr. Nehme comes back on stage… (she has changed her outfit…but don’t tweet about it…she’s a jeans and t-shirt girl (I KNEW IT)).

Fog computing – all about computing on the edge.  It is not cloud vs. fog, it is cloud + fog.

What’s the fog?  It’s like “predicate pushdown”.  Never move the data to the computation, move the computation to the data.  Devices perform some data pre-processing and compression, the cloud is a big gorilla that can do the management, processing, and machine learning.  How can we do better?  Real-time response, scalability, metadata management, GeoDR of IOT hubs.  IOT is a database problem, not just a networking problem.  It hasn’t been database-centric before, but trying to address that.

They want to take the existing IOT Azure services and expand on them, proposing Polybase for IOT (not a product announcement, just an idea).  What is the vision?  A declarative language, complex object modeling, scalable metadata management, discrete and continuous queries, multi-purpose querying, and computation pushdown.

Declarative language: if you’re dealing with IOT today, your only choice is an imperative language – you have to explicitly specify how you want to see something.  What about IOT-SQL, a declarative language where you can select information from the sensors?  You might have tables for buildings, rooms, temperature sensors, etc.  The temperature sensors have columns that look like a regular database table.  They need to figure out how to model complex objects – for example, a room on a floor in a building.  They have the notion of a shell database – a regular database that stores metadata, statistics, and access privileges – and can perform authentication, authorization, and query optimization against that database.  As far as those processes are concerned, they don’t need the actual data.  Now expand this to the devices: the IOT shell also gives a simple abstraction for sensors, actuators, and distributors.  The shell can be stored in SQL Azure, DocDB, etc.  It’s JUST a database.
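To give a flavor of the idea, a query against that shell database might look like ordinary SQL, even though the “columns” are live sensor values.  This is entirely hypothetical syntax sketched from the talk – IOT-SQL is a proposal, not a product, and every table and column name here is made up:

```sql
-- Hypothetical IOT-SQL: sensors exposed through the shell database
-- as if they were regular tables.
SELECT b.building_name, r.room_number, t.temperature
FROM buildings AS b
JOIN rooms AS r ON r.building_id = b.building_id
JOIN temperature_sensors AS t ON t.room_id = r.room_id
WHERE t.temperature > 80;
```

The shell database can authorize and optimize such a query using only metadata and statistics; the actual readings come from the devices at execution time.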

What about querying devices?  One mode is ExecuteOnce: push the SELECT to the device, it sends results, and we’re done.  With ExecuteForever, you push the SELECT to the device, and the device continually sends results back to the client; when you’re done, you send a signal and the query stops running.  Then there’s ExecuteAction: send a SELECT plus an action, and the action gets fired when the predicate is met – once, or forever.

Back to the temperature sensor table… we need some declarative queries.  ExecuteOnce – get the count of all hot locations.  The optimized plan is generated, data is moved, and then the work is done up in the cloud – not a lot of pushdown here.  For an ExecuteForever query – record all hot locations up in the cloud, and execute forever – the optimizer might produce a different plan (it does some partial aggregation before pushing data up into the cloud, so the larger computation is done in the “fog”).

ExecuteAction: turn on the AC in all the hot locations.  The larger computation and the action are pushed down into the fog, and only interesting events are pushed up into the cloud.  With a multi-purpose query, some results could go to one location and some could go to another, based on the results.
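Putting the three execution modes together in the same hypothetical IOT-SQL notation (again, invented syntax based on the examples in the talk, not a real language):

```sql
-- ExecuteOnce: count the hot locations right now, then stop.
EXECUTE ONCE
  SELECT COUNT(*) FROM temperature_sensors WHERE temperature > 80;

-- ExecuteForever: keep recording hot locations until cancelled;
-- the optimizer may push partial aggregation down into the fog.
EXECUTE FOREVER
  SELECT room_id, temperature
  FROM temperature_sensors
  WHERE temperature > 80;

-- ExecuteAction: fire an actuator whenever the predicate is met.
EXECUTE ACTION turn_on_ac(room_id)
  WHEN temperature > 80;
```

The interesting difference between the modes is where the work runs: ExecuteOnce pulls data to the cloud, while ExecuteForever and ExecuteAction push the computation (and the action) down toward the devices.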

The Polybase for IOT Wrapup – use SQL front end with Polybase for sensor/actuator metadata management and querying.  Exploit Polybase’s external attribute mechanism to allow SQL queries to reference sensor values…and then one more thing I didn’t get :)

Why should we, as data professionals, care?  When a new technology rolls over you, you’re either part of the steamroller or part of the road (didn’t catch the attribution).  Key takeaway: the amount of data to manage is going up exponentially.  We need to step back to see what success looks like.

Dr. Nehme announces that this is their last keynote.  Why?  Dr. DeWitt… they have done 7 of these.  There are a lot of great speakers at MS, and he is sure there are people who are better speakers.  Dr. DeWitt and Dr. Nehme are “parting ways.”  She is finishing up her MBA and moving on.  Dr. DeWitt is starting to think about retirement; after 40 years, he thinks it’s about time to give up the full-time gig.  In 10 years…  We have not seen the last of Dr. Nehme – whether it’s at Microsoft or at a competitor.  Dr. DeWitt says this has been one of the brightest spots in his career.  He says it’s been a terrific experience.  He will think about this community for many years to come.  (ELS: I admit, I’m a little teary.)

PASS Summit 2015: Day 1

Well friends, here I sit again, at the blogger’s table, ready to kick off Day 1 of the PASS Summit here in Seattle, Washington.  My trusty side-kick, Perry, is with me as usual, and we’re joined by Bunny this year, as my 8-year-old insisted I bring them both.  Who am I to argue with her?

Some notes about today:

It’s National Chocolate Day!  I plan to celebrate all day :)

I present twice today!  My first session is right after the keynote: Kicking and Screaming: Replacing Profiler with Extended Events, in Room 6A from 10:15 to 11:30 AM.  Note that this session is a little later, and in a different room, than originally scheduled.  My second session, Statistics and Query Plans, is from 3:15 PM to 4:30 PM in 6B.  I hope to see you at one of my sessions – feel free to come up and say hi if we haven’t met before (or if we have!).

Today’s keynote, Accelerating Your Business With a Modern Data Strategy, is headlined by Joseph Sirosh who is a Corporate Vice President in the Data Group at Microsoft.  And we’re off…

8:21 AM

Up first today is PASS President Tom LaRock.  This is Tom’s last year as President; he’ll next step into the role of Immediate Past President.  He mentions #SQLFamily and says that everyone is free to give him a hug.  That could be a lot of hugs.

Attendees from over 58 countries…over 2000 companies are represented, and the Microsoft team will be everywhere this week – stop by the SQLClinic if you have any questions you need help with.

For those of you not here, please follow along on PASS TV – just head over to the main page for Live Streaming.

Tom introduced the PASS Board of Directors and encouraged members of the community to talk to the board this week to help them understand how to serve the community better.  Tom mentions the Board Q&A on Thursday at 3:30 PM in 307-308.

There are 5,500 total registrations this year for Summit (note: that’s not individuals…if you register for a pre-con and the conference, I think that’s 2 registrations, not 1).  Tom asks for a show of hands from newcomers…there are a lot.  ELS: Those of you who are here for the first time, try to meet people!  If you’re an introvert, I know that’s hard, but take a risk!  Say hi, find something in common!

The SQL community is the gold standard for technical communities.  ELS: I don’t disagree, I have friends in other technical disciplines, and they have nothing like what we have.

There are over 200 sessions and workshops this week.  Use the mobile app Guidebook to stay on top of any schedule changes.  On Twitter follow along with hashtags #sqlpass and #summit15.

The Birds of a Feather lunch takes place on Friday, where you can talk to people with an interest in a specific feature or area.

Don’t forget our Sponsors, who make this entire event possible.  There are some fantastic companies that support the SQL Server community.  PLEASE make time to go talk to them this week.  The Exhibitor Reception is tonight, after regular sessions end.

Tom closes by saying how proud he is to be a member of the #SQLFamily community.  He’s been a member since 2004.  I think he’s getting a little choked up.  Oh.  HUGS TOM!

8:37 AM

Joseph Sirosh takes the stage.

We live in an age of data.  The ability to extract that data and use it is changing our daily lives.  All of the world’s data was analog 30+ years ago.  Then we got DVDs and such, which started to digitize data.  When the internet came along, data suddenly had an IP address.  Connected data can be moved around and joined with other connected data, which means you can extract intelligence from it.  The vast majority of today’s data is digital, and much of it is in the cloud.  Fast forward to 2020: there will be 50 million petabytes of data, mostly in the cloud.  Fifty years ago, hardware drove new customer experiences.  Then came the age of software, digitizing everything.

In the new world, data will predict everything.  We can use this data to develop models so that when, for example, people come in to the ER with a problem, you can put in data collected and use a model to determine a path of care.

Joseph brings up Eric Fleischman, who is the Chief Architect and VP of Platform Engineering at DocuSign (we use their site!).  They chose to use SQL Server because they believed Microsoft would be there for them (and they have been).  They made an investment in the telemetry of the system, which processes millions of data points about the performance of the actual system, and that telemetry system scales right along with the OLTP system.  There are some improvements in the HA/DR stack for them in 2016, along with the encryption features.

SQL Server 2016 is meant to be the engine for all of your data – both in house and in the cloud.  Microsoft innovates first in the cloud at an accelerated speed (pushing new code once a week).  Pain in that system translates into changes in the software very quickly.  When you build and operate in the cloud, you take that innovation and bring it back to packaged software (SQL Server 2016).  Companies like Oracle, who build locally and then ship to the cloud, cannot claim that.  Amazon will state they are only in the cloud – it’s cloud-only.  But “we” know better.  Feet on the ground and head in the cloud: you have to build products to operate both in the sky and on the ground, and Microsoft is the only company to do that.  This community is making Microsoft number 1 in the age of data.

HUGE STATEMENT from Joseph.

There’s a video with some feedback from fellow MVPs about SQL Server 2016…  Joseph turns the stage over to Shawn Bice (General Manager, Database Systems Group).  Haven’t shipped 2016 yet, but it powers everything in the cloud.

Seven big bets…all of these are built-in.

From an OLTP perspective, SQL Server is recognized as a leader.

SQL Server is the most secure database.  SQL Server runs some of the most scalable data warehouses in the world.  Mobile BI is built into SQL Server – it’s about the mobile workforce and getting visualizations to them.

First big bet: HA/DR.  They have learned a lot from the partnership with DocuSign, which is using some of the fastest IO subsystems with FusionIO, with A LOT of data moving across the wire to secondaries.  They have done a ton of work on the algorithms to improve updates.  They have customers that use Azure along with on premises all the time: you can enroll an Azure DB with an on-prem system to create a DR site.  For all of you who have used database mirroring and want to use an AG but can’t domain-join it: in SQL Server 2016 you can stand up an HA environment without domain-joining anything.  Woohoo!  They’re also introducing load balancing for read scale, so you don’t have to point clients at specific secondaries.  The stack for on prem is the same as what’s in Azure, and they do failovers every day.  They had a data center that was on fire and failed over every customer in China in about 5 hours.

Ok, I need to get to my first session, I’ll be back tomorrow!  Have a great day!

 

Costumes, PASS Summit, and Argenis Without Borders 2.0

I’ve never been a fan of Halloween.  Obviously not because I don’t like candy or chocolate (you all know I like sweets, right?).  It’s because I dread the costume.  This wasn’t a problem as a kid.  My costumes included, in no particular order…

  • A mummy (though I got tangled in a couple trees due to my bandages)
  • A clown (Tim Chapman take note)
  • A crayon (not a lot of flexibility in that costume)
  • R2D2 (I was a nerd even when I was young)
  • Princess Leia (from Episode IV, not the bikini costume)
  • A robot (hard to walk through doors while dressed in a box)

Those were fun times.  Then I got to college and Halloween took on a whole new meaning.  I am not one to dress up in a tight-fitting, barely-there costume, so I resorted to a farmer costume one year (insert Iowa joke here) and other years just avoided going out altogether.  Finding something cool was tricky.  My friend Jori was awesome at costumes, and she still is (last year she, her husband, and their 1 ½ year old went as the Three Amigos…genius).  But I digress.

So this year, when Argenis Fernandez asked if I would participate in his second fundraiser for Doctors Without Borders and wear a costume if he raised at least $5000, I of course said yes…knowing full well I would have to find a costume.

Enter my friend Jes.

Yes Jes, yes it is...


I now have a kick @$$ costume if I do say so myself, and you’ll see it at the PASS Summit (I’m speaking on Wednesday at 4:45 PM, if you’re interested!). In the meantime, if you haven’t already, head on over to Argenis’ page and donate to a fantastic cause.   The amount doesn’t matter…$5, $10, $20…whatever. It all adds up, it all helps. Every little bit.  Thanks for considering, and I hope to see you at Summit!

In Support of PASS Board of Directors Nominee Ryan Adams

Reminder: Opinions are my own.

There are four candidates running for three Board of Director positions this year (in alpha order):

  • Ryan Adams
  • Argenis Fernandez
  • Tim Ford
  • Jen Stirrup

You can read more about each candidate through links on PASS’ Elections page here. I do know all four candidates personally – some a bit better than others – and I know that they are all great individuals who have all contributed in numerous ways to the SQL Server Community over the years.

But this post is specifically about Ryan Adams and why he’ll be getting my vote for a Board of Directors position. I support Ryan because I don’t just know him, I have worked with him. I had two opportunities to work closely with Ryan in the last three years on PASS-related teams.

I first got to know Ryan in late 2011 when I took over the PASS Performance Virtual Chapter. I put out a call for volunteers in November 2011 to help run the team. Ryan applied with the intention of managing the web site, but I recognized that he came to the group with a lot of experience already (at that time he was on the board for the North Texas User Group, he was a Regional Mentor, and he was helping to organize SQLRally), and I needed to leverage that. I convinced him to handle marketing for the group, and though we didn’t know each other well in the beginning, our small team came together through regular conference calls. We all had different responsibilities, and we were off and running. We created (in my opinion) a very solid VC over the course of the year. Ryan was an integral part of that success. He created his own goals and figured out how to meet them. He set up new avenues for marketing the VC meetings, and came up with some great ideas along the way, including the Performance Palooza. We held the first one in December of 2012, and the Palooza has continued to this day for the Performance VC (its attendance this year grew 175% compared to last year).

At the end of 2012 I stepped down from running the team, as I had only committed to a year, and I also felt it was good for someone to have another opportunity in a leadership position. Ryan stepped forward without hesitation; he was a natural fit. This was confirmed when we finally met in person at the 2012 PASS Summit for a dinner to celebrate our year, and Ryan was already thinking ahead to 2013 and making plans. Ryan has continued to run that group and it has continued to grow over the years, now averaging 200-300 attendees per session.

I again worked with Ryan in 2013 when we were both members of the PASS Nomination Committee. I was again impressed. The NomCom requires its members to review applications by a specific date, and also attend meetings and candidate interviews. Collectively, we all tried to meet deadlines and attend every meeting – I’m not sure that Ryan missed any. During our candidate interviews Ryan asked thoughtful questions, and provided great insight and evaluation during our follow-up discussions. Post-nominations, he gave critical consideration to the process that we went through and provided both big-picture and detailed analysis of it, contributing to the changes that occurred for our nominating committee and the next.

In the last three years I have watched Ryan continue to contribute to the community through his volunteer responsibilities and his speaking. He’s become a solid speaker, presenting at many SQL Saturdays and the PASS Summit. He is comfortable in front of large and small groups and has excellent written and verbal communication skills. He has in-depth knowledge about PASS and how it works, which he’s been accruing since he started helping out the North Texas User Group all those years ago. Ryan has the ability to take on projects and responsibilities, take action, and see them through to the end. He gets things done. He has a positive attitude and he’s a critical thinker. I know Ryan would be a solid member of the PASS Board of Directors, and it’s been a privilege to watch him grow as a leader within our community.

Best of luck to all the candidates in this year’s election, and I encourage every voting PASS member to take time to learn about each candidate so you can vote based on data.  Read through each candidate’s page on the PASS site, and if you still have questions, visit the election forum and/or attend the Town Hall and Twitter chat events.  The board makes a lot of decisions that affect the SQL Server community – it’s a good idea to know who’s representing you.

Don’t Hard Boil Your Next SQL Server Upgrade

SQL Server upgrades and hard-boiled eggs. What do these two things have in common? Probably nothing, but last week Jon and I were having a conversation about an upgrade I need to do for one of our clients. Now I can’t go into specifics because of the NDA, but I can give you some of the pieces:

It’s a two-node cluster, there are two instances of an older version of SQL Server Standard Edition, they want to upgrade to a recent release of SQL Server, there’s replication involved…and we may or may not reuse the existing servers.

Now, I like talking through different options for this kind of challenge with someone else on the team, and Jon loves this type of thing. I love solving problems, it’s one of my favorite parts of my job. I think that Jon thrives on it. We’ll discuss some project or process and I will have an idea for how to do it, and then three hours later I’ll get another email or a phone call from Jon that starts with, “Well you could also…”

Our conversation about upgrading was interspersed with a conversation about getting into better shape and eating better (yes I’m usually going in three directions at once, aren’t we all?), and Jon mentioned hard-boiled eggs because they are a good breakfast protein. I asked him if he was still buying them already boiled…because he used to do that. He said no (hooray!), that one of his friends now boils them for him in big batches (nice friend!). Then he mentions how difficult it is to boil eggs.

And I’m thinking, “Are you kidding me?” He’s one of the smartest people I know and boiling eggs is a challenge? So I sent him the steps below, which are an adaptation of instructions provided in Cook’s Illustrated (great cooking magazine if you’re interested):

  1. Place eggs in a large pan and add enough cold water so that it’s at least 1” above the eggs. Add a splash of vinegar.
  2. Place on burner and set to high.
  3. When the water starts to boil, set a timer for 5 minutes.
  4. When the timer goes off, turn off the burner, set another timer for 7 minutes, let the eggs sit in the hot water.
  5. When the timer goes off the second time, drain the hot water, fill with cold water and ice.
  6. Let sit for a couple minutes, then either drain and put in the fridge, or peel and eat.

And Jon replies with this (yes I know the image is large, it’s so you don’t have to click on it to read the text! :)

Upgrading is easier than boiling eggs

Upgrading is easier than boiling eggs

I laughed. A lot. I was thinking, “I’ll take boiling eggs over an upgrade any day, no matter how much I love SQL Server.”

And then yesterday I decided to boil a couple eggs for breakfast. I put them in the pan with plenty of water and a splash of vinegar, turned on the burner, and then went to my office (just 20 feet away) to work on my Insider Video. Forty minutes later (yes, 40) I remembered the eggs. I forgot to set a timer. I always set a timer, but yesterday I just forgot. I ran to the kitchen…there was about an inch of water left in the pan. I was lucky. Nothing destroyed, the eggs were even edible! But here’s the thing…

It doesn’t matter how many times you’ve done something before, and how well you know the steps involved and KNOW that they work. Sometimes unexpected events happen or a single step gets missed that causes a problem (or a disaster).

Need a real-life example? Jon had a customer that wanted to remove partitioning for a huge table. They had gone through the steps many times in a Test environment, and they were all documented. When it came time to run the process in Production, however, the step to disable the nonclustered indexes on that table before removing partitioning was missed. As a result, the nonclustered indexes (on that extremely large table) were rebuilt three separate times. The step was in the documentation, and had been done during the tests, but for some reason wasn’t done during the process in Production.
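As a rough sketch of that critical step (the table and index names here are hypothetical, not the client’s actual objects), disabling the nonclustered indexes first looks like this:

```sql
-- Disable each nonclustered index on the large table BEFORE removing partitioning
-- (table and index names are made-up examples)
ALTER INDEX [IX_BigTable_CustomerID] ON [dbo].[BigTable] DISABLE;

-- ...remove partitioning here (e.g., rebuild the clustered index onto a single filegroup)...

-- Rebuild the nonclustered index once, after the table is unpartitioned
ALTER INDEX [IX_BigTable_CustomerID] ON [dbo].[BigTable] REBUILD;
```

Disabled this way, each nonclustered index is rebuilt once at the end, instead of being rebuilt along with every change to the underlying table.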

So back to that upgrade…we’re still planning for it. If I get approval from the customer, I’ll share the details and let you know how it goes. Until then, keep making those checklists and testing your upgrades. You just never know what’s going to happen.

AUTO_CLOSE and the SQL Server ERRORLOG

Today I opened up a SQL Server ERRORLOG and saw these two messages repeated every 20 seconds or so:

Starting up database 'AdventureWorks2014'.

CHECKDB for database 'AdventureWorks2014' finished without errors on 2015-08-23 02:15:08.070 (local time).  This is an information message only; no user action required.

When you initially see these two messages repeated over and over, it might seem like SQL Server is caught in some issue with recovery.  Or you might think it’s running CHECKDB over and over.  Neither are true.  The database has AUTO_CLOSE enabled.  (And you see the CHECKDB message because it’s reading the boot page and noting the last time CHECKDB ran successfully…to see what updates that entry, check out my post What DBCC Checks Update dbccLastKnownGood?)

When AUTO_CLOSE is enabled, after the last user exits the database, the database shuts down and its resources are freed.  When someone tries to access the database again, the database reopens.  You might be thinking that for databases that are not accessed very often, this might be a good thing.  After all, freeing resources and giving them back to SQL Server for use elsewhere sounds useful.  Not so much.  There’s a cost associated with that shut down, and a cost to open the database back up when a user connects.  For example – shutting down a database removes all plans for that database from cache.  The next time a user runs a query, it will have to be compiled.  If the user disconnects, the plan is freed from cache.  If someone connects one minute later and runs the same query, it has to be compiled again.  You get the point: this is inefficient.  And really, how many databases in your production environment do you never access?  If you’re not accessing the database, why is it in a production instance?  If you want a few more details on AUTO_CLOSE, check out the entry for ALTER DATABASE in Books Online.
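To find every database on an instance with this option enabled, you can query sys.databases:

```sql
SELECT [name]
FROM [sys].[databases]
WHERE [is_auto_close_on] = 1;
```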

I am sure (maybe?) that there are valid cases for having AUTO_CLOSE enabled.  But I haven’t found one yet :)

On top of the resource use, realize that every time the database starts up, you’re going to get the above two messages in the ERRORLOG.  In the log I was looking at, there were multiple databases with this option enabled, so the log was flooded with these messages.  In general, I’m a huge fan of cycling the ERRORLOG on a regular basis (just set up an Agent job that runs sp_cycle_errorlog every week), and I try to reduce “clutter” in the log as much as possible.  This means not enabling a setting like AUTO_CLOSE, which fills the log with all those messages, and using trace flag 3226 to stop logging successful backup messages (they still go to msdb).
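For example, the weekly job just needs to call sp_cycle_errorlog, and trace flag 3226 can be turned on for the running instance with DBCC TRACEON (add -T3226 as a startup parameter if you want it to survive restarts):

```sql
-- Cycle the ERRORLOG (put this in a weekly Agent job)
EXEC sp_cycle_errorlog;

-- Enable trace flag 3226 globally for the running instance;
-- use the -T3226 startup parameter to persist it across restarts
DBCC TRACEON (3226, -1);
```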

Oh yes, to disable AUTO_CLOSE:

ALTER DATABASE [AdventureWorks2014] SET AUTO_CLOSE OFF;

GO

 

Backup checksum default option in SQL Server 2014

The SQL Server team snuck in a new server configuration option in the 2014 release (I bet thanks to this Connect item even though it’s still Active), and it’s not documented so I just stumbled upon it recently.  If you run:

SELECT * FROM [sys].[configurations] ORDER BY [name];

you’ll see that there are 70 rows in the output (in 2012 there were 69) and the new one is:

backup checksum default

The option is disabled (set to 0) by default.  To enable it, simply run:

EXEC sp_configure 'backup checksum default', 1;
GO
RECONFIGURE WITH OVERRIDE;
GO

As a quick reminder, adding the CHECKSUM syntax to the backup command forces SQL Server to verify any existing page checksums as it reads pages for the backup, and it calculates a checksum over the entire backup.  Remember that this does not replace CHECKDB (check out Paul’s post – A SQL Server DBA myth a day: (27/30) use BACKUP WITH CHECKSUM to replace DBCC CHECKDB – for more details).  So what does this server option do?  Well, since it’s not yet documented (I filed a Connect item here) I did some testing to confirm what I was expecting.
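As a reminder, specifying it per backup looks like this:

```sql
BACKUP DATABASE [AdventureWorks2014]
TO DISK = N'C:\Backups\AdventureWorks2014.bak'
WITH CHECKSUM;
```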

Within Management Studio I ran a simple backup statement:

BACKUP DATABASE [AdventureWorks2014] TO DISK = N'C:\Backups\AdventureWorks2014_checksumtest.bak';

Then I checked the output in msdb:

SELECT [bs].[database_name], [bs].[backup_start_date], [bs].[backup_finish_date],
	[bs].[has_backup_checksums], [bs].[user_name], [bm].[physical_device_name]
FROM [msdb]..[backupset] [bs]
JOIN [msdb]..[backupmediafamily] [bm] ON [bs].[media_set_id] = [bm].[media_set_id];

Backup information from msdb


 

Check it out…with the server option enabled, I don’t have to include the CHECKSUM syntax to have SQL Server perform the backup checksum.
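And if you want to re-verify those checksums later without actually restoring, RESTORE VERIFYONLY can read them back:

```sql
RESTORE VERIFYONLY
FROM DISK = N'C:\Backups\AdventureWorks2014_checksumtest.bak'
WITH CHECKSUM;
```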

If you’re running SQL Server 2014, I highly recommend enabling this option for your instances, and if you rely on Glenn’s scripts for instance reviews, I know he’s adding a note about this to the next set :)

PASS Summit 2014: WIT Lunch

Two posts from me in one day?  What’s up with that?!  Well, today at the PASS Summit we also have the WIT luncheon, where Kimberly Bryant, who is the founder of Black Girls CODE, will be speaking.  I am live-blogging this event as well, so watch this post for updates starting around 12:15PM EST.  If you want to learn more about Black Girls CODE, check out this MSNBC interview.

12:15PM

Denise McInerney is introduced first – she asks how many people were at the first WIT lunch, back in 2002 (I am pretty sure Denise has been a part of WIT since its inception – a long-time leader within the community).  The WIT luncheon has grown a lot in the past 10+ years – today’s lunch has over 900 attendees.  Denise brings out Kimberly Bryant – such a different setting this year, just Denise and Kimberly on stage.  Denise is going to ask a few questions, then open it up to the audience and people watching on PASStv – you can tweet your questions and include the #passwit hash tag.

Black Girls CODE is a non-profit organization started in the Bay Area in 2011.  What really drove Kimberly to make a change was recognizing that her daughter, who was 12 at the time, might be following in her footsteps.  She had never thought of her daughter as an engineer.  But she was a heavy gamer (World of Warcraft, D&D) and spent a lot of time on the computer.  Her daughter was at the age where she could learn and create with a computer – and that was a life-changing moment for her.  Her daughter first wanted to grow up and be a game tester :)  Once she went to a programming camp, she saw that the environment actually allowed her to create, not just be a participant.  As a parent, Kimberly noticed that her daughter was one of only three girls at the summer camp, and the only person of color at the camp (out of about 40 campers total).  At that point, Kimberly knew she had to make a difference, not just for her daughter, but for other daughters.

Question from Denise: “Why is it still so hard to get girls and young women interested in technology?”  Kimberly cites a Girl Scout study that showed that if you survey girls BEFORE they get to middle school, over half show an interest in STEM, but by the time they get to high school it’s less than 5%.  In some cases, girls don’t have support from parents and teachers.  There are fewer opportunities for girls to flex their STEM skills.  Kimberly says she hates the pink aisle.  Do Legos need to be pink?  (ES: They don’t, I grew up without pink or purple Legos and played with them all the time.)

What type of programs does Black Girls CODE run?  The secret sauce is creating an environment where the girls can do coding and engineering with relatable leadership – the women who come in to teach the programs.  Over 75-80% of the instructors are women.  They are reflections of what the girls can become, and that gives the girls the ability to see the possibility.  Kimberly had a counselor who said, “You’re good in math and science, you should go into engineering.”  Kimberly didn’t know what that looked like – what does an engineer do, what do they look like?  But if you’re able to actually see that, suddenly you have an idea of what you can really do.

Denise asked what languages are taught via Black Girls CODE.  Kimberly explained that at first they didn’t know what the girls would be willing to learn, so it was open in the beginning.  The goal was always to teach them Ruby – and Kimberly had a core team that knew Ruby.  They also did some testing with Python, and do a lot with open source learning.  They have also started to talk to other organizations about coding – she has talked with Lynn Langit, and Lynn’s program (Teaching Kids Programming) teaches Java.

Question from Denise: How can people who want to bring STEM education to kids get started doing that?  There are so many opportunities for technology professionals.  We are at the beginning of this code movement – but we are lacking in teachers that can teach these skills.  Look for opportunities to give back in the school district where your kids are.  We need more than after-school programs and camps.  Black Girls CODE has over 2000 volunteers across the US, there are multiple chapters.  There is a need to talk to students and parents about what we (as women) do in our careers.  (ES: I find it interesting that she mentioned that parents need to hear that discussion as well.)

Kimberly believes that kids can start learning about technology at grade 1.  (ES: I agree – my kids have had a tech class since kindergarten.  Last year, as a 3rd grader, my son put together a PowerPoint presentation.)  Starting to introduce technology in high school is too late.  We need computer science to be counted as a high school credit – it shouldn’t take the place of math or science, it’s in addition.

Denise: Many companies have released diversity statistics.  Does Kimberly talk to attendees about the culture of tech and what it might be like to have a career in tech?  Kimberly states that they do – they try to prepare students to be active participants, and also prepare them for what challenges they might face within that environment.  Changing the community is not a quick thing – it’s a continual effort and requires some difficult conversations (then followed by action).

Over half of the women who enter tech fields drop out at the halfway point.  Kimberly says she can relate to this personally.  She understands what it’s like to get into the career and then see the glass ceiling.  Often, women don’t have the support network to break through that glass ceiling.  The role of mentors, sponsors, and advocates is so important.  On average, most women CEOs come to one company and stay there for 20+ years – that’s how the majority of women CEOs get there.  Women need to stay in the pipeline longer in order to get to the top – but a welcoming and positive environment will help.  Advocates and sponsors for women, within those communities, are needed.  We need more male advocates and mentors to help women get to the next level.  Also, women need to be willing to take the risk to get to that next level.

Denise opens up the discussion for questions from the audience.  It’s mentioned that only 15% of attendees here at PASS Summit are women.  (ES: Really?  15%?)

One of the themes from today and Kimberly’s message: mentors are needed.  Both male and female.  Kids need role models, college students need them, women in technology need them.  (ES: I’d argue that everyone needs a mentor.  Ask yourself: can you be a mentor to someone?  I bet you can.  And don’t be afraid to go ask for – seek out – a mentor for yourself.)

Work culture is cited as a top reason that women leave technology.  How do we change that?  Kimberly says to hire more women.  If a company has its heart in diversity but there’s still isolation within it, it needs to change from the ground up and from the top down, and to do that, more women need to come into the organization.  (ES: That’s not a complete answer, in my opinion; I think it’s more than just getting more women into a company.  You have to understand what the barrier is – what’s the resistance?  Then, you need to figure out how to change that.  And I don’t know if it’s one-size-fits-all in terms of the barrier – there might be a huge variety of barriers.)

Input from an attendee: go to local school career fairs and talk about IT.  The issue isn’t having to choose between two candidates, it’s trying to get one qualified candidate.

Jes asks how we can get kids to understand that technology skills are important – they’re not just a degree.  Kimberly – we agree, technology skills provide just one tool in a person’s toolkit.  This is why it’s important to get computer science into school, so then it becomes a tool that they can use as they’re learning science, math, and even in non-science courses.

As women we need to be advocates for each other.  (ES: Agreed, we do.)

Kalen has a challenge to parents: talk to your boys about smart women and how they’re not someone to fear.

One of my mentors, Allen White, stands up to ask a question.  Allen has been in IT for 40 years.  He asks, “What can I focus on so I don’t make ‘bad’ choices, since I am not a female, nor a person of color?”  Kimberly tells him to be cognizant, make his company inclusive, and to help someone who’s “different” from him.  He’s done all that :)