You went to a conference… what’s next?

Conferences are naturally a great place to pick up new information and insights into techniques and solutions to problems. They also serve to validate and critique our understanding and assumptions about how we work and the choices we make in that context. But the learning and inspiration that happens at the event needs to be put into practice after the event, too. But how?

I helped organize the first DevOpsDays Portugal, which ran at the beginning of this week. During and after the event there were plenty of discussions going on, related to the topics presented, specific techniques and ideas, and naturally the sharing that went on in the open spaces.

This is obviously great, but it should not end when the event is over. One thing I think is very important is to find ways to bring what you have learned back into the workplace, so that the knowledge gained scales and the environment around you can adapt to your new and useful ideas.

So I’ve come up with a short list of ideas you can use, especially if you went to DevOpsDays Portugal specifically, though most of these will apply to any conference you attend.

  • Write a blog post on what you have learned – Write out your thoughts and ideas, the things locked in your mind from the event. What did you find useful? What do you disagree with and why (because you can choose to agree or disagree)? What surprised you? What do you think you can apply to what you are working on? Getting this written down is a great way to scale the sharing of your thoughts. If you have many ideas, consider splitting them into several small posts to aid sharing and discussion.
  • Create a search list – Acronyms you saw or heard, pattern names, technique names, component type names and so-called “buzzwords” are all useful here. Which ones do you know, and which don’t you? The ones you know can be explored further for deeper knowledge; the ones you don’t can be searched for new insights. This is a great way to broaden knowledge in areas you are not generally aware of. Just don’t settle for not knowing about something.
  • Summarize and present internally – Create a short presentation (something like 10-15 minutes) sharing the topics you found interesting and how you think they could be explored at your workplace or within your team. Share the vocabulary you picked up if it will help people search for deeper info on the ideas, and promote the learning around you.
  • Use the presentations for lunch and learns – Videos from the event are not out yet, but you can start thinking about which talks would make sense and bring the most value to your current team. Once they are available, set up a session with pizza (or just a normal lunch) to (re)watch as a team and discuss afterwards.
  • Run an open space internally – Now that you have experienced the open space format, you can run one internally with your team, focused on issues and ideas you learned about at the conf, or just general issues your team would like to bring to the table. You probably won’t need as many slots, since your team will probably (and hopefully) be smaller than the attendance at the conference. Make this part of your Kaizen initiatives.
  • Compare where you are with what was presented – Run some mapping sessions and round tables that let you see what your current situation is and compare it to the ones that were presented. Are you blameless and getting value from your postmortems? What’s your on-call like compared to what was presented? How about your build pipelines and architectures? Is zero-downtime something you could reach, as the presenter did? How’s your testing strategy, and how far right or left does it sit in the pipeline? What about team organization? There’s a lot you can (critically) compare, helping you map out your current capabilities and which ones you could improve on.
  • Use the ideas for a hackathon – Was something mentioned or built that could also help your org? Could a hackathon help spread the idea and get it implemented and tested in your org? Is there a library or tool your team should try out in a proof of concept to see if it would improve your situation?
  • Share sponsor materials to spark discussions – You may have collected a bunch of pamphlets and seen some of our sponsors’ tools. Don’t throw them away. Share them, not necessarily for purchasing (since many of us don’t actually have that control), but at least to check out what’s available, which aspects solve issues you have, and how they map to what you learned. When colleagues are checking them out, mention some of the ideas and definitions you learned about and how they could help your org. Discuss how to bring those solutions into your work, whether through that vendor’s tool or other means. Also, how do their solutions differ from or improve on what you have at the moment?

All in all, what I personally think is important is that the event serves as a spark for continued learning and for improving your work experience and outcomes. After the event is the time to start doing something to keep improving and continue the learning journey with the many new ideas you have captured.

This is obviously a short list, and there is so much more that can be done. What are your ideas to keep the learning going?

DevOpsDays Portugal is coming in 2019!

I’m REALLY excited to be a part of this! The very first DevOpsDays Portugal is going to happen, June 3rd and 4th! The main DevOps-related meetup communities in Portugal (DevOps Lisbon and DevOps Porto) have joined forces to put together a program we hope to repeat in the years to come. Our first event will be in Lisbon, and we hope to rotate through the cities each year.

Last year, many of us helped create a DevOps track at the AgilePT 2018 conference, which was a great learning experience for us. We realized, given the interest attendees showed at the time, that there was room for a dedicated event on the topic. We decided to give this conf a try, with the help of the Agile PT team. The DevOpsDays format, with talks, ignites and open spaces, is an excellent format for learning and interacting with the community of practice, and it’ll be great to finally have this type of event here!

Speaking of AgilePT, this year’s conference (in Porto, May 31st) is its 10th anniversary, and a special program is being planned. Be sure to follow the site and social media for news as it comes out! The two conferences are separated by the weekend, so if you are planning on attending from abroad, the events provide a great opportunity to discover both cities (Porto and Lisbon) and visit the country over the weekend!

The CFP is open!

We are still working on the content lineup and have a Call for Presentations open on PaperCall. We are considering a set of 30-minute talks and 5-minute Ignite talks, as is usual at most DevOpsDays events. For the longer talks, we are really interested in curating them towards real-world experience reports with DevOps. We will also have open spaces, as we know those are among the best learning experiences DevOpsDays events offer. If you have a great idea for a talk, please do submit it. We hope to have a good mix of national and international stories, and hopefully a lot of learning throughout both days.

Sponsors!

The call for sponsors is also open at the moment. We know the job market here in Portugal is very active, so this event will, for sure, be a great way to get in contact and network with the local community of practice. It’s also a great way to show that your company is devoted to this mindset. We have set up an interesting set of options that can fit companies and brands of all sizes. Feel free to ask us about any of the options, too.

Tickets

You can already reserve your tickets for the event, and we are offering early-bird pricing at a very accessible 60€ until March 15th. Afterwards, the price goes up, but still to a very accessible value. Go to https://www.devopsdays.org/events/2019-portugal/registration/ to get yours today!

Follow us!

We’ve still got a lot of work ahead of us, but we sure are committed to creating a great event. Be sure to follow our DevOpsDaysPT Twitter account to get all the news as it comes out. We will be using most of the regular social media channels, as you would expect.

Hope to see you at the event!

Things Worth Watching #5 – Pacts to the Rescue (Beth Skurrie)

Beth Skurrie‘s “Pacts to the Rescue” presentation at Sydney API Days 2015 is worth the watch for an overview of Consumer-Driven Contracts and the use of Pact to implement them. One thing I was missing, and that I hadn’t seen until watching this, was the Pact Broker – a centralized service for publishing pacts and consuming them on the service provider side. A .Net implementation of the broker would also be nice, just to simplify integration, modification and ramp-up. I don’t know of any such implementation, but at least there is Pact-net, provided by SEEK-jobs, for using pacts in a .Net codebase.

video: http://www.infoq.com/presentations/pact

slides: http://pt.slideshare.net/bethesque/pact-44565612
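The core idea behind consumer-driven contracts is simple enough to sketch without any library: the consumer records the request/response it expects, and the provider is verified by replaying that recording against it. A minimal, hypothetical Python sketch (not Pact’s actual API):

```python
# A "pact" is just the consumer's recorded expectation of one interaction.
pact = {
    "description": "a request for user 42",
    "request": {"method": "GET", "path": "/users/42"},
    "response": {"status": 200, "body": {"id": 42, "name": "Alice"}},
}

def verify(pact, provider):
    """Replay the recorded request against the provider and check its response."""
    actual = provider(pact["request"])
    expected = pact["response"]
    return actual["status"] == expected["status"] and actual["body"] == expected["body"]

# A toy provider implementation under verification.
def provider(request):
    if request["path"] == "/users/42":
        return {"status": 200, "body": {"id": 42, "name": "Alice"}}
    return {"status": 404, "body": None}

print(verify(pact, provider))  # True: the provider honours the consumer's contract
```

In the real Pact workflow, the consumer’s test suite generates the pact file and a broker distributes it; the sketch only illustrates the verification idea.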

Visual Studio Solution and Package Versioning – Part 1

In my previous post, “SemVer, Team Development and Visual Studio Solutions”, I wrote down some of my thoughts about solution and application versioning. It’s been nearly two months since then, and after following the strategy I outlined, I can now discuss my experience with it. My team and I have felt some of the pains associated with it. Just as I was looking for a solution to my problem when I wrote it, I continue to search for fixes to other problems that have arisen.

I understand that that post is kind of TL;DR, but it pretty much states what my problem was (and still is), and a possible path forward. At this moment, I still think what was presented is valid and works. The problems I have encountered are essentially about tooling and concurrency. NuGet can be a REAL pain in many ways if you don’t understand how it works.

What I want to do now is go over the pain points I have felt and try to find solutions for them. Hopefully, by writing this, I can get my head around what I really need, straighten out the many ideas going through my mind, and help others do the same.

The “solution” I used…

The changes I applied, based on the previously mentioned post, were used across 3 VS solutions with 10, 60 and 90 projects each (to give a sense of the dimension of the problem). The most generic solution contained shared libraries and had the fewest projects; the others refer to different application components.

Constraints were applied to simplify some aspects of the process:

  • Any project with shareable library outputs was packaged into a NuGet package. Package generation was aided by OctoPack, applied through NuGet.
  • Two NuGet repositories were used: a local repository on the developer’s machine – a dev could at any moment create a package and consume it without affecting other devs’ efforts during development – and a second feed for packages built and published by CI. These are generally considered official.
  • NuGet configuration was contained in the solution, via a nuget.config file in the .nuget solution folder. The local repo folder was a sibling directory of the solution folder, referenced by relative path. This way, pulling from source control would be enough to get dev machines connected to the repos.
  • SemVer was used as far as .Net assemblies permit, using only the first 3 parts (major, minor and patch). The revision number was generally 0, except when we needed to regenerate a package. I seriously wanted to use SemVer because it makes so much sense, but there are problems with that, which I hope to highlight shortly.
  • To simplify package management and dependency control within a solution, ALL projects shared the same version number, stored in each project’s AssemblyInfo.cs file and updated using Zero29.
  • Because we used OctoPack, package version numbers were identical to the assembly versions and consistent throughout the solution.
  • EVERY packaged project has a nuspec file, to enforce file inclusion, dependency requirements, and metadata (except for the version number). Dependencies can be VERY problematic, and care must be taken to keep them up to date consistently.
  • A “tools” solution folder was added to each solution, with a “BuildAndPack” folder. This folder contained a series of .bat files to simplify some steps, such as version increment tasks, nuspec file management, and inter-solution package updates (which I’ll get to shortly).
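To illustrate the configuration bullet above, a nuget.config along these lines (paths and the feed URL here are placeholders, not our actual values) registers both feeds relative to the solution:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- local dev feed: sibling folder next to the solution folder -->
    <add key="LocalRepo" value="..\LocalRepo" />
    <!-- official feed with CI-built packages (URL is a placeholder) -->
    <add key="TeamFeed" value="http://ci.example.local/nuget" />
  </packageSources>
</configuration>
```

Because the local feed is addressed by relative path, the file works on any dev machine straight after a pull.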

Devs became responsible for controlling the versioning process. Any time a change was made in a solution, the dev would need to increment the version number, then build and pack locally. With these local packages, they could update the downstream projects with the changes in the dependency packages and validate the integration. This allowed them to test their changes and their effects in the downstream project (which in many ways drives changes in upstream projects).
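The version-increment step boils down to rewriting the version attributes in every AssemblyInfo.cs in lock-step. A simplified sketch of what such a script does (this is an illustration, not Zero29 itself or our actual .bat tooling):

```python
import re

# Matches AssemblyVersion("x.y.z") and AssemblyFileVersion("x.y.z[.r]") attributes.
ASSEMBLY_VERSION_RE = re.compile(
    r'(Assembly(?:File)?Version\(")(\d+)\.(\d+)\.(\d+)(\.\d+)?("\))'
)

def bump(content, part="patch"):
    """Increment one SemVer part in all version attributes, resetting revision to 0."""
    def repl(m):
        major, minor, patch = int(m.group(2)), int(m.group(3)), int(m.group(4))
        if part == "major":
            major, minor, patch = major + 1, 0, 0
        elif part == "minor":
            minor, patch = minor + 1, 0
        else:
            patch += 1
        return f'{m.group(1)}{major}.{minor}.{patch}.0{m.group(6)}'
    return ASSEMBLY_VERSION_RE.sub(repl, content)

source = '[assembly: AssemblyVersion("1.2.3.0")]'
print(bump(source, "minor"))  # [assembly: AssemblyVersion("1.3.0.0")]
```

A real script would walk the solution folder and apply `bump` to every AssemblyInfo.cs file, which is exactly why two concurrent bumps collide on the same lines.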

Still, we tried our best to simplify and automate each step, to minimize the number of decisions and manual steps required. There was also a strong attempt to keep the solution self-contained, so that any member could get all the necessary tooling components and scripts with a pull from source control.

Evolution

The solution in place isn’t yet tuned to what we as a team require. It’s a start, but it needs fixes for the many pain points that are still felt. I relate them mainly to tooling and change concurrency.

I mention tooling, because NuGet has a set of default behaviors that are somewhat problematic. For instance, consider:

  • a project A that consumes packages B and C.
  • B defines C as a dependency, and its nuspec says it works with any version of C from 1.0 to 2.
  • project A currently consumes v1.0 of B and C.
  • A new version of B and C exists at v1.1.

If you update package B in project A, you would probably expect C to get updated too. Unfortunately, it doesn’t. A already has a version of C, and NuGet sticks with the lowest available version of C that satisfies the constraint. You would end up with B at 1.1 and C at 1.0. Two ways to get C up to date are to update C explicitly, or to change B’s nuspec to demand C at 1.1 as a minimum. NuGet doesn’t support the latter out of the box, so I had to develop a tool to iterate over the nuspecs in a solution and update minimum package versions, especially for dependencies within a solution, because all packages generated from a solution share the same version.
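The nuspec-updating tool works roughly like this (a simplified sketch, not the actual implementation): for each nuspec, raise every intra-solution dependency’s minimum version to the current solution version. Note that real nuspec files carry an XML namespace, omitted here for brevity:

```python
import xml.etree.ElementTree as ET

def update_min_versions(nuspec_xml, solution_packages, new_version):
    """Raise the minimum version of intra-solution dependencies to new_version."""
    root = ET.fromstring(nuspec_xml)
    for dep in root.iter("dependency"):
        if dep.get("id") in solution_packages:
            # In NuGet, a plain version string means "this version or higher".
            dep.set("version", new_version)
    return ET.tostring(root, encoding="unicode")

nuspec = """<package><metadata><dependencies>
  <dependency id="C" version="1.0" />
  <dependency id="External" version="3.1" />
</dependencies></metadata></package>"""

# "C" is in this solution, so its minimum is bumped; "External" is left alone.
updated = update_min_versions(nuspec, {"C"}, "1.1")
```

Running this over every nuspec right after the version bump keeps intra-solution minimums aligned, so a package update pulls the sibling packages forward as well.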

Concurrency-wise, when two devs update the version of the assemblies within a solution, every AssemblyInfo.cs file is changed on the same lines, and concurrent changes generate merge conflicts in source control. If changes are infrequent, resolution is more or less quick and easy; if they are frequent, devs get stuck in merge resolution. The same thing happens to .csproj files, since the path to each package contains its version number.

Series of Problems and Solutions

I want to tackle these problems in a series of posts, finding workarounds or solutions for them. As they get written I’ll add the links to the list:

  • Version increments and minimizing merge conflicts
  • Keeping intra-solution package dependencies aligned
  • Keeping inter-solution package dependencies aligned
  • Keeping things aligned in a NuGet package update

I’ll also try to keep the tool set used in a GitHub repo @ https://github.com/MiguelAlho/NuSpecDependencyUpdater and eventually as NuGet packages (once I figure out how the public feed works).