10 Tips on how to think like a designer

A Presentation Zen article on design thinking. The same content applies in the context of a software architect.
Tips on how designers think:
(1) Embrace constraints.
(2) Practice restraint.
(3) Adopt the beginner’s mind.
(4) Check your ego at the door.
(5) Focus on the experience of the design.
(6) Become a master storyteller.
(7) Think communication not decoration.
(8) Obsess about ideas not tools.
(9) Clarify your intention.
(10) Sharpen your vision & curiosity and learn from the lessons around you.
(11) Learn all the “rules” and know when and why to break them.

Done-Done Checklist for “User Story” and “End of Iteration” in Scrum – Mind Map Diagrams

Why do we need agreement on a “Definition of Done” (Done-Done) in Agile development with Scrum?
The primary purpose of an agile project is to deliver value at the end of each sprint and release. Agile team members should share a common understanding of what a quality deliverable is.

A “Definition of Done” for iterations, releases, and every user story or feature makes that shared understanding clear and measurable to all team members. On paper, it is an audit checklist that drives quality completion of an agile activity. The entire process of creating, maintaining, reviewing, and updating the checklist, and making the required process, practice, and technology changes to enable Done-Done, helps establish the right standards and benchmarks for continuous improvement.
I have created mind maps listing the Done-Done items for a user story and for the end of an iteration. An agile team should brainstorm, identify, evaluate, and create its own Done-Done checklist; these maps are meant to help and guide the team in that process.

Done-Done for ‘User Story’ Mind Map


Done-Done for ‘End of Iteration’ (Sprint) Mind Map


Done-Done for User Story or Feature Checklist Points

Code Complete
    1. Code integration and merging complete.
    2. Required code refactoring done.
    3. Coding, architecture, and design standards/guidelines met.
    4. Design agreed – design and code reviews done and recorded for implementation and test code; identified code changes complete.
    5. Javadoc for classes and functions complete.
    6. Design complete and optimally documented.
    7. Unit tests written and executed successfully.
    8. Code merged with the current version of the code repository and checked in with appropriate comments after successful test runs.
    9. All story points covering functional and non-functional requirements addressed.
    10. Static code analysis complete; issues fixed.
    11. All //TODO items in the code completed.
    12. All known bugs related to the user story fixed.
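Item 11 above (no lingering //TODO markers) is the kind of checklist point a team can verify mechanically rather than by inspection. A minimal illustrative sketch (not from the original post; the pattern and sample code are assumptions for demonstration):

```python
import re

# Match a //TODO marker and capture any trailing note.
TODO_PATTERN = re.compile(r"//\s*TODO\b(.*)")

def find_todos(source: str):
    """Return (line_number, note) pairs for every //TODO left in the code."""
    todos = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        match = TODO_PATTERN.search(line)
        if match:
            todos.append((lineno, match.group(1).strip()))
    return todos

# Hypothetical source snippet used only to exercise the scanner.
sample = """\
int add(int a, int b) {
    // TODO handle overflow
    return a + b;
}
"""
print(find_todos(sample))  # [(2, 'handle overflow')]
```

A team could run a scan like this in the build and fail the Done-Done check when the result is non-empty.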
Test Complete
    1. Automated unit and integration tests written and run successfully.
    2. All acceptance criteria identified, discussed collectively, written, and tested successfully.
    3. Integration tests created and executed.
    4. Performance testing needs identified and executed (where possible).
    5. Functional testing done by QA (someone other than the developer).
    6. Non-functional (performance, reliability, scalability) tests related to the user story (or its integration with other user stories) identified and executed.
    7. Pending known defects (due to dependencies, etc.) discussed and agreed for action.
    8. Code coverage achieved by the automated tests measured and analysed against the accepted figures.
    9. All acceptance tests for the user story listed, discussed, and tested.
    10. All known bugs related to the user story tested.
Approved by Product Owner
    1. All user acceptance testing results approved.
    2. Product backlog updated with the completed user story.
    3. Product backlog updated with newly identified user stories or changed and agreed priorities.
Other User Story Done Agreements
    1. Required documentation related to the user story complete.
    2. Build and deployment scripts and related configuration files updated.
    3. Architecture and design docs/wiki updated.
    4. All team conversations related to the user story complete; agreement from the team achieved.
    5. All impacts on other pending user stories identified; new priorities (if any) identified and recorded.
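A checklist like the one above can also be kept as data, so that "done-done" for a user story is auditable rather than a matter of opinion. A minimal sketch (the class and item names are illustrative assumptions, not part of the post):

```python
from dataclasses import dataclass, field

@dataclass
class DoneDoneChecklist:
    """Illustrative model of a Done-Done checklist; items map description -> done?"""
    items: dict = field(default_factory=dict)

    def mark_done(self, item: str):
        self.items[item] = True

    def pending(self):
        """Items still blocking the story from being done-done."""
        return [item for item, done in self.items.items() if not done]

    def is_done_done(self) -> bool:
        return all(self.items.values())

story = DoneDoneChecklist({
    "Code merged and reviewed": False,
    "Unit tests written and passing": False,
    "Acceptance criteria tested": False,
    "Product owner approval": False,
})
story.mark_done("Code merged and reviewed")
print(story.is_done_done())  # False
print(story.pending())
```

The point is only that the team's agreed items become an explicit, reviewable artifact that can gate the story's completion.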

Done-Done for Iteration (Sprint) Checklist Points

All User Stories/Features Code Complete
 
1. Code integration and merging complete.
2. Required code refactoring done.
3. Coding, architecture, and design standards/guidelines met.
4. Design agreed – design and code reviews done and recorded for implementation and test code; identified code changes complete.
5. Javadoc for classes and functions complete.
6. Design for all user stories complete.
7. Unit tests written and executed successfully.
8. Code merged with the current version of the code repository and checked in with appropriate comments after test runs.
9. All story points covering functional and non-functional requirements addressed.
10. Static code analysis complete; issues fixed.
11. All //TODO items in the code completed.
12. Code is deployment-ready: environment-specific settings are extracted from the code base.
13. All known bugs for the user stories fixed.
Test Complete for All User Stories and Other Iteration Items
1. Automated unit and integration tests written and run successfully for all iteration items.
2. All acceptance criteria identified, discussed collectively, written, and tested successfully.
3. Integration tests created and executed.
4. Deployed to the system test environment and passed system tests.
5. Non-functional (performance, reliability, scalability, etc.) tests related to user stories (in integration with other user stories) identified and executed.
6. Functional testing done by QA (someone other than the developer).
7. Pending known defects (due to dependencies, etc.) discussed and agreed for action.
8. Code coverage achieved by the automated tests measured and analyzed against the accepted numbers.
9. All acceptance tests for user stories listed, discussed, and tested.
10. Compliance with standards (if any) verified.
11. All known bugs tested.
12. Build deployment and testing in the staging environment complete.
Design and Architecture Reviews
    1. Architecture and design enhancements discussed.
    2. Architecture and design decisions documented.
    3. Stakeholders (developers, product owners, architects, etc.) informed about the updates.
    4. Impacts of architecture and design updates on other user stories identified and recorded.

Sprint Retrospective Complete

    1. Sprint retrospective meeting complete.
    2. Progress on the last retrospective's actions reviewed.
    3. Definition of Done-Done reviewed; updates (if any) made.
    4. Action plan created from retrospective items; responsibilities owned.
Documentation

    1. Code coverage reports available.
    2. Required documentation related to all user stories complete.
    3. Product backlog updated.
    4. Code review records available.
    5. Release notes available.
    6. Architecture and design documents updated.
    7. Documentation change requests for pending items filed and fixed (if a documentation team member is part of the scrum team).
    8. Project management artifacts reviewed and updated.
Demonstrable

    1. Sprint items ready for demonstration.
    2. Sprint demonstrations for all user stories conducted for stakeholders.
    3. Feedback reviewed and recorded for planned actions – enhancement change requests or bugs filed.
    4. Stakeholders (the product owner in particular) explicitly sign off on the demonstration.
Build and Packaging 
    1. Build script and packaging changes fixed, versioned, communicated, implemented, tested, and documented.
    2. Builds through continuous integration – automated build, testing, and deployment.
    3. Build release notes available.
    4. Code change log report generated from the code repository.
Product Owner Tasks

    1. User acceptance tests passed for all user stories; accepted by the product owner.
    2. Product backlog updated with all completed user stories.
    3. Newly identified user stories or other technical-debt items added to the product backlog.

Other Tasks Complete

    1. All tasks not identified as user stories (technical debts) completed.
    2. IT and infrastructure updates complete.
    3. Stakeholders informed about the iteration release details.
    4. All sprint user story demonstrations reviewed and signed off by the product owner.

 

12 Critical problems with product development

The 12 critical problems with the product development orthodoxy:

1) Failure to correctly quantify economics
2) Blindness to Queues
3) Worship of Efficiency
4) Hostility to Variability
5) Worship of Conformance
6) Institutionalization of Large batch sizes
7) Underutilization of Cadence
8) Managing timelines instead of Queues
9) Absence of WIP constraints
10) Inflexibility
11) Noneconomic flow control
12) Centralized control

Excerpt from Donald Reinertsen’s book, The Principles of Product Development Flow.
A good book for understanding why and how lean and agile principles solve product development problems:
http://www.celeritaspublishing.com/PDFS/ReinertsenFLOWChap1.pdf
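Several of the problems above (blindness to queues, absence of WIP constraints, managing timelines instead of queues) come down to one relationship Reinertsen leans on heavily: Little's Law, which says average cycle time equals average work-in-process divided by average throughput. A tiny sketch with illustrative numbers (the figures are assumptions, not from the book):

```python
def average_cycle_time(wip: float, throughput: float) -> float:
    """Little's Law: average time in system = average WIP / average throughput."""
    return wip / throughput

# Illustrative: a team finishing 5 items per week.
print(average_cycle_time(wip=20, throughput=5))  # 4.0 weeks
print(average_cycle_time(wip=10, throughput=5))  # 2.0 weeks: halving WIP halves cycle time
```

This is why WIP limits, not tighter timelines, are the lever for shortening queues: at a given throughput, cycle time falls only when work-in-process falls.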

Design Thinking – Drive continuous innovation in Agile teams

An introduction to design thinking – an approach for driving innovation in agile teams.

What is Design thinking?

The term refers to a set of principles, spanning mindset, process, practices, and approaches, that can be applied to solve complex problems. It is a structured approach to generating and developing ideas for continuous innovation.


Article from Tim Brown on Design Thinking

RedHat on Design Thinking

Videos:
http://www.youtube.com/watch?v=UAinLaT42xY

Books:

Change by Design: How Design Thinking Transforms Organizations and Inspires Innovation by Tim Brown (Hardcover – Sep 29, 2009)

Designing for Growth: A Design Thinking Toolkit for Managers 

Design Thinking: Understanding How Designers Think and Work

Agile Testing – 7 Deadly Sins

A wonderful article on agile testing pitfalls from Brad Swanson: The Seven Deadly Sins of Agile Testing.


Sin #1: Waterscrumming
Sin #2: Separate QA team
Sin #3: Lack of Test-Driven Development (TDD) and Continuous Integration (CI)
Sin #4: Ignoring Test Failures
Sin #5: Unbalanced testing quadrants
Sin #6: Testing is one sprint behind coding
Sin #7: Separation of Requirements and Tests

International Conference on Agile and Lean Software Methods

The International Conference on Agile and Lean Software Methods concluded on 19th February. Overall, a great experience.
Scott Ambler’s talks were very informative. There was a good focus on lean and agile applied to highly regulated environments. Agile coaching experiences were discussed, and the lightning talks and discussions were very helpful. I had a great time.

WebRTC implementation from Ericsson

Video on WebRTC enabled real-time voice and video communication web application from Ericsson lab.

https://labs.ericsson.com/apis/web-real-time-communication

The API documentation is here – https://labs.ericsson.com/apis/web-real-time-communication/documentation

This is direct browser-to-browser communication with no intermediate server.

Swarm Intelligence – Can agile teams learn some principles of self-organization from swarm intelligence?

Swarm intelligence is the collective behavior of decentralized, self-organized systems, natural or artificial. Swarm-intelligent systems are typically made up of a population of simple agents interacting locally with one another and with their environment. The individuals follow very simple rules in a decentralized control environment, taking local actions to optimize the value of their interactions with the environment and with other individuals. This leads to the emergence of “intelligent” global behavior.
Examples: ant colonies, bird flocking, animal herding, bacterial growth, and fish schooling.
An article on swarm intelligence by Peter Miller in National Geographic magazine –
Self-organization (Wikipedia definition) is the process where a structure or pattern appears in a system without a central authority or external element imposing it through planning. This globally coherent pattern appears from the local interaction of the elements that make up the system; the organization is achieved in a way that is parallel (all the elements act at the same time) and distributed (no element is a central coordinator).
Understanding software development is a complexity-science problem. The complex behavior of an agile team is difficult to understand, and agile software development teams can learn from studying collective intelligence in nature. Very complex behaviors can be coordinated through relatively simple interactions, enabling emergent behaviors that cannot be created by individuals.
So can we develop swarm intelligence in agile teams? Yes, we can, but one important difference is that development teams are composed of knowledge workers, not dumb ants and bees. Knowledge workers can see the whole and understand how their individual actions can create something bigger than the sum of their individual efforts (systems thinking). Swarm intelligence as a theory still applies, but in a different way: a self-organized agile team requires a context, a goal, and carefully crafted constraints to align purpose and effort in the right direction.
Studying swarm intelligence and the theory of self-organization can help agile leaders better understand agile principles and practices.
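The "simple local rules produce coherent global behavior" idea can be seen in a toy simulation (a sketch for illustration only, not a model of real teams): each agent follows one rule, "move partway toward the average of your two neighbors", and the whole group converges on a shared position no individual chose.

```python
def step(positions, weight=0.5):
    """One round: each agent moves toward the mean of its two neighbors (ring topology)."""
    n = len(positions)
    new = []
    for i, p in enumerate(positions):
        neighbor_mean = (positions[(i - 1) % n] + positions[(i + 1) % n]) / 2
        new.append(p + weight * (neighbor_mean - p))
    return new

# Five agents starting at arbitrary positions; no agent sees the whole group.
agents = [0.0, 10.0, 4.0, 8.0, 2.0]
for _ in range(100):
    agents = step(agents)
print([round(p, 3) for p in agents])  # all agents converge on the group mean, 4.8
```

No central coordinator exists and no agent knows the global average, yet the purely local rule produces a globally coherent outcome, which is the essence of the emergent behavior described above.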