
Engineering Technical/Peer Review - Best in Class Methodology?

  • 29-01-2011 11:33pm
    #1
    Registered Users Posts: 1,260 ✭✭✭


    Hi All,

    I am looking for some input from members with experience in participating in or creating Engineering Technical/Peer Review processes.

    I am reviewing our current process as I believe that we could be getting a more effective review for the investment in time and energy that we are putting into reviewing our work.

    Process Steps:

    • Design Complete
    • Review Team Selected (4 Mandatory: 2 x QA, 2 x ENG, plus Optional Functions)
    • Design Deliverable Material Distributed (5 Days Prior to Review)
    • Review Held (Initial Physical Review Meeting for On-Site, Teleconference Offsite, Follow up with Non-attendees via email)
    • Action Item closure
    • Review Closure



    In particular, I am interested in how others conduct Technical/Peer Reviews as part of their Engineering Change management processes.

    Key Questions:

    • Who are the mandatory/optional participants, and are roles assigned?
    • Does line management attend the reviews?
    • How do you run the review: physical meeting, teleconference, pass-around, desk review, etc.?
    • How is the review material distributed: hard copy, email, SharePoint?
    • How do you handle approvals, action items, re-reviews, and go/no-go decisions?
    • Do you measure the effectiveness of your reviews, and if so, how?




    Do you have a best-in-class system in place?

    If so, I would appreciate it if you could share some of your experiences of conducting effective and timely reviews of engineering changes and projects.

    (PMs welcome from those with expertise or processes to share.)

    I will post a copy of the final process to all who contribute positively to its creation.

    Regards,

    Derek


Comments

  • Registered Users Posts: 1,260 ✭✭✭Irish_Elect_Eng


    Note:

    1 x Design QA
    1 x Mfg QA


  • Registered Users Posts: 1,260 ✭✭✭Irish_Elect_Eng


    Off-line input would be appreciated also.

    The industry is the medical device sector.


  • Registered Users Posts: 1,622 ✭✭✭Turbulent Bill


    For business reasons this will be pretty general, but should give an overview of a process. Not in the medical device industry myself.

    A product development programme has a dedicated team assigned to it, with at least one specialist from each area (e.g. design, QA, manufacturing in your case). The complete development cycle is broken down into three or four milestones, with specific deliverables at each milestone which must be reviewed and approved. Mandatory and optional reviewers are assigned for each deliverable at each design stage, with oversight from management and other groups as needed.

    All review material is placed in a controlled network location, and comments from reviewers are put in the same location before the sit-down review takes place. The key thing is that only the comments are reviewed at the meeting, not the actual deliverables, which keeps things to the point. Comments are accepted or rejected, with detailed follow-up after the meeting.

    Of the various systems I've used, this is by far the most effective.
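
    As a rough sketch only (not our actual tooling; the milestone, deliverable, and group names below are made up for illustration), the milestone/deliverable/reviewer mapping could be captured as simple data, e.g. in Python:

    from dataclasses import dataclass, field

    @dataclass
    class Deliverable:
        name: str
        mandatory_reviewers: list[str]          # must approve before the milestone can close
        optional_reviewers: list[str] = field(default_factory=list)

    @dataclass
    class Milestone:
        name: str
        deliverables: list[Deliverable]

    # Hypothetical programme structure, for illustration only.
    programme = [
        Milestone("Feasibility", [
            Deliverable("Concept design report", ["Design", "Design QA"], ["Manufacturing"]),
        ]),
        Milestone("Development", [
            Deliverable("Detailed design package", ["Design", "Design QA", "Manufacturing"]),
            Deliverable("Risk analysis", ["Design QA", "Mfg QA"]),
        ]),
        Milestone("Qualification", [
            Deliverable("Verification report", ["Design QA", "Mfg QA"], ["Regulatory"]),
        ]),
    ]

    # Sanity check: every deliverable has at least two mandatory reviewers.
    for ms in programme:
        for d in ms.deliverables:
            assert len(d.mandatory_reviewers) >= 2, f"{ms.name}/{d.name} needs more reviewers"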


  • Registered Users Posts: 1,260 ✭✭✭Irish_Elect_Eng


    Turbulent Bill wrote: »

    Thank you for your input.

    Stage reviews, as you have described, are a good idea, with defined deliverables at each stage: planning, feasibility, development, qualification, etc. Given the variety of our projects, some would only include a subset of the stages, but that is workable.

    Team selection is critical, as is functional management buy-in to commit their resources to the project. This can be a sticking point depending on the project's pay-back to the various stakeholders.

    "Only the comments are reviewed at the meeting" is an interesting approach that I had not considered. Our practice has been to go through the review material at the meeting, which allows reviewers the option of not reviewing the material beforehand in the knowledge that it will be covered during the meeting. Your concept puts the responsibility squarely on the reviewers to pre-review and to bring their input to the meeting. I will have to think about this, as the concept will meet with resistance from QA, I think, but I like the personal accountability that it brings to the review process.

    I was considering using SharePoint for on-line document review; has anyone had experience of using SharePoint for this purpose?


  • Registered Users Posts: 1,622 ✭✭✭Turbulent Bill


    Irish_Elect_Eng wrote: »

    Dedicating people to specific project teams (rather than their functional groups) needs to be pushed by senior management; functional managers naturally won't want to lose control of their staff. It's worth noting that the team members can change for different project stages as they're required.

    Reviewing just the comments is especially important where the deliverables are large; having 10-12 people in a room reviewing a 150-page document would be nuts! As a mandatory reviewer you must supply comments using a common template (even to say you have no comments, or don't have a clue what something means), and these have to be submitted at least a few days before the review. The project leader then collates these, and usually common issues emerge. The review deliverables are also available at the review to refer back to if needed, but it's the reviewers' responsibility to be familiar with these before it starts.

    I've never been a big fan of SharePoint. We use a variety of open-source revision control tools, with comments submitted using Excel sheets.
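
    As a rough sketch of the collation step (purely illustrative; it assumes each reviewer submits an .xlsx sheet with columns Reviewer, Section, Comment, and Severity, which is not necessarily our real template), the project leader's job of pulling the sheets together could look like this in Python:

    # Illustrative only: collate per-reviewer comment sheets into one list before the review meeting.
    # Assumes each reviewer's .xlsx file has columns: Reviewer, Section, Comment, Severity.
    from pathlib import Path
    import pandas as pd

    def collate_comments(folder: str) -> pd.DataFrame:
        frames = []
        for path in sorted(Path(folder).glob("*.xlsx")):
            df = pd.read_excel(path)              # one workbook per reviewer
            df["Source file"] = path.name
            frames.append(df)
        combined = pd.concat(frames, ignore_index=True)
        # Sorting by section groups comments on the same material, so common issues stand out.
        return combined.sort_values(["Section", "Severity"])

    if __name__ == "__main__":
        all_comments = collate_comments("review_comments")
        all_comments.to_excel("collated_comments.xlsx", index=False)
        print(f"{len(all_comments)} comments collated from "
              f"{all_comments['Source file'].nunique()} reviewers")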


  • Registered Users Posts: 1,260 ✭✭✭Irish_Elect_Eng


    Turbulent Bill wrote: »

    Part of the difficulty is often gaining that commitment from the different functional groups, and protecting the project team from the everyday firefighting that can pull them away from the project, particularly where there is an unbalanced cost/risk/benefit between some of the groups in completing the project. I believe that any change to the review process will require 100% commitment from upper and functional management.

    I found this description of different types of review. I was thinking of specifying a couple of different acceptable review methods, to ensure that the review process is not cumbersome and reflects the required level of review.

    "A quality-driven organization will practice a variety of peer review methods, spanning a spectrum of formality, rigor, effectiveness, and cost. Let’s look at descriptions of some common review approaches.

    • An inspection is the most systematic and rigorous type of peer review. Inspection follows a well-defined multistage process with specific roles assigned to individual participants. Inspections are more effective at finding defects than are informal reviews. For example, inspections held on Motorola’s Iridium project detected 80% of the defects present, whereas less formal reviews discovered only 60% of the defects [2].
    • Team reviews are a type of "inspection-lite," being planned and structured but less formal and less rigorous than inspections. Typically, the overview and follow-up inspection stages are simplified or omitted, and some participant roles may be combined (e.g., moderator and reader).
    • A walkthrough is an informal review in which the work product’s author describes it to some colleagues and solicits comments. Walkthroughs differ significantly from inspections because the author takes the dominant role; other specific review roles are usually not defined. Walkthroughs are informal because they typically do not follow a defined procedure, do not specify exit criteria, require no management reporting, and generate no metrics.
    • In pair programming, two developers work on the same program simultaneously at a single workstation, continuously reviewing their joint work. Pair programming lacks the outside perspective of someone who is not personally attached to the code that a formal review brings.
    • In a peer deskcheck, only one person besides the author examines the work product. A peer deskcheck typically is an informal review, although the reviewer could employ defect checklists and specific analysis methods to increase effectiveness.
    • A passaround is a multiple, concurrent peer deskcheck, in which several people are invited to provide comments. The passaround mitigates two major risks of a peer deskcheck: the reviewer failing to provide timely feedback and the reviewer doing a poor job."


  • Registered Users Posts: 1,260 ✭✭✭Irish_Elect_Eng


    I have got some good feedback by PM. Thanks Guys.

    Does anybody have a good recommendation for a book on the subject?


  • Registered Users Posts: 1,260 ✭✭✭Irish_Elect_Eng


    Has anyone got any experience of using the IEEE software peer review process?

