Protocol for collective research with consistency and flexibility
Comparing NVivo, MAXQDA and ATLAS.ti for coding transcripts from Teams or Zoom
There are various software tools and cloud-based applications to support individual research, such as ATLAS.ti, Dedoose, MAXQDA, NVivo (and NVivo 9 Server), QDA Miner, and Transana. The bigger challenge, however, is finding a good platform for collective research.
A few solutions proposed by teams are:
1. Merging software projects and the completed work after working individually.
2. Working serially and exporting work between team members.
3. Synchronous working by multiple users.
Annotate PDFs Collaboratively Using Google Drive
1. Upload a PDF to your Google Drive (New > File upload, or click-and-drag the PDF into your Google Drive).
2. Click the PDF to preview it.
3. (Optional) Click the share button in the upper right to add other people, or get a link to share.
4. Click the annotate icon in the upper right to start adding notes. Highlight text or illustrations throughout the document to comment on them.
This gives you the ability to annotate PDFs in place, rather than having to download each file, annotate it in a separate PDF reader, and upload it again.
Other collaboration tools: Google, Asana, Trello
2. Ways to collaborate in NVivo
Many projects involve multiple researchers working together—NVivo provides two ways to approach collaboration:
Share projects using NVivo Server—this is the best solution for teamwork, since everyone in your team can work on the same project at the same time. They can code, annotate and link source content, and have immediate access to the changes made by other team members.
Work in copies of a standalone project and merge them into a master project at appropriate intervals—making use of user profiles to track changes.
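The merge-into-master workflow can be sketched in code. The data structures below are hypothetical—NVivo's real project files are binary and are merged through the application's own import feature—but the sketch shows the essential logic: combine coding records from each copy while the user profile keeps each member's work attributed and distinguishable.

```python
# Illustrative sketch, assuming each project copy exports its coding as a
# list of records tagged with a user profile. This is NOT NVivo's file
# format; it only models the merge-and-track-changes idea.

def merge_projects(master, copies):
    """Merge coding records from project copies into the master project,
    skipping exact duplicates but keeping per-user attribution."""
    seen = {(r["user"], r["source"], r["node"], r["span"]) for r in master}
    for copy in copies:
        for record in copy:
            key = (record["user"], record["source"], record["node"], record["span"])
            if key not in seen:
                master.append(record)
                seen.add(key)
    return master

# Two team members coded the same passage in their own copies:
alice = [{"user": "Alice", "source": "interview1.docx",
          "node": "Trust", "span": (120, 180)}]
bob = [{"user": "Bob", "source": "interview1.docx",
        "node": "Trust", "span": (120, 180)}]

merged = merge_projects([], [alice, bob])
# Both records survive the merge: the user profile keeps them distinct,
# so the team can later compare how each member coded the passage.
```

Because records are keyed by user as well as by node and span, merging never silently collapses two members' coding of the same text—which is exactly what a later coding-comparison step relies on.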
While teams offer higher productivity and a richer perspective, they also present a number of management challenges. Early in a project it is important to determine the approach your team will take to:
Collecting and organizing data
Creating and cataloguing themes and topics (the node structure)
Coding the data
Whether you work with NVivo Server or collaborate in a standalone project, you might want to consider the following:
Appoint a team leader who will keep the team on track and make final coding decisions.
Have regular team meetings to discuss interpretations, address issues and assign tasks—record the outcomes in a memo.
Have each team member keep a memo to record their progress, including any hunches, suggestions or questions—you could also do this in a single ‘teamwork journal’.
Early on, have multiple team members code the same collection of sources, then compare coding (using coding stripes or a Coding Comparison query)—this can help ensure a consistent approach.
To start with, make a node hierarchy for each team member. After team discussion, you can refine, merge and reorganize.
Aim for a clear node structure and use descriptions (in node properties) to make the purpose of a node clear for all team members.
To help team members understand the meaning of nodes, create a codebook that lists the nodes and their descriptions—refer to Export a codebook for more information.
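At its core, a codebook is just a shared table of node names and descriptions. NVivo produces one through its own codebook export; the sketch below is an illustrative stand-in showing the kind of artifact the team circulates (the node names and CSV layout are made up).

```python
# A minimal sketch of a codebook: a table of node names and descriptions
# that team members can share. NVivo generates this via its own codebook
# export; the node names and CSV layout here are purely illustrative.
import csv
import io

nodes = [
    ("Trust", "References to trust between participants and institutions"),
    ("Trust\\Distrust", "Explicit statements of doubt or suspicion"),
    ("Barriers", "Obstacles participants report when seeking support"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Node", "Description"])  # header row
writer.writerows(nodes)                   # one row per node
codebook_csv = buf.getvalue()
```

Keeping the description column mandatory is the point of the exercise: a node name alone ("Barriers") rarely tells a new team member what does and does not belong under it.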
As the project progresses, see which nodes have been created or modified and by which team member—do this in Node List View or by running a Node Summary report.
While a common node structure is important for efficiency and reliability, it should remain flexible so that new insights and exciting ideas are not lost.
4. Coding together
If multiple researchers are coding the same data, you may be interested in the consistency of their coding. NVivo provides a number of ways to check consistency or coder reliability:
Run a Coding Comparison Query to determine the percentage of agreement and disagreement between coders.
Display coding stripes for users—you can open a data source and see the coding done by each researcher.
Filter the content of a node to see only the references coded by selected researchers.
Remember that inconsistency in coding is not necessarily negative— it may prompt productive debate and deeper insights into the data.
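The arithmetic behind a percentage-agreement check is simple, and seeing it spelled out helps teams interpret the number. The sketch below computes agreement for one node: for each text segment, did coder A and coder B make the same decision (both applied the node, or both did not)? NVivo's Coding Comparison query computes this per node (and also reports Cohen's kappa); the segment data here is made up.

```python
# Illustrative percentage-agreement calculation for one node, assuming
# each coder's decisions are reduced to one True/False per text segment.
# This mirrors the idea behind a Coding Comparison query, not its
# actual implementation.

def percent_agreement(coder_a, coder_b):
    """coder_a, coder_b: lists of booleans, one entry per segment,
    True if that coder applied the node to that segment."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical decisions for five segments of the same transcript:
a = [True, True, False, True, False]
b = [True, False, False, True, False]
print(percent_agreement(a, b))  # 4 of 5 segments match -> 80.0
```

Note that raw percentage agreement can look high simply because a node is rarely applied by anyone; that is why a kappa statistic, which corrects for chance agreement, is usually reported alongside it.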