Recordings
Jan 27, 2023
New Year, New NEXT
Read more
With NEXT, bringing the customer into every product decision has never been easier. We're starting 2023 with some major product announcements to help you double down on the customer:
Projects are now Teamspaces and better reflect the continuous nature of discovery work.
We're introducing Stories: stitch together highlights into digestible videos that you can share everywhere.
New navigation elevates Recordings, Highlights, and Stories for easier access.
In other news: highlights now have reels; transcripts are 10x faster and better, with karaoke mode and a floating player; and tons more.
Recordings
Nov 20, 2022
Karaoke mode is here 🎤
Read more
Following along with the transcript while listening to your audio or video recording has never been easier! We're introducing karaoke mode, which marks every word in the transcript as you're listening to a recording. Karaoke mode is especially helpful while you're highlighting and trying to clip very specific pieces of your interview.
Besides following along, karaoke mode also lets you control the video via the transcript. You can now click on any word in the transcript and the video or audio recording will jump to that part instantly. No more frantic scrubbing in the scroll bar or video player!
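Under the hood, a feature like this only needs per-word timestamps. Here's a minimal sketch of the general idea (our assumption, not NEXT's actual code): map the playback position to the active word for the karaoke cursor, and a clicked word back to a seek time.

```python
from bisect import bisect_right

# Minimal sketch of karaoke-style syncing, assuming the transcription
# engine returns a start timestamp (in seconds) for every word.
words = [("Hello", 0.0), ("and", 0.4), ("welcome", 0.6), ("everyone", 1.1)]
starts = [start for _, start in words]

def active_word(starts, position):
    """Index of the word being spoken at `position` seconds (the karaoke cursor)."""
    return max(bisect_right(starts, position) - 1, 0)

def seek_time(words, index):
    """Playback position to jump to when the user clicks word `index`."""
    return words[index][1]
```

As the player emits time updates, `active_word` tells the UI which word to mark; a click on a word calls `seek_time` and sets the player's position.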
Recordings
Nov 18, 2022
Announcing AI-based sentiment analysis 🤩
Read more
NEXT is announcing AI-based sentiment analysis for audio and video transcriptions.
From now on, NEXT helps you identify positive or negative emotions in your customer interactions faster than ever before with AI-based sentiment analysis at the speaker level.
Apps that offer sentiment analysis usually analyze the sentiment of an entire interview. Although this may be quite accurate, it doesn't capture the varying sentiments across different parts of an interview or usability test. At NEXT, we do things a little differently. Instead of analyzing the whole document, we analyze sentiment for each speaker section in your transcript. This more detailed analysis makes it much easier to identify unmet customer needs and aspirations in the interview.
Each answer to a question you asked during an interview is analyzed, broken down, and tagged as either positive or negative. The algorithm picks up words and word combinations, weighs positive against negative, and marks sections that are more than 90% likely to be positive or negative with a smiley in the transcript.
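As a rough illustration of per-section scoring with a confidence threshold, here is a toy lexicon-based sketch. This is only to build intuition; the wordlists, the scoring, and the threshold logic below are our simplification, not NEXT's AI model.

```python
# Toy lexicon-based sketch of per-speaker-section sentiment (illustrative
# only; the production model is AI-based and far more sophisticated).
POSITIVE = {"love", "great", "easy", "helpful"}
NEGATIVE = {"hate", "confusing", "slow", "frustrating"}

def section_sentiment(text, threshold=0.9):
    """Label one speaker section, or return None when confidence is low."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    if total == 0:
        return None          # no sentiment-bearing words found
    if pos / total > threshold:
        return "positive"
    if neg / total > threshold:
        return "negative"
    return None              # mixed or uncertain: show no smiley
```

Each speaker section gets scored independently, which is what lets varying sentiments within one interview surface instead of being averaged away.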
Sentiment analysis becomes very useful when synthesizing large amounts of unstructured data like interviews, and helps you create shareable highlights and stories faster than ever before.
Go to NEXT, upload a recording of a customer interaction and try it out!
Recordings
Oct 20, 2022
Intuitively add, change & rename speakers ✏️
Read more
Naming speakers in a transcript makes it much easier to follow and interpret the transcript. We just made renaming speakers a lot more intuitive. From now on, you can simply click on a speaker in the transcript (e.g. "Speaker 1") and rename it everywhere at once.
You can also re-assign speakers or add new speakers in case the transcription algorithm didn't pick up all speakers from an interview.
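One way a rename can propagate everywhere at once is a normalized data model: segments reference a speaker id, and display names live in a single lookup table. The structure below is our assumed sketch, not NEXT's actual schema.

```python
# Sketch of why one rename shows up everywhere: segments store a speaker
# id, and the display name lives in exactly one place.
transcript = {
    "speakers": {1: "Speaker 1", 2: "Speaker 2"},
    "segments": [
        {"speaker": 1, "text": "Thanks for joining!"},
        {"speaker": 2, "text": "Happy to be here."},
        {"speaker": 1, "text": "Let's dive in."},
    ],
}

def rename_speaker(transcript, speaker_id, name):
    transcript["speakers"][speaker_id] = name  # one write, reflected everywhere

def reassign(segment, speaker_id):
    segment["speaker"] = speaker_id  # fix a mis-attributed segment
```

Re-assigning a segment or adding a speaker the algorithm missed is then just another entry in the lookup table.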
Highlights
Sep 29, 2022
@-mention your teammates to share highlights
Read more
Sharing highlights with your teammates just became a whole lot easier. From now on, simply @-mention someone who's part of your team in the highlight description. NEXT will automatically send the highlight via email to the person you mentioned. To view the highlight, click "View highlight" in the email.
Sharing highlights helps you surface relevant insights to your team, flag important findings that require a conversation, and build alignment.
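A mention flow like this can be sketched in a few lines. The handle format and the team data model below are our assumptions for illustration, not NEXT's implementation.

```python
import re

# Sketch: extract @-mentions from a highlight description and resolve
# them to the teammates who should get the notification email.
MENTION = re.compile(r"@([\w.-]+)")

def mentions_to_notify(description, team):
    """`team` maps a handle to an email address (assumed data model)."""
    return [team[h] for h in MENTION.findall(description) if h in team]

team = {"maria": "maria@example.com", "sam": "sam@example.com"}
mentions_to_notify("@maria this contradicts our pricing assumption", team)
```

Handles that don't match a teammate are simply ignored, so a stray "@" in the description can't trigger a bogus email.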
Highlights
Jul 12, 2022
Add context to your highlights with descriptions 💬
Read more
Over the past months, our users continuously shared with us the need to describe, share and discuss highlights they create in their audio- and video transcripts. They taught us: "Highlights aren't the interpretation of my work, but merely the marking of important text that needs to be interpreted, described, shared, and discussed with the team β this will increase our shared understanding of our users and customers."
We've heard you! That's why today, we're introducing a new concept to highlights, which includes the ability to:
Describe your highlights
You can now add meaningful context to the highlights you create, summarize them, and add your personal interpretation. This augments the highlights and makes them ready to be shared and consumed by others.
Tag highlights in context by typing "#" anywhere in the description
Tagging serves many purposes, such as making highlights easy to find across multiple interviews and creating clusters of relevant topics you're working on. You can now use tags in the context of a description or interpretation of a highlight to give them more meaning.
Quickly see who created a highlight, and when
To initiate a conversation with your team about what you discovered, it's important to know who created a highlight. We now show a timestamp on the highlight, as well as the name of its creator. Take collaboration to the next level.
Seamlessly add highlights as evidence to another note, or create a new note
A lot of the work teams do happens inside a note: capturing and interpreting a user interaction. We just brought the entire value chain of NEXT (capture > interpret > share) inside a single note. Simply click the "..." button on a highlight and choose to add it as evidence to an existing note (e.g. an assumption), or create a new note to start a cluster of highlights (e.g. an insight).
Recordings
Apr 19, 2022
Improve transcription accuracy with your own vocabulary
Read more
Many organizations or teams have their own abbreviations, jargon, and terminology. This team- or organization-specific language is sometimes hard for the transcription algorithm to get right.
To improve the accuracy and quality of transcripts, even in the toughest situations, we're introducing the option to add your own vocabulary at the project level. Simply click the arrow next to your project's name in the header and click "Transcription vocabulary". Here you can add words, abbreviations, jargon, and more, which will skew the odds of getting them transcribed correctly in your favor ✨
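To build intuition for how a custom vocabulary can "skew the odds", here is a deliberately simplified rescoring sketch. Real speech-to-text engines bias candidates inside the model itself; the candidate list, scores, and boost value below are made up for illustration.

```python
# Toy sketch of vocabulary biasing: nudge up the score of candidate
# transcriptions that contain your project's terms.
def pick_transcription(candidates, vocabulary, boost=0.1):
    """`candidates` is a list of (text, score) hypotheses from the engine."""
    def biased(item):
        text, score = item
        hits = sum(term.lower() in text.lower() for term in vocabulary)
        return score + boost * hits
    return max(candidates, key=biased)[0]

# Without the vocabulary, the engine would mis-hear the jargon.
candidates = [("we ship with cake eye", 0.52), ("we ship with CI/CD", 0.48)]
pick_transcription(candidates, ["CI/CD"])
```

With "CI/CD" in the vocabulary, the second hypothesis gets boosted past the acoustically likelier mis-hearing.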
Highlights
Apr 7, 2022
Jira and GitHub integrations are here ✨
Read more
Product discovery helps teams make better, more informed product decisions and provides the necessary input to build products users love. Many teams who use NEXT for discovery are part of, or work closely with, the development team. For the development workflow, these teams typically use specialized tools such as Jira or GitHub.
From now on, you can easily close the discovery-to-development loop by feeding all your discovery data directly into the development workflow. Simply create Jira or GitHub issues from a note or highlight in NEXT by clicking the "Share" button. This adds the content of the note or highlight to the issue, which serves as context for the delivery team.
Use cases
There are many use cases for the NEXT x GitHub/Jira integrations. Here are a few to get inspired:
Create draft user stories in NEXT based on customer insights (e.g. highlights) and create Jira/GitHub issues from these stories with a single click
Found something interesting in a customer interview? Create a highlight, then create an issue in Jira/GitHub as a follow-up action
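For intuition, here's a sketch of what turning a highlight into a GitHub issue could involve. GitHub's REST API accepts `POST /repos/{owner}/{repo}/issues` with a JSON `title` and `body`; the highlight fields and link below are hypothetical, not NEXT's actual payload.

```python
# Sketch: build a GitHub issue payload from a highlight so the quote and
# its source link travel along as context for the delivery team.
def issue_payload(highlight):
    return {
        "title": highlight["summary"],
        "body": f"> {highlight['quote']}\n\nSource: {highlight['link']}",
    }

highlight = {
    "summary": "Users can't find the export button",
    "quote": "I looked everywhere for export and gave up.",
    "link": "https://example.com/highlights/123",  # placeholder deep link
}
payload = issue_payload(highlight)
# POST this as JSON to https://api.github.com/repos/{owner}/{repo}/issues
# with an Authorization header, e.g. requests.post(url, json=payload, ...).
```

Quoting the customer verbatim and linking back to the source keeps the issue grounded in evidence rather than a paraphrase.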
Want to know more? Take a look at our Help Center or get in touch!
Highlights
Mar 2, 2022
Share deep links to highlights ✨
Read more
Highlights are extremely relevant pieces of data that often need to be shared with others in your team. However, sharing just the highlight is not enough. To fully understand the highlight, you'll need the context of the transcript in which it was found.
From now on, you can easily copy a deep link to a highlight (either from within a note or from the highlights list) and share it with anyone. Where the sharing medium supports it (for example MS Teams or Slack), a preview of the highlight is created. Once someone on your team clicks the link, the note in which the highlight was created opens and scrolls to the place in the text where the highlight lives.
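A deep link of this kind typically combines the note's URL with a fragment identifying the highlight, so the client knows where to scroll. The URL shape below is hypothetical, purely for illustration.

```python
# Sketch of a highlight deep link: note URL plus a fragment the client
# can use to scroll to the highlighted span in the transcript.
def highlight_deep_link(base_url, note_id, highlight_id):
    return f"{base_url}/notes/{note_id}#highlight-{highlight_id}"

highlight_deep_link("https://app.example.com", "n42", "h7")
```

Because the link resolves to the full note, the reader always lands with the surrounding transcript context, not an isolated quote.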
Boards
Jan 20, 2022
Real-time collaboration is here!
Read more
Collaborate with your team in real-time on the same note: co-create, co-highlight, and more!
Product discovery involves lots of collaboration, both in real-time and asynchronously. From now on, you can collaborate in real-time in any note in NEXT. Go into a note with your team at the same time to highlight & tag, take notes, and more! Now smoother and faster than ever before.
On top of real-time collaboration, your data is saved in real-time as well! No need to worry about saving notes when leaving your computer for a while: your data will be saved, safe and secure.