

I chose to use a survey in addition to my interviews from part one. The interviews gave me a good range of insights into the challenges my target demographic faces. However, I needed a deeper level of understanding in several areas, given that the interviews had a strong work-life balance focus and time dedicated to whakawhanaungatanga. Areas I particularly wanted to understand more about included the participants' approaches and attitudes to digital tools and technology in their businesses, and what their main pain points were. 

In this section, I will talk about the process I used to create the survey, gather responses, and review insights. The findings themselves are discussed in the data and findings section.

In terms of using the survey, I found it valuable as an overall exercise: I gained user insights and learned how to make the process more effective and efficient for me to use in the future. I also explored a new tool, which was a useful exercise in itself. The end-to-end process did take longer than I anticipated, and I should have spent less time refining the information and questions, as I took that to a degree that was probably unnecessary. 

Preparing to Write the Survey  



Before writing the survey, I reviewed our previous sessions relevant to gaining user insights, most notably the sessions with Sarah Coddington and Melissa Wragge. I also asked for advice on the TFL Slack channel and at mahi tahi, which was helpful in getting recent and relevant experience from other students. I reviewed a number of survey tool sites as well. 


Writing the Survey 

I wrote a document with the key points and likely questions, to be clear on all the information I wanted to include. This was quite long, so I prioritised by thinking about what information I would actually be able to respond to in some way, or what I needed to inform my personas, rather than just what might be interesting. I reviewed other surveys with overlapping subject matter and questions, most notably MBIE's Understanding the Digital Capability of New Zealand Businesses, in which they had interviewed 2,280 New Zealand businesses (p. 7). If insights were already contained in that report, I either excluded them from my survey or used similar categories, so my own results would be comparable if so desired. 

Preparing to input the questions into the tool 

I created an Excel document with the survey questions. I formatted it into sections defined either by my research questions and sub-questions (for example, work-life balance) or by themes that came up during the interviews that I wanted to test with more data (for example, reasons they got into the business). I designed each question with the relevant scale (e.g. Likert, multi-choice, or open text) to help select the correct tool, understand the conditional flow required, and easily share it with others for feedback. I also thought about how the data would eventually be presented, so I could get the scales right to enable analysis. Finally, I included demographic questions to separate my specific target group (female small business owners with at least one child under 14) from all other respondents. 

What was I looking for?

I tried a number of different online survey tools, including Google Forms, SurveyMonkey, Typeform, and Jotform. Factors I considered were ease of use, both for entering questions and for respondents, and the ability to configure the survey to make it more engaging. I explored using a paid audience option to increase my participation rate. I experimented with SurveyMonkey Audience; however, I could not configure the specific group I wanted to engage, and it was also quite expensive, so I chose not to pursue it. It was good to have explored and understood that option for future ventures. 


Tool Selection 



What did I select?

I selected Typeform because it was engaging and personalisable. It was easy to use, and creating conditional questions was straightforward. In addition, I wanted to demonstrate to my users that there were different tools available beyond the standard go-tos of SurveyMonkey and Google Forms. Typeform also offered additional functions that could be useful for other ventures in the future, such as paying an additional fee to review drop-off rates. This would be particularly useful for consumer surveys. 


Entering the survey into the tool took a lot longer than I anticipated. Initially, that was down to learning the new platform, but even so, it took far longer than I thought it would. In future, I will review what is most important to ask before building out the survey.

Sharing and Getting Responses

Getting feedback 

​Once I had the survey in Typeform, I shared the draft with my advisor Felix and my cohort to get feedback, which was helpful in refining it. After working on it for so long, I found it quite hard to share publicly, but once I got over that, it was really helpful and a relief to be getting input.


Launching the survey

I decided to share the survey across a range of social media platforms and forums that I belong to. To create an engaging post that would invite people in and make them want to fill out the survey, I created this tree visual. It explained what the survey was for, why I was doing it, and an estimate of how long it would take (which turned out to be about five minutes short once people were actually completing it). I always like to add an element of myself into the drawings, along with the reason I am asking people to do something. I believe this aligns with the kaupapa Māori values of whakawhanaungatanga (building relationships) and koha (giving and receiving). 


[Image: tree visual used in the survey invitation post]


Where did I share it?

I shared the survey on several different social media platforms and forums to connect with a wide range of respondents. In the post, I asked people who were not business owners themselves to share it with others, as when I shared my post for interview participants earlier in the year I got quite a few responses through referrals. 

Channels included:

  • LinkedIn, on both my personal page and in the New Zealand Small Business forum.    

  • Facebook, on my own feed as well as in "She Owns It", a forum for female business owners that I have joined, and "Women Tech Founders NZ".

  • Instagram, at this point just my personal account rather than the page I had set up for my test business.

  • Slack, in the TFL channels and other small-business groups I am part of.

I acknowledge that anchoring on my own channels, and using only digital platforms, may have introduced a slight bias in the type of respondents I got. Given that I am most likely to be interested in early adopters, I was comfortable with this. ​​

[Image: updated graphic shared to boost responses]

Increasing my response rate

Responses were slow to start, with just a handful. They doubled over the week to around 10. At this point, I decided to try another graphic, as I had some feedback that the earlier one may have been a little confusing. I produced the one to the left and shared it on LinkedIn, given its large user base. I also used it on my website as an anchor for my research info page. One thing that had been useful from the initial responses was the timestamp showing when people had filled out the survey. There were two clear patterns: either early in the morning (around 7am) or later in the evening (around 9-10pm). I timed the post for the evening window, which would then also be visible in the morning for the early window.

Opportunities missed

I feel I missed an opportunity in terms of finding out where the responses came from. I could have added a survey question asking where respondents found the link, or explored embedding UTM parameters in the survey links to see which channel people entered from. This would have helped me understand where my users mainly engage. There may also have been an opportunity to compare the engagement levels of the different graphics as an A/B test. Both are things I may explore in the future. ​
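For illustration, giving each channel its own UTM-tagged link could look something like the sketch below. The survey URL and channel names here are placeholders, not my actual links:

```python
from urllib.parse import urlencode

# Placeholder survey address - the real Typeform link would go here.
SURVEY_URL = "https://example.typeform.com/to/survey"

def tagged_link(source: str, medium: str = "social", campaign: str = "survey-launch") -> str:
    """Build a survey link with UTM parameters so each channel can be told apart."""
    params = urlencode({
        "utm_source": source,      # e.g. "linkedin", "facebook", "instagram", "slack"
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{SURVEY_URL}?{params}"

# One distinct link per channel, so responses can be attributed to where they came from.
links = {channel: tagged_link(channel)
         for channel in ["linkedin", "facebook", "instagram", "slack"]}
```

Sharing the "linkedin" link only on LinkedIn (and so on) would then let the survey tool's analytics break responses down by source.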


Starting the analysis

When I initially started to analyse the responses, I had 12 usable surveys: eight in my target demographic, with the remainder providing a contrast. I was conscious of the small sample size in terms of determining my results, but I felt I could not afford the time it would take to get significantly more respondents. One of the important things for me was to understand the end-to-end survey process, everything from writing it, pitching it, and reviewing it through to what I need to know for personas, so I can use it again in the future. Given these factors, I simply took the small number of responses into account in how much weight I gave the insights within my project. As I started to review, there was quite strong alignment across the results, so I was relatively comfortable moving on. If I were using this for a business idea, I would seek additional respondents. 



How did I analyse the data?

My instinct was to review the data manually so that I could really engage with the answers and formulate insights. However, I first wanted to check with experts whether there was a smarter way to do this using technology. I checked with Sarah Coddington and a friend who is a research professional, and they both confirmed that a manual review was recommended, as I had thought. 

What tools did I use?

I tried a couple of different tools to analyse the data, rather than defaulting to Excel. In particular, I tried Airtable and Google Sheets, as they were integrated with Typeform. Airtable was an interesting tool to engage with and had some nice functionality around graphs and dashboards, but it didn't allow the analysis I was after with my data set. Google Sheets is good too, but in this case I ended up using Excel, as that is where my data analysis experience lies, and I did not have the time to spend learning a new tool (hmm, this sounds familiar from my survey participants and their views on new tools!). All the tools had different benefits, and I am pleased I explored other options. In particular, I can see how Airtable could integrate a lot of different components of a business together, so it will be one to keep in mind for further ventures. 


For my first pass through the data, I exported it to a Google Sheet and took a shortcut that ended up creating rework later. I did a lot of one-off manual notes and calculations, as I had thought that would be the extent of my responses. Fortunately, my response rate increased, which meant I needed to review the data again. This time I took a more systematic approach using formulas. I summarised the data and calculated percentages to give more succinct insights. I did the calculations for both groups (those meeting my participant requirements and those that didn't, to compare and contrast) as well as for the combined responses. From this, I pulled key themes out of each area into my Miro board for a more consolidated view, enabling me to see the similarities and differences that pointed to insights. 
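As a minimal sketch of the kind of group-wise percentage summary described above, using made-up column names and answers rather than my actual survey data:

```python
from collections import Counter

# Hypothetical responses: a flag for my target demographic plus one example question.
responses = [
    {"target_group": True,  "uses_digital_tools": "Yes"},
    {"target_group": True,  "uses_digital_tools": "No"},
    {"target_group": False, "uses_digital_tools": "Yes"},
]

def percentages(rows, question):
    """Percentage breakdown of the answers given to one question."""
    counts = Counter(r[question] for r in rows)
    total = sum(counts.values())
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()}

# Summaries for the target group, the contrast group, and everyone combined,
# mirroring the compare-and-contrast calculations done in the spreadsheet.
target   = percentages([r for r in responses if r["target_group"]], "uses_digital_tools")
contrast = percentages([r for r in responses if not r["target_group"]], "uses_digital_tools")
combined = percentages(responses, "uses_digital_tools")
```

In the spreadsheet this was done with formulas rather than code, but the idea is the same: one breakdown per group per question, so the groups can be compared side by side.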

Kaupapa Māori Approach 

I wanted to have a sense of whakawhanaungatanga/relationship building throughout the survey, as much as is possible via a survey tool. I wanted it to feel warm, so I shared a little about my project in the introduction, as well as in the social media posts where I shared the survey invitation.


In the opening, I acknowledged the gift of knowledge that respondents were sharing with me, with a fitting whakataukī on the front page of the survey. I also noted that I want to give something back and create a community of knowledge that returns value and provides support.

[Image: screenshot of the survey opening page with whakataukī]


I tried to take a conversational approach to the opening of the survey, as well as to the sections throughout. Typeform was great in allowing me to do this; it even had templates to help. For example, introducing myself, then asking people their name and incorporating it into a multi-choice question about how their day was. I made sure I had a conditional response depending on their answer: if they were not having a great day, I wanted to acknowledge that.

I hoped that the steps above would create a sense of ahurutanga/psychological safety for people to be open and generous with their sharing in the survey. As always, I have had kaitiakitanga/guardianship of their data in mind, keeping it secure and anonymous. 
