When to write in Gherkin?

Hi all, I understand that Gherkin can be used for BDD and this is what I’m writing in when reviewing stories etc.

Example Acceptance Criterion:
When I fill in the form with a location and submit the form, then an alert should be successfully created based on a location.

Scenario: Successfully create a location based alert
Given I fill in the location field
When I press submit
Then an alert should be successfully created based on a location

But what about this? Gherkin with specific test scenarios that are more technical in nature?

Scenario: Should not be able to create an invalid location based alert after deleting
Given I fill in the location field
And I press submit
And an alert should be successfully created based on a location
And I delete the alert
When I fill in the location field with an invalid location
And I press submit
Then an alert should not be successfully created based on a location

At the moment we are writing everything in Gherkin, but it feels a bit weird since in reality the product owner wouldn’t care about these specific scenarios. However, it doesn’t matter much for now, because no one sees the feature files other than the QA team.

I was wondering what other people do: do you write these tests in Gherkin? What format do you use, and do you separate them from your acceptance-level Gherkin scenarios?

Just to add to this, there is another situation I face as well, regarding testing email templates. Some of my UI tests end up looking like this, which looks kind of horrible.

Scenario: Test email template
Given I receive the email for my alert
Then I should see the header image as red
And I should see that the link A is in X style
And I should see that font size is 14
And I should see that the body is ‘Welcome to your alert, this is the best email in the world’

Or worse when I have less time:
Scenario: Test email template
Given I receive the email for my alert
Then it should match the mock

We use BDD for just about anything :slight_smile:
It’s a great way to agree on the topic and to avoid misunderstandings. Three-amigos meetings are easier if you all talk the same language.
If you create feature files anyway, why don’t you use them in your three-amigos sessions? Let the business validate them and share them with the devs.

Also, try to avoid things like “I”; we use personas to write our BDD scenarios.

Your example about the location field:
Given the customer fills in the location field
When the customer submits the request
Then an alert is created based on the location

Words like “should” or “a” are ambiguous and have to be avoided so that you don’t end up with a vague test. The alert being created is a success on its own, so “successfully” doesn’t need to be mentioned :slight_smile:
Avoid using AND too much. I use the following rule of thumb: if the scenario cannot be written in 5 lines, split it, or do example mapping to complete it.
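As a sketch (using a persona and data I just made up), your deletion scenario could be split into two short scenarios along these lines:

Scenario: Create a location based alert
Given the customer fills in the location field with Brussels
When the customer submits the form
Then an alert is created for Brussels

Scenario: Reject an invalid location after deleting an alert
Given the customer has deleted the alert for Brussels
When the customer submits the invalid location XYZ123
Then no alert is created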

Otherwise, if I follow your test case literally:
Given I fill in the following location: Brussels
When I click on submit
Then the location of Amsterdam is shown

This is also correct according to your test case, since Amsterdam is “a” location :slight_smile:

BDD is the perfect tool to rule out ambiguous testing and results.
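A less ambiguous version pins the data down, so the Amsterdam result above would fail (again, just a sketch with made-up data):

Given the customer fills in the location field with Brussels
When the customer submits the request
Then an alert is created for Brussels
And the location Brussels is shown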

Hope this helps a bit

Something that may help you is this article: https://automationpanda.com/2017/01/30/bdd-101-writing-good-gherkin/

It teaches you, with examples, how to write good Gherkin.

@kgysenbergs - thanks for the feedback, I will definitely try to take your Gherkin guidelines into account. We don’t actually have the three-amigos meetings at the moment; it’s more just Dev + Business. QA joined a bit later, so now I am writing the test cases based on the already established stories.

Yup, already know about this one, thanks for flagging it!

Question: do you even need to use personas if the feature file already says ‘As a …’ anyway?

You use Gherkin for anything? Are you sure? The example I gave for ‘Scenario: Should not be able to create an invalid location based alert after deleting’ is quite technical and more bug-specific. A more direct example could be something like ‘Check the database to see that the alert is created successfully’.

Gherkin can be used to write pretty much any kind of test: manual or automated; BDD or functional; or a mix of all of the above (not to mention that it is as suited to waterfall as it is to Agile or anything else). I tend to view it as a different way of writing the same formal test cases I used to write, for a time, back in the day: user, preconditions, test conditions, test data (Given, And, Examples), action (When), and one or more outcomes and postconditions (Then, And).
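As a rough illustration of that mapping (the steps and data here are made up, not from any real suite):

Scenario Outline: Create a location based alert
# user and preconditions / test conditions
Given a registered customer
And the alerts page is open
# action
When the customer submits the location <location>
# outcomes and postconditions
Then an alert is created for <location>
And the alert appears in the customer's list

# test data
Examples:
| location  |
| Brussels  |
| Amsterdam |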

What I have seen around Gherkin, and around other ideas that grew up with the whole Agile thing, is quite a lot of dogma. I think that limits thinking in terms of how/when/why some of these ideas are used (and some of them are very useful in situations that have nothing to do with tech/software development).

Thanks @darth_piriteze. What you are saying makes sense, and I can see how Gherkin can be used to write any kind of test. I am not saying that I am restricted by any rules around when I can write in Gherkin.

The question I am asking is a bit different: I am exploring whether or not I should include absolutely every test in a feature file. The thing is, I don’t think the product owner cares about technical tests; I gave an example in my original post (Scenario: Should not be able to create an invalid location based alert after deleting). This feels more like a tester thing.

In a nutshell, this is what I am facing:

  1. Write one feature file that includes the story-related tests for the acceptance criteria plus all of the technical tests? (This is what I am currently doing; I just put a tag called ‘@BugSpecific’ on anything that seems more technical in nature, as in the sketch after this list.) If the product owner reviews the feature file, they can just skip over the bug-specific stuff if they want.
  2. Split it out into two feature files, for example Calculator.feature and CalculatorTechnical.feature? This way, the product owner only needs to read Calculator.feature.
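For option 1, it would look something like this (sketch only, reusing the step wording from my original post; the tag name is just what I happen to be using):

Feature: Location based alerts

Scenario: Successfully create a location based alert
Given I fill in the location field
When I press submit
Then an alert should be successfully created based on a location

@BugSpecific
Scenario: Should not be able to create an invalid location based alert after deleting
Given I have deleted my location based alert
When I fill in the location field with an invalid location
And I press submit
Then an alert should not be created

Most Cucumber-style runners can filter on tags, so the product owner (or an acceptance-only run) could simply ignore anything tagged @BugSpecific.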

Admittedly, we are not yet at the stage where we actually have a product owner reviewing the scenarios (it all sits with the QA team). However, I am still curious and trying to get everyone’s opinions on how they have organised their tests.

Hopefully that makes sense.

Perhaps split your tests into a verification group and an acceptance group. All the ugly stuff (long lists of examples to more exhaustively test error conditions, for example) can go under verification, while acceptance tests just need to demonstrate that the acceptance criteria are met. This way, you might reuse some of the test cases, but with smaller example sets for acceptance.
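As a sketch of how that could look in a single outline (assuming a Cucumber-style runner that supports tags on Examples tables; the names and data are made up):

Scenario Outline: Create a location based alert
Given the customer fills in the location field with <location>
When the customer submits the form
Then the outcome is <outcome>

@acceptance
Examples:
| location | outcome       |
| Brussels | alert created |

@verification
Examples:
| location | outcome          |
| Brussels | alert created    |
| XYZ123   | validation error |
|          | validation error |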