
# Structure my specs


## Project Organization

How should you structure your specs? Great question! There's no absolutely right or absolutely wrong answer here, but here's the method that I've found to work well.

First, organize your specs into two projects: one for fast-running unit tests and a second for slower-running integration tests. If you are also doing end-to-end tests, put those in a third project.

Next, within each spec project, create a folder for each project you're going to write specs for. If your solution contains projects named Acme.Core, Acme.Data, and Acme.Utility, create folders named Core, Data, and Utility.

Within each of those folders, mirror the namespace/folder structure of the project you're going to test. If the full name of a class is Acme.Core.Factories.WidgetFactory, its unit tests should live in Acme.UnitSpecs.Core.Factories.WidgetFactorySpecs.

![Example spec project layout](https://files.readme.io/3MVe8NXTDWDayZYGmnpM_image.axd)

## File Structure

Again, there are no right or wrong ways to structure your specs. SpecsFor is flexible and can accommodate whatever conventions make sense to you.

That said, here's the approach that works for me. Let's assume you are going to write specs for a class called WidgetFactory. Create a new class named WidgetFactorySpecs. Within this class, create a **nested** class for each scenario you are going to test. For example, if the scenario is "when creating a new widget," you would create a nested class named `when_creating_a_new_widget`. Make your scenarios derive either from `SpecsFor<T>` directly **or** from a common spec base class that establishes common state.
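A minimal sketch of that convention might look like the following. (The `Widget` type and the `Create()` method are hypothetical stand-ins; the `Given`/`When` overrides, the `SUT` property, and the NUnit-style test method follow the usual SpecsFor pattern.)

```csharp
using NUnit.Framework;
using Should;
using SpecsFor;

public class WidgetFactorySpecs
{
    // One nested class per scenario, named in when_doing_something style.
    public class when_creating_a_new_widget : SpecsFor<WidgetFactory>
    {
        private Widget _result;

        protected override void Given()
        {
            // Arrange any state this scenario needs before acting.
        }

        protected override void When()
        {
            // SUT is the WidgetFactory instance SpecsFor builds for you.
            _result = SUT.Create(); // Create() is a hypothetical method.
        }

        [Test]
        public void then_it_returns_a_widget()
        {
            _result.ShouldNotBeNull();
        }
    }
}
```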
With this approach, you can easily and quickly toggle back and forth between a class and its specs, and you can run all the specs for a particular class directly from the file containing them.
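When several scenarios share setup, the base-class option mentioned above can be sketched like this. (Again, `Widget` and `Create()` are hypothetical; the abstract base class is just a plain `SpecsFor<T>` subclass whose `Given` establishes the shared state.)

```csharp
using NUnit.Framework;
using Should;
using SpecsFor;

public class WidgetFactorySpecs
{
    // Shared context for every scenario that needs a configured factory.
    public abstract class given_a_configured_factory : SpecsFor<WidgetFactory>
    {
        protected override void Given()
        {
            // Establish state common to all derived scenarios here,
            // e.g. setting up mocked dependencies.
        }
    }

    public class when_creating_a_widget_with_defaults : given_a_configured_factory
    {
        private Widget _result;

        protected override void When()
        {
            _result = SUT.Create(); // Create() is a hypothetical method.
        }

        [Test]
        public void then_the_widget_is_created()
        {
            _result.ShouldNotBeNull();
        }
    }
}
```

Deriving scenarios from a shared `given_...` base keeps the common arrangement in one place while each nested class stays focused on its own act and assertions.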