azure-devops, conditional-statements, azure-pipelines, multi-project, project-reference

How do I add a conditional attribute in my project file to detect whether the code is being built locally or in Azure DevOps?


We have decided that even though our data layer lives in a separate Azure DevOps project, we want to include its .csproj files in our solution so that we can debug through both the API and the data layer. We do not want to use NuGet, because you lose the ability to debug and make changes on the fly.

In Azure DevOps, I have set up the data layer pipeline to publish the .dll and .pdb files as pipeline artifacts, and I have those artifacts downloaded in the API pipeline. I have also excluded 'MyDataLayer*.csproj' from the build task, so that it only builds the projects in the pipeline's own repo. But when I build, I get this error:

Error CS0246: The type or namespace name 'MyDataLayer.Data' could not be found (are you missing a using directive or an assembly reference?)

It makes sense, since I excluded it. But how do I change the <ProjectReference> in the project file to point to the downloaded reference files?

I was thinking a conditional attribute in the project file might work, but I haven't found what kind of condition would allow this.

I think I'd like something like this... I've added the pseudo code below and, of course, changed the names of objects for security.

    <ItemGroup Condition="%where build is done through VS locally%">
        <ProjectReference Include="..\..\MyProject\MyDataLayer.Data\MyDataLayer.Data.csproj" />
        <ProjectReference Include="..\..\MyProject\MyDataLayer.Models\MyDataLayer.Model.csproj" />
    </ItemGroup>

    <ItemGroup Condition="%where build is done through Azure DevOps%">
        <ProjectReference Include="ArtifactPath\MyDataLayer.Data\MyDataLayer.Data.csproj" />
        <ProjectReference Include="ArtifactPath\MyDataLayer.Models\MyDataLayer.Model.csproj" />
    </ItemGroup>
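
For reference, MSBuild conditions can key off a property that only one environment sets, so perhaps something along these lines is what I'm after (a sketch only: BuildingInsideVisualStudio is a property Visual Studio sets for IDE builds, TF_BUILD is an environment variable Azure Pipelines agents set, and in the pipeline case a plain DLL reference to the downloaded files would be needed since only the .dll/.pdb are downloaded; I have not confirmed this covers every case):

    <!-- Sketch: BuildingInsideVisualStudio is set by VS builds; TF_BUILD is set by Azure Pipelines agents. -->
    <ItemGroup Condition=" '$(BuildingInsideVisualStudio)' == 'true' ">
        <ProjectReference Include="..\..\MyProject\MyDataLayer.Data\MyDataLayer.Data.csproj" />
        <ProjectReference Include="..\..\MyProject\MyDataLayer.Models\MyDataLayer.Model.csproj" />
    </ItemGroup>

    <ItemGroup Condition=" '$(TF_BUILD)' == 'True' ">
        <!-- Reference the downloaded DLLs instead of the project source -->
        <Reference Include="MyDataLayer.Data">
            <HintPath>ArtifactPath\MyDataLayer.Data.dll</HintPath>
        </Reference>
    </ItemGroup>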

Does anyone out there know how this can be done? Is there another way that I haven't found online to accomplish this?

I was expecting the Azure DevOps build to pick up the files that were downloaded and use those as references, but that didn't happen.

Maybe I'm downloading them to the wrong path?

My Tasks:

    - task: DownloadPipelineArtifact@2
      inputs:
        buildType: 'specific'
        project: 'xxxxxxxx-xxxx-xxxx-xxxxx-xxxxxxxxxxx'
        definition: '57'
        buildVersionToDownload: 'latest'
        itemPattern: '/**/?(*.dll|*.pdb)'
        targetPath: '$(Pipeline.Workspace)'
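
For completeness, the data layer pipeline publishes its output with a task along these lines (a sketch; the artifact name and staging path are illustrative, not the real ones):

    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Build.ArtifactStagingDirectory)'   # illustrative: wherever the .dll/.pdb files are staged
        artifact: 'DataLayerDrop'                         # illustrative artifact name
        publishLocation: 'pipeline'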

Solution

  • Thanks to @DrDoomPDX, I was able to fix this for our needs in the following way.

    First, I added a new build configuration by adding 'Pipeline' as an option. I use this for the API build pipeline so that it knows what to build, and the Data projects also have this new build configuration added to their project files.

    <Configurations>Debug;Release;Pipeline</Configurations>
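
    In each data project's .csproj this line goes inside the top-level <PropertyGroup>, roughly like this (a sketch; net6.0 is the target framework we use):

    <PropertyGroup>
        <TargetFramework>net6.0</TargetFramework>
        <Configurations>Debug;Release;Pipeline</Configurations>
    </PropertyGroup>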
    

    Then I added the following references to my data project(s) in my api.csproj file.

    <!--The 'Pipeline' config is for the Azure DevOps build structure. If additional projects are added, please follow this pattern.-->
    <Choose>
        <When Condition=" '$(Configuration)|$(Platform)' == 'Pipeline|AnyCPU' ">
            <ItemGroup>
                <Reference Include="MyData.Database">
                    <HintPath>..\SharedLib\net6.0\MyData.Database.dll</HintPath>
                </Reference>
            </ItemGroup>
        </When>
        <Otherwise>
            <ItemGroup>
                <ProjectReference Include="..\..\DataComponents\MyData.Database\MyData.Database.csproj" />
            </ItemGroup>
        </Otherwise>
    </Choose>
    

    Then, in my Azure pipeline build, I use the build configuration 'Pipeline', pull the artifacts from the previous pipeline build, and copy the DLLs to 'SharedLib\net6.0':

    variables:
      buildConfiguration: 'Pipeline'
    
      - task: DownloadPipelineArtifact@2
        inputs:
          buildType: 'specific'
          project: 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
          definition: '57' #alpha
          buildVersionToDownload: 'latest'
          targetPath: '$(System.DefaultWorkingDirectory)/Artifacts'
    
      - task: CopyFiles@2    # Copies the DLLs from the previous build to the folder where api.csproj expects them
        inputs:
          SourceFolder: '$(System.DefaultWorkingDirectory)/Artifacts'
          Contents: '**/MyData.*/bin/$(buildConfiguration)/net6.0/**'
          TargetFolder: '$(System.DefaultWorkingDirectory)/SharedLib/net6.0'
          flattenFolders: true 
          retryCount: '3' 
          delayBetweenRetries: '1000' 
          
      - task: DeleteFiles@1  #To Clean up the extra files
        inputs:
          SourceFolder: '$(System.DefaultWorkingDirectory)/Artifacts'
          Contents: '**'
          RemoveSourceFolder: true

    And then I build the API project; the DLLs from the DataComponents build are found in the SharedLib/net6.0 folder and incorporated into the API build. Now I can debug everything in one solution in VS2022 and fix code on the fly for the DataComponents, and the build will handle this situation.
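
    For reference, the build step itself is just the standard .NET build task pointed at that configuration, something like this (a sketch; the project path here is illustrative):

      - task: DotNetCoreCLI@2   # builds the API with the 'Pipeline' configuration
        inputs:
          command: 'build'
          projects: '**/MyApi.csproj'   # illustrative path; use the real API project
          arguments: '--configuration $(buildConfiguration)'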

    Caveat: Make sure you check in the DataComponents changes and build them first, then check in the API code, so that the DataComponents you updated are in the latest pipeline artifacts for the second build.

    Again, thanks to DrDoomPDX for the API project file solution that spawned these ideas. Now that it is working, I will turn this into a build template so that I can use it elsewhere too.

    I hope this post helps others as well. ~Velvet