Bring Your Own Catalog Item tool UI known issues
Published On Nov 06, 2024 - 7:20 AM

Known issues are available for the Bring Your Own Catalog Item tool.
The following are the known issues with the Bring Your Own Catalog Item tool when it is used from the user interface, grouped by provider:

Alibaba Cloud

User is not able to cancel discovery from the Catalog Admin page (feature is not implemented yet).
User is not able to delete multiple catalogs from the draft section.
Multiple catalogs cannot be retired simultaneously (feature is not implemented yet).
User is unable to unretire multiple catalogs (feature is not implemented yet).
User cannot delete multiple catalogs from the Retired state (feature is not implemented yet).
A new import job will not start if a previous job is still in progress.
When Import from File is performed after Import from Provider Account, the Alibaba Cloud history displays the name of the provider account for the catalog that was imported from file.
If an exported file is imported multiple times, the import only succeeds the first time; all subsequent attempts fail.
If some catalogs have been selected in the Draft tab and the user navigates to another tab, the catalogs are auto-selected.
After entering valid credentials (GitHub token) and selecting TestConnection, the user gets an error message saying "Invalid Credentials".
If you try to import a zip that contains files from multiple providers (such as Azure and Alibaba), then discovery gets completed for Alibaba files and fails for Azure files.

Amazon Web Services

If the user exports any catalog from a server/tenant and tries to import it again on the same tenant, the import will fail because the service ID must be unique for every catalog on a particular tenant. To onboard the same exported content on the same tenant/server, the user must add a unique service ID in all the required places in the exported content.
Parsing of an AWS native YAML template (with short-form intrinsic functions) is technically not possible in Node.js for cb-aws-catalog-int. The 'cfn-flip' Python library is the only library that supports AWS native YAML template conversion; this conversion is already supported in the CLI-based Bring Your Own Catalog Item.
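As a point of reference, the conversion described above can be reproduced with the cfn-flip Python library. The following is a minimal sketch, assuming a template that uses short-form intrinsic functions; the template itself is illustrative and not taken from this article.

# Minimal sketch: convert an AWS native YAML template that uses short-form
# intrinsic functions (!Ref) into plain JSON with cfn-flip
# (install with: pip install cfn-flip). The template below is illustrative.
from cfn_flip import to_json

YAML_TEMPLATE = """\
AWSTemplateFormatVersion: '2010-09-09'
Parameters:
  BucketName:
    Type: String
Resources:
  MyBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref BucketName
"""

# to_json() expands short-form intrinsics (!Ref -> {"Ref": ...}),
# producing JSON that standard tooling can parse.
print(to_json(YAML_TEMPLATE))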
Even if the icbCatalogRepository array in the icb_catalog_metadata.json file has multiple objects defined for the AWS provider with different sourcePath values, content will only be discovered from one sourcePath.
The catalog does not support a user-defined sourcePath:"somezipfile.zip" file, so integration does not receive valid zip content and the job fails. However, if users define sourcePath:"/" and place the .zip file containing native templates in this directory, then Bring Your Own Catalog Item is successful.
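The following is a minimal sketch of the sourcePath arrangement described in the item above. Only the icbCatalogRepository and sourcePath names come from this article; the provider key name and the surrounding structure are assumptions for illustration.

# Minimal sketch of an icb_catalog_metadata.json entry for AWS.
# sourcePath points at a directory ("/") and the .zip with the native
# templates is placed in that directory; sourcePath:"somezipfile.zip" fails.
import json

metadata = {
    "icbCatalogRepository": [
        {
            "provider": "AWS",   # assumed key name identifying the provider for this entry
            "sourcePath": "/"    # supported: a directory; pointing sourcePath at a .zip file name fails
        }
    ]
}

with open("icb_catalog_metadata.json", "w") as fh:
    json.dump(metadata, fh, indent=2)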
When users import an empty .zip file, the catalog does not send valid zip content to integration and the job fails.
Users are unable to re-import servicified content that has been deleted.
The Import Failed Count field in the job status email always shows the count as 0 even if failures occurred during the job.
The cancel discovery feature has not yet been implemented on the Integrations side.

Google Cloud Platform

An exported zip file will contain the exported catalogs in individual folders. Each folder will be named using the serviceID of the respective catalog.
The UI allows the user to export catalogs in the WIP state, but this format is not supported from the Integrations side. Therefore, the user will get a success message after exporting a WIP catalog, but the catalog will not be available in the zip file.
The source directory will contain one or more catalog directories for the catalogs that need to be created. Each directory under the source folder will be treated as an individual catalog:
├── my_source_dir
│   └── my_service
│       ├── file_name.yaml
│       ├── file_name_1.py
│       ...
│       └── file_name_n.py
Every zip file must have an icb_catalog_metadata.json file.
The exported zip files cannot be edited. They can only be exported from one tenant and imported into a different tenant.
The user can only edit the category and labels of a catalog.
While adding git credentials, the user cannot validate them.
For existing tenants, the user must explicitly set the discoverContent and manageSOEnabled flags to true.
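The following is a minimal sketch of the flag values described above. The flag names discoverContent and manageSOEnabled come from this article; where they are configured (tenant settings, provider settings, or the metadata file) is not specified here, so the enclosing structure is an assumption for illustration only.

# Minimal sketch: the two flags that existing tenants must explicitly enable.
# Only the flag names and the required value (true) come from this article;
# the surrounding structure is illustrative.
byoc_tenant_settings = {
    "discoverContent": True,   # must be explicitly set to true for existing tenants
    "manageSOEnabled": True,   # must be explicitly set to true for existing tenants
}
print(byoc_tenant_settings)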
If a user selects catalogs in the Draft tab and then navigates to another tab, the catalogs are auto-selected.
The option to cancel discovery is not yet available from the Integrations side.
Users cannot perform operations such as publish and delete on a group of catalogs. These actions must be performed on individual catalogs.
The Import Failed Count field in the job status email always shows the count as 0 even if there are failures in the job.
The maximum size of exported files is 10 MB. Any file above this size cannot be imported.
The user needs to re-upload templates for OOB because the multi-quantity disabled flag was added for the catalogs where it is not applicable. Therefore, after the EDIT operation, the user cannot change the multi-quantity field.
Even if the icbCatalogRepository array in the icb_catalog_metadata.json file has multiple objects defined for the GCP provider with different sourcePath values, content will only be discovered from one sourcePath.
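The following is a minimal sketch of the situation in the item above, assuming two repository objects for the GCP provider. Only the icbCatalogRepository and sourcePath names come from this article; the provider key name and the path values are illustrative.

# Minimal sketch: two objects with different sourcePath values for GCP in
# icb_catalog_metadata.json. Per the known issue above, discovery picks up
# content from only one of them.
import json

metadata = {
    "icbCatalogRepository": [
        {"provider": "GCP", "sourcePath": "my_source_dir"},        # discovered
        {"provider": "GCP", "sourcePath": "my_other_source_dir"}   # known issue: content from this path is not discovered
    ]
}

print(json.dumps(metadata, indent=2))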
A nested .zip file structure is not supported.
A new import job will not start if a previous job is still in progress.

IBM Cloud

Import of exported files fails intermittently with the following error: "60 Min in above mentioned time, as after 60 min discovery gets failed if not complete".
Each folder will be considered as one catalog while importing (folders under sourcePath in icb_catalog_metadata.json).
Importing a .zip that has more than one .tf file for the same catalog (main.tf, variables.tf, provider.tf) under sourcePath, without those files being isolated in folders, will fail.
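A layout that avoids the issues in the two items above might look like the following, with each catalog's Terraform files isolated in its own folder under sourcePath (folder and file names are illustrative):

├── my_source_dir
│   ├── catalog_one
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   └── provider.tf
│   └── catalog_two
│       ├── main.tf
│       └── variables.tf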
Importing of exported files to the same environment will fail because the catalog name must be unique in the environment.
Importing of exported content to the same environment will fail if the user only deletes the catalog from Draft status. The catalog must be Retired and then Deleted before a catalog with the same name can be imported.
The serviceOffering Group field while editing the catalog will not be used on the INT side.
Editing only the label will not change anything on the INT side.
When importing from a GitHub URL, the user must create another provider to do the discovery for the intended provider. The JOB history will appear on the intended provider, not the one that the user selects while doing discovery.
The icb_catalog_metadata.json file must specify the provider code as ibmcloud when doing discovery for IBM Cloud.
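The following is a minimal sketch of the requirement above, mirroring the metadata structure referenced elsewhere in this article. The value ibmcloud and the file name come from the article; the key that carries the provider code and the surrounding structure are assumptions for illustration.

# Minimal sketch: provider code for IBM Cloud discovery in
# icb_catalog_metadata.json. The "providerCode" key name is assumed; the
# article only states that the file must contain the provider code ibmcloud.
import json

metadata = {
    "icbCatalogRepository": [
        {
            "providerCode": "ibmcloud",    # assumed key name; the value must be ibmcloud for IBM Cloud discovery
            "sourcePath": "my_source_dir"  # each folder under this path is treated as one catalog
        }
    ]
}

print(json.dumps(metadata, indent=2))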
Exported data is not encoded, and therefore should not be edited.
Multiple provider export is not enabled.
Discovery cancel is not enabled.
The Base Pricing will be N/A for servicified templates on the Details tab for the catalog.
Catalogs need to be edited before being published in order to be functional. Therefore, SSL Certificate and Remote Exec will not be functional because editing the config is not yet supported.
For zero-price catalogs, a random string (the serviceID) is displayed as the catalog name in the Pricing pane on the Review Order page.
When a catalog is updated from draft to published/retired/deleted, the confirmation message displays the serviceID instead of the name of the catalog.

Microsoft Azure

If only JSON files are provided to be imported at the source path, they will be inserted with the JSON file name as the serviceName.
If some templates are invalid during import, the failed count increases and the Import JOB fails, but all the valid templates are servicified successfully.
If files are included in the import source folder with unsupported extensions, such files are ignored and not considered as templates.
During the import of exported templates, do not modify the contents of the .zip file because doing so might result in failures.
If catalogs already exist in the Azure DB with the same catalog name during import of exported templates, then those catalogs are skipped and not uploaded.
If the user deletes catalogs that are in the DRAFT or WORKINPROGRESS state from the Catalog Admin page, those catalogs are deleted from the catalog database, but they become stale in the Azure content server database.
When updating the price of DRAFT or WORKINPROGRESS catalogs, the price file is expected in the offer price file format.
Updating the base price of DRAFT or WORKINPROGRESS catalogs is not supported by the Catalog side, so the base price change will not take effect.
If a folder contains nested folders, all the valid templates within the folders will be uploaded with the catalog name as the source path.
The Parameters JSON and metadata JSON will be identified and skipped during Bring Your Own Catalog Item of the templates.