
New directory for third party tools of interest for GrimoireLab #230

Merged: 3 commits merged into chaoss:master on Dec 16, 2019

Conversation

@jgbarah (Contributor) commented on Oct 30, 2019:

For now, it includes Dockerfiles and other related files to produce FOSSology Debian packages via a Docker container, and a Docker container like grimoirelab/full with nomos also installed.

Signed-off-by: Jesus M. Gonzalez-Barahona <[email protected]>
@valeriocos self-requested a review on December 1, 2019.
@valeriocos (Member) left a comment:

I have tested the PR and suggested some changes. After the changes, it was possible to produce colic and cocom data. Please find below the configuration files, the docker command and some screenshots of the result.

projects.json

{
  "grimoirelab": {
    "meta": {
      "title": "GrimoireLab"
    },
    "git": [
      "https://github.com/chaoss/grimoirelab-toolkit"
    ],
    "cocom": [
      "https://github.com/chaoss/grimoirelab-toolkit"
    ],
    "colic": [
      "https://github.com/chaoss/grimoirelab-toolkit"
    ],
    "github": [
      "https://github.com/chaoss/grimoirelab-toolkit"
    ]
  }
}

credentials.cfg

[github]
api-token = ccf50...

dashboard.cfg

[general]
# Update incrementally, forever
update = true
# Don't start a new update earlier than (since last update, seconds)
min_update_delay = 300
# Produce debugging data for the logs
debug = true

[es_enrichment]
# Refresh identities and projects for all items after enrichment
autorefresh = true

[sortinghat]
# Run affiliation
affiliate = True
# How to match to unify
matching = [email]
# How long to sleep before running again, for identities tasks
sleep_for = 100

[panels]
# Dashboard: default time frame
kibiter_time_from = "now-1y"
# Dashboard: default index pattern
kibiter_default_index = "git"
# GitHub repos panels
code-complexity = true
code-license = true

[phases]
collection = true
identities = true
enrichment = true
panels = true

[git]
# Names for raw and enriched indexes
raw_index = git_grimoirelab-raw
enriched_index = git_grimoirelab
studies = [enrich_demography:git, enrich_areas_of_code:git, enrich_onion:git]

[github]
# Names for raw and enriched indexes
raw_index = github_grimoirelab-raw
enriched_index = github_grimoirelab
# Sleep if the GitHub API rate limit is exhausted; wait until it is recovered
sleep-for-rate = true

[cocom]
raw_index = cocom_chaoss
enriched_index = cocom_chaoss_enrich
category = code_complexity_lizard_file
studies = [enrich_cocom_analysis]
branches = master
git-path = /tmp/git-cocom
worktree-path = /tmp/cocom/

[enrich_cocom_analysis]
out_index = cocom_chaoss_study
interval_months = [3]

[colic]
raw_index = colic_chaoss
enriched_index = colic_chaoss_enrich
category = code_license_nomos
studies = [enrich_colic_analysis]
exec-path = /usr/share/fossology/nomos/agent/nomossa
branches = master
git-path = /tmp/git-colic
worktree-path = /tmp/colic

[enrich_colic_analysis]
out_index = colic_chaoss_study
interval_months = [6]

[enrich_demography:git]
#no_incremental = true   # default: false

[enrich_areas_of_code:git]
in_index = git_grimoirelab-raw
out_index = git_aoc_grimoirelab-enriched
#sort_on_field = metadata__timestamp
#no_incremental = false

[enrich_onion:git]
in_index = git_grimoirelab
out_index = git_onion_grimoirelab-enriched
#data_source = git
#contribs_field = hash
#timeframe_field = grimoire_creation_date
#sort_on_field = metadata__timestamp
#no_incremental = false

docker command

docker run -p 5601:5601 -p 9000:9200 -v $(pwd)/projects.json:/projects.json -v $(pwd)/dashboard.cfg:/dashboard.cfg -v $(pwd)/credentials.cfg:/override.cfg -t grimoirelab/full-nomos

(screenshots: captura_154, captura_155)


@valeriocos (Member) commented:
Nomos needs to be set up by running the conf script pgsql-conf-fix.sh, located in /usr/share/fossology/setup.

We could also add other packages needed by Graal; a possible modification could be:

@@ -15,12 +15,25 @@ ADD build/fossology-nomos_3.6.0-1_amd64.deb /tmp
 RUN sudo apt-get update && \
     sudo apt-get -y install /tmp/fossology-common_3.6.0-1_amd64.deb \
        /tmp/fossology-nomos_3.6.0-1_amd64.deb \
+        python3 \
+        python3-pip \
+        cloc \
         && \
     sudo apt-get clean && \
     sudo find /var/lib/apt/lists -type f -delete && \
     sudo rm /tmp/fossology-common_3.6.0-1_amd64.deb \
        /tmp/fossology-nomos_3.6.0-1_amd64.deb
 
+RUN pip3 install setuptools
+RUN pip3 install lizard>=1.16.3
+RUN pip3 install pylint>=1.8.4
+RUN pip3 install flake8>=3.7.7
+RUN pip3 install networkx>=2.1
+RUN pip3 install pydot>=1.2.4
+RUN pip3 install bandit>=1.4.0
+
+CMD /usr/share/fossology/setup/pgsql-conf-fix.sh <--- setup command needed by Nomos
+
 # Entrypoint
 ENTRYPOINT [ "/entrypoint.sh" ]

@jgbarah (Contributor, Author) commented:

I think Python3 is already in the container (it is used to run GrimoireLab tools). Same for setuptools, I think. Why are you proposing several different pip install commands? Can't we just install the main ones, and let the rest come in as dependencies? Some of them seem to be related to testing, not to the running packages (e.g. pylint or flake8): do we really need them in the container?

@valeriocos (Member) commented on Dec 4, 2019:

> I think Python3 is already in the container (it is used to run GrimoireLab tools). Same for setuptools, I think.

OK

> Why are you proposing several different pip install commands? Can't we just install the main ones, and let the rest come in as dependencies?

Yes, you are right, it's better to get them as dependencies of graal.

> Some of them seem to be related to testing, not to the running packages (e.g. pylint or flake8): do we really need them in the container?

Pylint and flake8 are used by some graal analyzers (https://github.com/chaoss/grimoirelab-graal/tree/master/graal/backends/core/analyzers), but they can be installed as deps of graal.

@jgbarah (Contributor, Author) commented:

OK. Then, do I understand correctly that, from your diff, only the following two lines are needed?

+        cloc \
...
+CMD /usr/share/fossology/setup/pgsql-conf-fix.sh 

For the second one, do we really need it, even though we're using MySQL and not PostgreSQL?

@valeriocos (Member) commented:

I tried the PR again, from a fresh env. I included only cloc and everything worked smoothly.

Would it be possible to change the name of the image to something not related to nomos (maybe in the future there will be more things inside), for instance full-thirdparties? WDYT?

@svdo commented on Dec 2, 2019:

Nice progress, folks! Can I try this myself? The full-nomos image is not on Docker Hub yet, apparently.

@valeriocos (Member) commented on Dec 2, 2019:

Sure @svdo! To create the image, please follow the info at third-party/README.md. Don't hesitate to write if something isn't clear, thanks!
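For reference, a minimal sketch of the build step (the authoritative instructions are in third-party/README.md; the Dockerfile name is taken from later in this thread, and the tag matches the image used above):

# Hypothetical build invocation, run from the repo root
cd third-party
docker build -f Dockerfile-grimoirelab-nomos -t grimoirelab/full-nomos .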

@svdo commented on Dec 2, 2019:

I got the build to work and created the Docker image. When running it, I encounter some issues though.

The first is that I get errors about cloc:

2019-12-02 09:52:52,807 - grimoire_elk.elk - ERROR - Error feeding raw from cocom (https://<repo-url>): [Errno 2] No such file or directory: 'cloc'
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/grimoire_elk/elk.py", line 228, in feed_backend
    ocean_backend.feed(**params)
  File "/usr/local/lib/python3.5/dist-packages/grimoire_elk/raw/elastic.py", line 230, in feed
    self.feed_items(items)
  File "/usr/local/lib/python3.5/dist-packages/grimoire_elk/raw/elastic.py", line 246, in feed_items
    for item in items:
  File "/usr/local/lib/python3.5/dist-packages/perceval/backend.py", line 215, in fetch
    for item in self.fetch_items(category, **kwargs):
  File "/usr/local/lib/python3.5/dist-packages/graal/graal.py", line 182, in fetch_items
    raise e
  File "/usr/local/lib/python3.5/dist-packages/graal/graal.py", line 175, in fetch_items
    commit['analysis'] = self._analyze(commit)
  File "/usr/local/lib/python3.5/dist-packages/graal/backends/core/cocom.py", line 188, in _analyze
    file_info = self.analyzer.analyze(local_path)
  File "/usr/local/lib/python3.5/dist-packages/graal/backends/core/cocom.py", line 241, in analyze
    cloc_analysis = self.cloc.analyze(**kwargs)
  File "/usr/local/lib/python3.5/dist-packages/graal/backends/core/analyzers/cloc.py", line 121, in analyze
    message = subprocess.check_output(cloc_command).decode("utf-8")
  File "/usr/lib/python3.5/subprocess.py", line 316, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 383, in run
    with Popen(*popenargs, **kwargs) as process:
  File "/usr/lib/python3.5/subprocess.py", line 676, in __init__
    restore_signals, start_new_session)
  File "/usr/lib/python3.5/subprocess.py", line 1282, in _execute_child
    raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: 'cloc'

The second is that Elasticsearch seems to die after a while. I'm not sure if it's related to the first issue, and I didn't find more info about what could be causing it...

@valeriocos (Member) commented:
Did you modify the third-party/Dockerfile-grimoirelab-nomos with the suggested changes at #230 (comment)?

@svdo commented on Dec 2, 2019:

I did now 😉 That solves the issue about cloc, but Elasticsearch is still giving up on me. I don't have time to look into it atm, maybe later. Thanks for your assistance again!

@valeriocos (Member) commented:
You're welcome, @svdo! WRT the second issue, maybe it's just a matter of giving more resources to Elasticsearch: sudo sysctl -w vm.max_map_count=262144 (at system level)
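A sketch for making that setting survive reboots, assuming a Linux host (on Mac the value has to be applied inside the Docker VM instead):

# Apply the setting now, then persist it across reboots
sudo sysctl -w vm.max_map_count=262144
echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf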

If you can, please share the logs of Elasticsearch, thanks

@svdo commented on Dec 2, 2019:

That seems to work (or more specifically the Mac version of that: https://stackoverflow.com/a/41251595). It's indexing...

@valeriocos (Member) commented:

Thanks for the feedback @svdo! I should have pointed you to https://github.com/chaoss/grimoirelab/tree/master/docker-compose#mac, sorry.

If you want to check the evolution of your indexes, you can use:
<elasticsearch_url>/_cat/indices?pretty
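For example, assuming the port mapping from the docker command earlier in this thread (host port 9000 mapped to the container's 9200):

# List the indexes and their document counts
curl "http://localhost:9000/_cat/indices?pretty"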

@svdo commented on Dec 2, 2019:

Hmm, it failed again. Tail of /var/log/elasticsearch/elasticsearch.log in the docker:

[2019-12-02T16:53:41,151][DEBUG][o.e.a.s.TransportSearchAction] [0xUhiia] [gitlab_merge_requests-raw][4], node[0xUhiiaKRVmYTAXpQJOQKQ], [P], s[STARTED], a[id=C5nrVp88Rk2louijMqEfsg]: Failed to execute [SearchRequest{searchType=QUERY_THEN_FETCH, indices=[gitlab_merge_requests-raw], indicesOptions=IndicesOptions[id=38, ignore_unavailable=false, allow_no_indices=true, expand_wildcards_open=true, expand_wildcards_closed=false, allow_aliases_to_multiple_indices=true, forbid_closed_indices=true, ignore_aliases=false], types=[], routing='null', preference='null', requestCache=null, scroll=Scroll{keepAlive=10m}, maxConcurrentShardRequests=5, batchedReduceSize=512, preFilterShardSize=128, source={"size":100,"query":{"bool":{"filter":[{"term":{"origin":{"value":"https://<repository-url>","boost":1.0}}}],"adjust_pure_negative":true,"boost":1.0}},"sort":[{"metadata__timestamp":{"order":"asc"}}]}}]
org.elasticsearch.transport.RemoteTransportException: [0xUhiia][127.0.0.1:9300][indices:data/read/search[phase/query]]
Caused by: org.elasticsearch.index.query.QueryShardException: No mapping found for [metadata__timestamp] in order to sort on
	at org.elasticsearch.search.sort.FieldSortBuilder.build(FieldSortBuilder.java:319) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.sort.SortBuilder.buildSort(SortBuilder.java:155) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.parseSource(SearchService.java:718) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.createContext(SearchService.java:552) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:528) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:324) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService$2.onResponse(SearchService.java:310) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService$2.onResponse(SearchService.java:306) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService$3.doRun(SearchService.java:996) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:637) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:41) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.1.4.jar:6.1.4]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_232]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_232]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
[2019-12-02T16:53:41,153][DEBUG][o.e.a.s.TransportSearchAction] [0xUhiia] All shards failed for phase: [query]
org.elasticsearch.index.query.QueryShardException: No mapping found for [metadata__timestamp] in order to sort on
	at org.elasticsearch.search.sort.FieldSortBuilder.build(FieldSortBuilder.java:319) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.sort.SortBuilder.buildSort(SortBuilder.java:155) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.parseSource(SearchService.java:718) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.createContext(SearchService.java:552) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:528) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:324) ~[elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService$2.onResponse(SearchService.java:310) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService$2.onResponse(SearchService.java:306) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.search.SearchService$3.doRun(SearchService.java:996) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:637) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:41) [elasticsearch-6.1.4.jar:6.1.4]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.1.4.jar:6.1.4]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_232]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_232]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]

@valeriocos (Member) commented:

> No mapping found for [metadata__timestamp] in order to sort on

I'm not totally sure, but this error should pop up either:

  • (i) during the enrichment phase for git when the raw index is empty
  • (ii) during the study phase for git when the enriched index is empty

If this is the case, you should delete the folders ~/.perceval and ~/.graal and restart. These folders are used by perceval and graal to store mirrors of the git repos analyzed.
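That is, something like (this deletes the cached git mirrors; they will be re-cloned on the next run):

rm -rf ~/.perceval ~/.graal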

If it's possible, could you share your projects.json and dashboard.cfg/setup.cfg? I'll try to replicate your issue tomorrow.

@svdo commented on Dec 3, 2019:

OK, I'm a step further now. I think giving more memory to Docker (4GB instead of 2GB) seems to make Elasticsearch happier. I have taken a small project and indexed it. The colic and cocom related indexes are filled.

Problem is now that I don't see any dashboards about this. The top navigation bar does have the "Code Complexity" and "Code License" entries, but the submenus of both are empty. Any ideas?

Projects.json

{
  "Fresco": {
    "git": ["https://github.com/philips-software/fresco-logistic-regression-2"],
    "github": ["https://github.com/philips-software/fresco-logistic-regression-2"],
    "colic": ["https://github.com/philips-software/fresco-logistic-regression-2"],
    "cocom": ["https://github.com/philips-software/fresco-logistic-regression-2"]
  }
}

Config

[general]
short_name = Grimoire
update = true
# Don't start update earlier than elapsed seconds since start of last
min_update_delay = 300
debug = true
# Number of items per bulk request to Elasticsearch
bulk_size = 100
# Number of items to get from Elasticsearch when scrolling
scroll_size = 100

[es_collection]
url = http://localhost:9200

[es_enrichment]
# Refresh identities and projects for all items after enrichment
autorefresh = true

### [sortinghat]
### # Run affilation
### affiliate = True
### # How to match to unify
### matching = [email]
### # How long to sleep before running again, for identities tasks
### sleep_for = 100

[panels]
gitlab-issues = true
gitlab-merges = true
# Dashboard: default time frame
kibiter_time_from = "now-1y"
# Dashboard: default index pattern
kibiter_default_index = "git"
# GitHub repos panels
code-complexity = true
code-license = true

[phases]
collection = true
identities = true
enrichment = true
panels = true

[git]
# Names for raw and enriched indexes
raw_index = git_grimoirelab-raw
enriched_index = git_grimoirelab
studies = [enrich_demography:git, enrich_areas_of_code:git, enrich_onion:git]

[github]
# Names for raw and enriched indexes
raw_index = github_grimoirelab-raw
enriched_index = github_grimoirelab
# Sleep if the GitHub API rate limit is exhausted; wait until it is recovered
sleep-for-rate = true

[cocom]
raw_index = cocom_chaoss
enriched_index = cocom_chaoss_enrich
category = code_complexity_lizard_file
studies = [enrich_cocom_analysis]
branches = master
git-path = /tmp/git-cocom
worktree-path = /tmp/cocom/

[enrich_cocom_analysis]
out_index = cocom_chaoss_study
interval_months = [3]

[colic]
raw_index = colic_chaoss
enriched_index = colic_chaoss_enrich
category = code_license_nomos
studies = [enrich_colic_analysis]
exec-path = /usr/share/fossology/nomos/agent/nomossa
branches = master
git-path = /tmp/git-colic
worktree-path = /tmp/colic

[enrich_colic_analysis]
out_index = colic_chaoss_study
interval_months = [6]

[enrich_demography:git]
#no_incremental = true   # default: false

[enrich_areas_of_code:git]
in_index = git_grimoirelab-raw
out_index = git_aoc_grimoirelab-enriched
#sort_on_field = metadata__timestamp
#no_incremental = false

[enrich_onion:git]
in_index = git_grimoirelab
out_index = git_onion_grimoirelab-enriched
#data_source = git
#contribs_field = hash
#timeframe_field = grimoire_creation_date
#sort_on_field = metadata__timestamp
#no_incremental = false

@valeriocos (Member) commented:

> OK, I'm a step further now. I think giving more memory to Docker (4GB instead of 2GB) seems to make Elasticsearch happier. I have taken a small project and indexed it. The colic and cocom related indexes are filled.

Great!

> Problem is now that I don't see any dashboards about this. The top navigation bar does have the "Code Complexity" and "Code License" entries, but the submenus of both are empty. Any ideas?

This is a bug (I guess); it will be fixed in the next release. Please click on the Dashboard button and search for colic (or cocom).

(screenshot: captura_156)

Thanks!

@svdo commented on Dec 3, 2019:

Cool, that works. A couple of things stand out to me:

  • We don't have to put license and copyright info in every file anymore. That makes the license analysis not so useful at this level of detail, I guess. For us it's more like "does it have a LICENSE file" and, if yes, what is the license type.
  • Is it also possible to specify the intervals in weeks/days instead of just months? Especially the delta on code complexity would be interesting on a week-by-week basis.

@valeriocos (Member) commented:

Thank you for your feedback @svdo

> We don't have to put license and copyright info in every file anymore. That makes the license analysis not so useful at this level of detail, I guess. For us it's more like "does it have a LICENSE file" and, if yes, what is the license type.

Just to make sure I got it right, you would like to know if a repository has a license file and if so, which license type, right?

> Is it also possible to specify the intervals in weeks/days instead of just months? Especially the delta on code complexity would be interesting on a week-by-week basis.

This requires some changes in the code, but it shouldn't be difficult. Can you open an issue in ELK about this feature request?

@svdo commented on Dec 5, 2019:

> Just to make sure I got it right, you would like to know if a repository has a license file and if so, which license type, right?

Exactly. Something else that is probably a lot more complicated but also very valuable would be all licenses of dependencies that are used, either directly or indirectly... This is a big thing for enterprises in terms of open source license compliance.

> This requires some changes in the code, but it shouldn't be difficult. Can you open an issue in ELK about this feature request?

Done!

@valeriocos (Member) commented:

> Exactly. Something else that is probably a lot more complicated but also very valuable would be all licenses of dependencies that are used, either directly or indirectly... This is a big thing for enterprises in terms of open source license compliance.

That sounds really interesting. Can you open an issue in graal repository about it?

> Done!

Thanks

@svdo commented on Dec 9, 2019:

> That sounds really interesting. Can you open an issue in graal repository about it?

Done and done!

@valeriocos (Member) commented:

Thanks @svdo!

valeriocos and others added 2 commits December 13, 2019 15:22
[third-party] Rename image and add cloc dep
@valeriocos (Member) left a comment:
LGTM, thanks @jgbarah

@valeriocos merged commit 1398422 into chaoss:master on Dec 16, 2019.
valeriocos added a commit to valeriocos/grimoirelab that referenced this pull request Dec 16, 2019
This code showcases the 3p image via an example, which
has been developed at chaoss#230

Signed-off-by: Valerio Cosentino <[email protected]>
valeriocos added a commit that referenced this pull request Dec 16, 2019
This code showcases the 3p image via an example, which
has been developed at #230

Signed-off-by: Valerio Cosentino <[email protected]>