=== kuraudo1 is now known as kuraudo
[15:35] holmanb: i think a restart is necessary. or at the very least we can restart the config and final modes?
=== dbungert1 is now known as dbungert
[18:13] connor_k: yeah maybe - I didn't bother to ask why you are using vendor-data
[18:14] faiqus: shouldn't be
[18:15] faiqus: are you going to support the code that you're using?
[18:16] holmanb: yeah it's going to be supported by CAPA maintainers/open source community
[18:16] are you a CAPA maintainer?
[18:17] reviewer
[18:17] ah
[18:18] do you think this approach makes any sense? throwing out/ignoring the idea of restarting it or not.
[18:19] faiqus: I mean, I wrote the code. What do you think I'm going to say :P
[18:20] faiqus: restarting is _really_ undesirable for a number of reasons
[18:20] faiqus: I just don't have the ability to test it
[18:21] haha, no i meant having this two-part approach where the user-data starts off as a script that fetches some code, and that replaces the old user-data
[18:21] ahhh, I see
[18:21] uhh
[18:22] that's the part that is confusing me. i think this stuff needs to be executed as something else and not necessarily as user-data? Maybe a boothook. I'm not sure though.
[18:23] it would be much simpler to just write the whole datasource in Python - this approach was just a hack to prove that it is possible to do without the restart
[18:23] and the person I was working with tested it
[18:23] and it apparently worked - maybe something changed or this is an old version
[18:24] an old version of your data source?
[18:24] yeah, idk
[18:25] I don't know if I ever put it in version control
[18:25] maybe - i tried the same code richard had and didn't have any luck before experimenting on my own.
[18:26] your code was in version control - let me find it
[18:26] faiqus: I can walk you through some ideas or give you some changes to gather more info to debug the issue
[18:26] but I'm a bit busy at the moment, I probably will not have time to dig into it today
[18:26] https://github.com/canonical/cloud-init/commit/f2796cd8260b8f3f463aecdd19feb6524182aaf3
[18:26] -ubottu:#cloud-init- Commit f2796cd in canonical/cloud-init "feat: add POC datasource for Ec2 / Kubernetes"
[18:27] no sweat. thanks for your support. maybe i can make the whole thing work in python. will credentials for AWS be present at datasource time?
[18:28] faiqus: apparently, yes - the script that gets pulled down has them
[18:28] faiqus: does the CAPA project have the ability to modify that script?
[18:28] sure does
[18:29] if you have an idea for a direction you want to go in please let me know and i can explore it. thanks again for all the guidance - you're helping us get out of a crazy hole
[18:30] faiqus: if you can modify what gets exposed by the IMDS server, what I'd suggest is to put the credentials that are exposed in that script into a configuration file -> json / yaml / whatever
[18:32] and have the datasource read from that source to fetch the "real" cloud-init user data?
[18:33] cloud-init uses the python requests library
[18:33] just query the IMDS, grab the configuration file (which has the credentials and whatever else you need), convert to a dict (json.loads() / yaml.safe_load() / whatever) and then implement the rest of that bash script in python
[18:34] faiqus: yeah, basically
[18:35] ok that sounds good.
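The suggestion above (query the IMDS, grab the config file, convert it to a dict, then do the rest in Python) might be sketched roughly as follows. This is a hedged illustration, not the actual CAPA datasource: the function names and the idea of a JSON config blob are assumptions, and a real datasource would likely use cloud-init's own URL helpers rather than calling requests directly.

```python
import json


def parse_capa_config(raw: str) -> dict:
    """Turn the blob served by the IMDS into a dict.

    JSON is assumed here; swap in yaml.safe_load() if the CAPA
    script ends up exposing YAML instead.
    """
    return json.loads(raw)


def fetch_capa_config(url: str) -> dict:
    """Pull the config (credentials etc.) from a hypothetical IMDS URL.

    cloud-init already depends on requests, so importing it here is
    reasonable; a production datasource would more likely go through
    cloud-init's url_helper with retries.
    """
    import requests

    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return parse_capa_config(resp.text)
```

From there, the remainder of the bash script (using those credentials to fetch the "real" user data) would be reimplemented as ordinary Python in the datasource's fetch path.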
i think all of these instances use some sort of instance principal authentication so maybe those credentials will simply be somewhere
[18:36] calling LOG.info() sends logs to /var/log/cloud-init.log by default (if you don't have a broken logging config)
[18:36] and there is a helper called log_util.multi_log() if you need stuff to go to the console too (which ends up in /var/log/cloud-init-output.log)
[18:37] > thanks again for all the guidance - you're helping us get out of a crazy hole
[18:37] happy to help
[18:38] like I said, I don't have a ton of time to contribute but I'd like to see it get resolved and cloud-init used in a way that is more sustainable
[18:41] faiqus: one more thing
[18:42] the semantics of the datasource file are non-obvious, but the tl;dr is that when your code runs is defined by the datasources = [...] list
[18:44] so if you have e.g.
[18:44] datasources = [(DataSourceFooLocal, (sources.DEP_FILESYSTEM,)), (DataSourceFooNetwork, (sources.DEP_FILESYSTEM, sources.DEP_NETWORK))]
[18:44] then DataSourceFooLocal will be used during the "local" stage (cloud-init-local.service)
[18:44] and DataSourceFooNetwork will be used during the "network" stage (cloud-init-network.service)
[18:46] faiqus: network isn't guaranteed to be available during the local stage, but on some platforms a dhcp client is used to bring up a temporary network to get the configuration
[18:46] for more reading: https://docs.cloud-init.io/en/latest/explanation/boot.html
[20:07] Hello! I'm having a problem with cloud-init 22.4.2 on a Debian AArch64 system. I'm pulling my configs from a self-hosted NoCloud provider, both user-data and vendor-data as multi-part MIME. Both configs have files with write_files entries and I've included the merge_how hack in them. However the write_files entries in my vendor-data files are being overridden. How do I fix this?
[20:07] Any help would be greatly appreciated.
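The datasources = [...] semantics discussed earlier (around 18:44) can be sketched without pulling in cloud-init itself. In a real datasource module the DEP_* constants and the list-filtering helper come from cloudinit.sources (sources.list_from_depends()); the stand-ins below only approximate that matching logic for illustration, and the CAPA class names are hypothetical.

```python
# Stand-ins for cloudinit.sources.DEP_FILESYSTEM / DEP_NETWORK.
DEP_FILESYSTEM = "FILESYSTEM"
DEP_NETWORK = "NETWORK"


class DataSourceCAPALocal:
    """Would subclass cloudinit.sources.DataSource; runs in the
    'local' stage (cloud-init-local.service), before networking."""


class DataSourceCAPANetwork:
    """Runs in the 'network' stage (cloud-init-network.service),
    once networking is up."""


# Which class runs in which boot stage is defined by the deps tuple
# paired with each class.
datasources = [
    (DataSourceCAPALocal, (DEP_FILESYSTEM,)),
    (DataSourceCAPANetwork, (DEP_FILESYSTEM, DEP_NETWORK)),
]


def get_datasource_list(depends):
    """Approximation of sources.list_from_depends(): return the
    classes whose declared deps match what the current stage offers."""
    return [cls for cls, deps in datasources if set(deps) == set(depends)]
```

So during the local stage cloud-init asks with only the filesystem dep available and gets DataSourceCAPALocal; during the network stage it asks with both deps and gets DataSourceCAPANetwork.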
[20:09] dean: "NoCloud provider", "multi-part MIME" - not sure exactly what you mean
[20:09] are you using an ISO/filesystem to provide the config or HTTP/HTTPS?
[20:09] I'm using an HTTP data source.
[20:09] so then each of the configs is pulled separately
[20:09] yes
[20:09] so I don't see where the multi-part MIME comes in
[20:10] if they're separately pulled then there is no multi-part
[20:11] The user and vendor data configs are compiled from sets of #cloud-config files into multi-part archives.
[20:12] https://cloudinit.readthedocs.io/en/latest/explanation/format.html#mime-multi-part-archive
[20:14] ok, haven't used that myself. So being overridden by what?
[20:17] The write_files entries in the user-data files are being kept. The write_files entries in the vendor-data files are being dropped.
[20:17] In theory, the merge_how hack is supposed to fix that.
[20:19] not familiar with the merge_how hack; its behaviour might differ between cloud-init versions. 22.4.2 is not exactly recent - is there no more recent c-i version available to use?
[20:20] Unfortunately no.
[20:20] I'd suggest you open a GitHub issue and provide logs etc
[20:20] Alright.
[21:53] writing a datasource that uses aws APIs seems... strange. does anyone have examples of datasources that reach out to services on the local cloud provider? i dont really know how im supposed to import the aws sdk either
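For context on the "merge_how hack" discussed above: the commonly circulated form embeds a merge directive in each #cloud-config part so that list keys like write_files are appended rather than replaced when parts are merged. A rough sketch, based on the merge-directive format in the cloud-init docs (whether it applies across the user-data/vendor-data boundary, and its exact behaviour on 22.4.2, is not something this log confirms):

```yaml
#cloud-config
merge_how:
  - name: list
    settings: [append]
  - name: dict
    settings: [no_replace, recurse_list]
write_files:
  - path: /tmp/example.txt
    content: "hypothetical example entry"
```

Note that vendor-data is merged with lower precedence than user-data by design, which may be why the vendor-data write_files entries lose out regardless of the directive.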