
Users/hc/gnntest#20

Open
nicelhc13 wants to merge 51 commits into master from users/hc/gnntest

Conversation

@nicelhc13

CI/CD tests were broken due to this PR.

@nicelhc13
Author

@patrickkenney9801 I put a log of a file stat in CuSP (please search for "File size: XX" in the log). On my end, tester.csgr was 136, but this CI/CD test printed 128, which differs from my end. It then failed in OfflineGraph's initializer list, which sets up the ifstream and ofstream. So my current hypothesis is that, for some unknown reason, the input files on the server running the CI/CD test were broken. I am not 100% sure, but the log looks like it. If you can easily access the machine, could you please run ls -alh on the input graph path?
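For context, the kind of pre-flight check described above can be sketched as follows. This is a hypothetical diagnostic (an assumption, not the actual CuSP or OfflineGraph code): stat the input and log its size before handing it to stream constructors, so a truncated input fails with a clear message instead of deep inside an initializer list. Requires C++17 for `std::filesystem`.

```cpp
// Hypothetical sketch, not Galois code: log "File size: XX" for an input
// graph before opening it, mirroring the log line mentioned above.
#include <filesystem>
#include <fstream>
#include <iostream>
#include <string>

bool check_input_file(const std::string& path) {
  std::error_code ec;
  const auto size = std::filesystem::file_size(path, ec);
  if (ec) {
    // stat failed: missing file, bad permissions, etc.
    std::cerr << "Cannot stat " << path << ": " << ec.message() << "\n";
    return false;
  }
  std::cout << "File size: " << size << "\n";
  // Only construct the stream once the size looks sane to the caller.
  std::ifstream in(path, std::ios::binary);
  return in.good();
}
```

A caller could compare the logged size against the expected on-disk size (136 in the example above) before letting graph construction proceed.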

@patrickkenney9801


I can only check the file paths while the CI job is running; I'll check later today.

@nicelhc13
Author

FYI, I confirmed that the path exists, if that is what you meant.

@patrickkenney9801


To follow up on this: CI has been extremely bloated lately, so maybe this weekend I will be able to attempt this check.

@nicelhc13
Author

Lemme know when you work on it. One thing I can try is adding new input graph files not through LFS and checking whether they work. At this point I am 90% sure this issue is related to the files.
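One quick way to test the LFS hypothesis (a sketch under my own assumptions, not anything in the Galois tree): a Git LFS pointer that was never smudged is a tiny text file, roughly 130 bytes, whose first line starts with "version https://git-lfs". A check like this would flag a CI checkout that served pointers instead of the real graphs.

```cpp
// Hypothetical diagnostic: detect an unsmudged Git LFS pointer file.
// Pointer files begin with "version https://git-lfs..." per the LFS spec.
#include <fstream>
#include <string>

bool looks_like_lfs_pointer(const std::string& path) {
  std::ifstream in(path, std::ios::binary);
  std::string first_line;
  if (!std::getline(in, first_line)) {
    return false;  // unreadable or empty file
  }
  // True only if the first line starts with the LFS pointer signature.
  return first_line.rfind("version https://git-lfs", 0) == 0;
}
```

Running this against the input graph path on the CI machine would distinguish "file is corrupt" from "file is an LFS pointer that was never fetched".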

@patrickkenney9801


I kicked it off and it worked just fine, the second time it had a hang in the tests: https://github.com/utcs-scea/Galois/actions/runs/8532556268/job/23566845847?pr=20

Kicking it off a third

@patrickkenney9801


Honestly I think that galois just hangs sometimes and that's a fact of life

@tewaro

tewaro commented Apr 11, 2024

@patrickkenney9801 @nicelhc13 Where are we with these tests?

@patrickkenney9801


I say we merge it and accept some flakes
