This afternoon was mainly spent in the lab getting familiar with data collection for the tunnel-smoke experiment, along with some Python files for video-to-frame extraction and image differencing.
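As a reminder of the core idea behind those scripts, here is a minimal sketch of image (frame) differencing in pure Python. The helper name and the toy frames are my own; the lab scripts presumably operate on real video frames with OpenCV or NumPy instead.

```python
def frame_diff(prev, curr, threshold=25):
    """Per-pixel absolute difference between two grayscale frames
    (lists of rows of 0-255 ints); pixels whose change exceeds
    `threshold` become 255 (motion/smoke), the rest become 0."""
    return [
        [255 if abs(p - c) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev, curr)
    ]

# Two tiny 2x3 "frames": a bright change appears in the top-right pixel.
frame_a = [[10, 10, 10], [10, 10, 10]]
frame_b = [[10, 10, 200], [10, 10, 10]]

mask = frame_diff(frame_a, frame_b)
print(mask)  # → [[0, 0, 255], [0, 0, 0]]
```

The threshold suppresses sensor noise; only pixels that changed substantially between consecutive frames survive into the mask.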
The LeHome rules have been published: https://lehome-challenge.com/simulation-challenge
Set up the environment following the linked GitHub repo.
==uv, uv.lock and pyproject.toml== can replace most of what conda, pip and requirements.txt do, and in some respects do it better.
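A rough mapping from the conda/pip workflow to uv, as I understand it (a sketch; check `uv help` for the exact flags on your version):

```shell
# conda create -n env python=3.10  →  uv venv --python 3.10
# pip install -r requirements.txt →  uv pip install -r requirements.txt
# pip install <pkg>               →  uv add <pkg>   (also records it in pyproject.toml)
# pip freeze > requirements.txt   →  uv lock        (writes uv.lock from pyproject.toml)
uv --version  # confirm uv is on PATH
```

The key difference is that uv.lock pins the full dependency graph reproducibly, which a hand-maintained requirements.txt does not.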
Basic environment
1. Clone the Repository

```shell
git clone https://github.com/lehome-official/lehome-challenge.git
cd lehome-challenge
```

2. Install Dependencies with uv

```shell
uv sync
```

This will create a virtual environment and install all required dependencies.
This step takes a very long time; make sure you have plenty of free disk space.
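Before running `uv sync`, it may help to check free space and, if the system disk is tight, point uv's wheel cache at a larger disk via the documented `UV_CACHE_DIR` environment variable (the path below is a placeholder):

```shell
# Check free space on the current filesystem
df -h .

# uv caches downloaded wheels under ~/.cache/uv by default;
# redirect the cache to a bigger disk if needed (hypothetical path)
export UV_CACHE_DIR=/data/uv-cache
```

Simulation stacks like Isaac Sim ship very large wheels, so both the cache and the virtual environment can easily run to tens of gigabytes.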
3. Clone and Configure IsaacLab

```shell
cd third_party
git clone https://github.com/lehome-official/IsaacLab.git
cd ..
```

Third-party dependencies conventionally live in third_party, i.e. a directory for third-party plugins.
4. Install IsaacLab

Activate the virtual environment and install IsaacLab:

```shell
source .venv/bin/activate
./third_party/IsaacLab/isaaclab.sh -i none
```

5. Install LeHome Package
Finally, install the LeHome package in development mode:

```shell
uv pip install -e ./source/lehome
```

Assets and datasets
Download the required simulation assets (scenes, objects, robots) from HuggingFace:

```shell
# This creates the Assets/ directory with all required simulation resources
hf download lehome/asset_challenge --repo-type dataset --local-dir Assets
```

Download Example Dataset
We provide demonstrations for four types of garments. Download from HuggingFace:

```shell
hf download lehome/dataset_challenge_merged --repo-type dataset --local-dir Datasets/example
```

If you need depth information or per-garment individual data, download from HuggingFace:

```shell
hf download lehome/dataset_challenge --repo-type dataset --local-dir Datasets/example
```

That takes me up to the data-collection part: the first two chapters cover collecting with the leader arm, and the later ones cover collecting with the keyboard, giving some CLI commands and their parameters, but keyboard collection doesn't seem very practical to me.
Next I'll look at how to deploy this on ebcloud, and then sort out port forwarding and so on.
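For the port-forwarding part, the usual pattern with a cloud host is SSH local forwarding. A hedged sketch (host, user and ports below are placeholders, not ebcloud specifics):

```shell
# Forward local port 8888 to port 8888 on the remote machine;
# anything you open at http://localhost:8888 is tunneled over SSH.
ssh -L 8888:localhost:8888 user@remote-host
```

This is handy when the simulator or a web UI only listens on the remote machine's loopback interface and isn't exposed publicly.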