
LLaMA-Adapter for multimodal

1. Clone the repo

git clone https://github.com/OpenGVLab/LLaMA-Adapter.git
cd LLaMA-Adapter/llama_adapter_v2_multimodal7b/
conda create -n llama_adapter_v2 python=3.8 -y
conda activate llama_adapter_v2
pip install -r requirements.txt
apt update && apt install -y libsm6 libxext6
pip install opencv-python

mkdir -p models/7B
# e.g. /home/jovyan/LLaMA-Adapter/llama_adapter_v2_multimodal7b/models/7B
# or   /home/pi/workspace/LLaMA-Adapter/llama_adapter_v2_multimodal7b/models/7B

2. Download the original LLaMA-7B model

2.0 for LLaMA v1

curl -OL https://agi.gpt4.org/llama/LLaMA/tokenizer.model
curl -OL https://agi.gpt4.org/llama/LLaMA/7B/consolidated.00.pth
curl -OL https://agi.gpt4.org/llama/LLaMA/7B/params.json

2.1 Get a token from https://huggingface.co/settings/tokens

It should look like “hf_avHIoqzdvupwMBZzhhREIMKZCUL12345”.
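Once you have the token, you can register it with huggingface_hub so that gated repos can be downloaded. A minimal sketch (the token value is the placeholder above, not a real token, and the `login` call is left commented out):

```python
# Requires: pip install huggingface_hub
hf_token = "hf_avHIoqzdvupwMBZzhhREIMKZCUL12345"  # placeholder, replace with your own

# User access tokens always start with the "hf_" prefix
assert hf_token.startswith("hf_")

# Uncomment to store the token so later downloads can use it:
# from huggingface_hub import login
# login(token=hf_token)
```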

Segment Anything

Model Card

git clone https://github.com/facebookresearch/segment-anything.git
cd segment-anything
# install
pip install -e .
# download the checkpoint
mkdir checkpoints
cd checkpoints
curl -OL https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth
cd ..

Test run

from segment_anything import SamAutomaticMaskGenerator, sam_model_registry
sam = sam_model_registry["vit_h"](checkpoint="/home/jovyan/segment-anything/checkpoints/sam_vit_h_4b8939.pth")
sam = sam.to('cuda')
mask_generator = SamAutomaticMaskGenerator(sam)

from PIL import Image
import numpy as np
image = np.asarray(Image.open('/home/jovyan/segment-anything/1.jpg'))
masks = mask_generator.generate(image)

What does a mask contain?

annotation {
    "id"                    : int,              # Annotation id
    "segmentation"          : dict,             # Mask saved in COCO RLE format.
    "bbox"                  : [x, y, w, h],     # The box around the mask, in XYWH format
    "area"                  : int,              # The area in pixels of the mask
    "predicted_iou"         : float,            # The model's own prediction of the mask's quality
    "stability_score"       : float,            # A measure of the mask's quality
    "crop_box"              : [x, y, w, h],     # The crop of the image used to generate the mask, in XYWH format
    "point_coords"          : [[x, y]],         # The point coordinates input to the model to generate the mask
}

num_objects = len(masks)  # 56
masks[0].keys()  # ['segmentation', 'area', 'bbox', 'predicted_iou', 'point_coords', 'stability_score', 'crop_box']
masks[0]['segmentation'].shape  # (512, 384), one boolean per pixel; True means the pixel belongs to this object
masks[0]['area']  # 80520, the object's size in pixels
masks[0]['bbox']  # [25, 106, 323, 325], bounding box in XYWH format
masks[0]['predicted_iou']  # 1.0196424722671509
masks[0]['point_coords']  # [[138.0, 168.0]]
masks[0]['stability_score']  # 0.9537884593009949
masks[0]['crop_box']  # [0, 0, 384, 512]
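With these fields, a common next step is ordering the masks by size before drawing them, so that small objects painted last stay visible on top of large ones. A small sketch using toy masks in the same dict format (not real SAM output):

```python
import numpy as np

h, w = 4, 4
small = np.zeros((h, w), dtype=bool)
small[0, :2] = True  # a 2-pixel object

# Toy masks in the same dict format SamAutomaticMaskGenerator returns
masks = [
    {"segmentation": small, "area": int(small.sum())},
    {"segmentation": np.ones((h, w), dtype=bool), "area": h * w},
]

# Largest objects first, so smaller masks painted later overwrite them
masks_sorted = sorted(masks, key=lambda m: m["area"], reverse=True)

# Paint each mask with its own integer id into one label image
label_img = np.zeros((h, w), dtype=np.int32)
for obj_id, m in enumerate(masks_sorted, start=1):
    label_img[m["segmentation"]] = obj_id

print(label_img[0])  # first row: the small object (id 2) on top of the big one (id 1)
```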

Run a LLaMA 30B model with llama.cpp

Model Card

https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GGML

curl -OL https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Uncensored-GGML/resolve/main/Wizard-Vicuna-30B-Uncensored.ggmlv3.q4_1.bin
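Before pulling a file this large, it is worth checking the machine can hold it. A rough back-of-the-envelope estimate, assuming the GGML q4_1 layout (blocks of 32 weights stored as 4-bit values plus two fp16 scalars, i.e. 20 bytes per 32 weights):

```python
def q4_1_size_gb(n_params: float) -> float:
    # Assumption: per 32-weight block, 32 * 4 bits of values + 2 * 16 bits (scale, min) = 20 bytes
    bytes_per_weight = 20 / 32
    return n_params * bytes_per_weight / 1024**3

print(f"{q4_1_size_gb(30e9):.1f} GB")  # about 17.5 GB for a nominal 30B model
```

The real file is somewhat larger (the “30B” LLaMA actually has roughly 32.5B parameters, and some tensors are not quantized), so budget on the order of 20 GB of RAM for CPU inference.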

Test run

Q: “how can I get to the Mars, what are the options and analyze the cost for each option”

A: “There are several ways to get to Mars, but currently, none of them are affordable for an individual. The most feasible options are:

    1. Join a space mission: Government space agencies such as NASA or Roscosmos send missions to Mars regularly. You can try to join their crew as a scientist or astronaut. However, the competition is fierce, and you would need to have the right qualifications and experience.
    2. Private space travel: Companies like SpaceX and Blue Origin are working on developing private space travel options. They offer suborbital flights for tourists and plan to launch crewed missions to Mars in the future. However, these trips are currently only available to a select few and cost millions of dollars per person.
    3. Crowdfunding: You can try crowdfunding your trip to Mars. This option requires you to raise funds from multiple donors to cover the costs of the mission. However, this option is unlikely to be successful unless you have a strong social media following or a unique and compelling reason for going to Mars.

The cost of each option varies depending on the specifics of the mission. NASA’s budget for their next Mars mission is estimated at $2.7 billion. Private space travel options may cost millions of dollars per person, making them unaffordable for most people. Crowdfunding campaigns can raise tens or hundreds of thousands of dollars, but it would be difficult to cover the full cost of a Mars mission this way. Overall, getting to Mars is currently only feasible for government agencies and private companies with significant funding, making it unlikely for an individual to afford.”

Load a Hugging Face model from a local directory

Step 1.

pip install huggingface_hub

Step 2. Download the model

Here we download the model ‘meta-llama/Llama-2-7b’ into ‘/home/pi/models/hf/’

from huggingface_hub import snapshot_download
snapshot_download(repo_id='meta-llama/Llama-2-7b', cache_dir='/home/pi/models/hf/')

The downloaded files are

models--meta-llama--Llama-2-7b
ls -R models--meta-llama--Llama-2-7b

models--meta-llama--Llama-2-7b/:
blobs  refs  snapshots

models--meta-llama--Llama-2-7b/blobs:
0f0bc7714e028d4eb77a300525b581013454f06b  510da489e04e615c1b50e690e03a62d3bbff9fd9                          abbcc199b2d1e4feb5d7e40c0bd67e1b0ce29e97
432012a5e6ec946e6c1cb318f256223889e3ab44  525dc349d71fe257fce4098c146446df6fef4247174f351381e4c3214af126f0  d67a91807d5879d193a694da57f28ff85092e92dc9fbef4888bd05e22b15ab75
4531f05cde0f2f2cb2d44055cf08e1d467d40196  6523f76675b50e9cf3a57d1fb135189abcffe1c7
51089e27e6764fb9f72c06a0f3710699fb6c9448  9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347

models--meta-llama--Llama-2-7b/refs:
main

models--meta-llama--Llama-2-7b/snapshots:
365ffa8f1a6c455d3e2028ae658236b4b85ba824

models--meta-llama--Llama-2-7b/snapshots/365ffa8f1a6c455d3e2028ae658236b4b85ba824:
checklist.chk  consolidated.00.pth  LICENSE.txt  params.json  README.md  Responsible-Use-Guide.pdf  tokenizer_checklist.chk  tokenizer.model  USE_POLICY.md
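The directory name models--meta-llama--Llama-2-7b is not arbitrary: the hub cache names each repo folder as the repo type, plus the repo id with “/” replaced by “--”. A small sketch of that convention (my own helper for illustration, not a huggingface_hub function):

```python
def cache_folder_name(repo_id: str, repo_type: str = "model") -> str:
    # Hub cache layout convention: "<repo_type>s--<org>--<name>"
    return f"{repo_type}s--" + repo_id.replace("/", "--")

print(cache_folder_name("meta-llama/Llama-2-7b"))  # models--meta-llama--Llama-2-7b
```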

If you need the raw files in a plain directory instead of the cache layout, use “local_dir”:

snapshot_download(repo_id='meta-llama/Llama-2-7b', local_dir='/home/pi/models/7b')

The files are then placed directly under ‘/home/pi/models/7b’, without the blobs/refs/snapshots structure.

Remote Jupyter Notebook

Setup Remote Jupyter Notebook

  • A: my laptop, used to access a notebook running on B through a web browser
  • B: my desktop, which runs conda as the Python env manager, reachable at pi@10.0.0.101
  • B is a powerful machine with a GPU. We start a Jupyter notebook on B and access it from A.

Step 1.

From A:

ssh -N -L 8888:127.0.0.1:8888 pi@10.0.0.101  # forward local port 8888 to port 8888 on B; -N runs no remote command

Step 2.

From B: