René's URL Explorer Experiment


Title: GitHub - OpenGVLab/InternVL: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型

Open Graph Title: GitHub - OpenGVLab/InternVL: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型

X Title: GitHub - OpenGVLab/InternVL: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型

Description: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型 - OpenGVLab/InternVL

Open Graph Description: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型 - OpenGVLab/InternVL

X Description: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型 - OpenGVLab/InternVL

Open Graph URL: https://github.com/OpenGVLab/InternVL

X: @github

direct link

Domain: github.com

route-pattern: /:user_id/:repository
route-controller: files
route-action: disambiguate
hovercard-subject-tag: repository:721995615
github-keyboard-shortcuts: repository,copilot
google-site-verification: Apib7-x98H0j5cPqHWwSMm6dNU4GmODRoqxLiDzdx9I
octolytics-url: https://collector.github.com/github/collect
analytics-location: //
fb:app_id: 1401488693436528
apple-itunes-app: app-id=1477376905, app-argument=https://github.com/OpenGVLab/InternVL
twitter:image: https://opengraph.githubassets.com/4a8a74bb24aabb51c3b2068e45a7d10931c5812a859caab2e67c15473a014471/OpenGVLab/InternVL
twitter:card: summary_large_image
og:image: https://opengraph.githubassets.com/4a8a74bb24aabb51c3b2068e45a7d10931c5812a859caab2e67c15473a014471/OpenGVLab/InternVL
og:image:alt: [CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. 接近GPT-4o表现的开源多模态对话模型 - OpenGVLab/InternVL
og:image:width: 1200
og:image:height: 600
og:site_name: GitHub
og:type: object
hostname: github.com
expected-hostname: github.com
turbo-cache-control: no-preview
go-import: github.com/OpenGVLab/InternVL git https://github.com/OpenGVLab/InternVL.git
octolytics-dimension-user_id: 94522163
octolytics-dimension-user_login: OpenGVLab
octolytics-dimension-repository_id: 721995615
octolytics-dimension-repository_nwo: OpenGVLab/InternVL
octolytics-dimension-repository_public: true
octolytics-dimension-repository_is_fork: false
octolytics-dimension-repository_network_root_id: 721995615
octolytics-dimension-repository_network_root_nwo: OpenGVLab/InternVL
turbo-body-classes: logged-out env-production page-responsive
disable-turbo: false
browser-stats-url: https://api.github.com/_private/browser/stats
browser-errors-url: https://api.github.com/_private/browser/errors
release: 69fc54a84c74307369dba42af5401200531d116e
ui-target: full
theme-color: #1e2327
color-scheme: light dark
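
Fields like the ones above come straight out of the page's <title> and <meta> tags. A minimal sketch of how such a capture can be produced with Python's stdlib html.parser (the names MetaExtractor and extract_metadata are illustrative, not part of any real explorer tool):

```python
from html.parser import HTMLParser


class MetaExtractor(HTMLParser):
    """Collect <meta> name/property -> content pairs and the <title> text."""

    def __init__(self):
        super().__init__()
        self.meta = {}
        self.title = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            # GitHub uses both name="..." (route-pattern, octolytics-*)
            # and property="..." (og:*, fb:app_id) attributes.
            key = d.get("property") or d.get("name")
            if key and d.get("content") is not None:
                self.meta[key] = d["content"]
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data


def extract_metadata(html: str) -> dict:
    """Return a dict of page title plus all meta name/property fields."""
    parser = MetaExtractor()
    parser.feed(html)
    out = {"title": parser.title}
    out.update(parser.meta)
    return out
```

Feeding the raw HTML of the repo page through extract_metadata would yield a dict whose keys match the field names listed above (og:site_name, route-pattern, and so on).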

Links:

OpenGVLab https://github.com/OpenGVLab
InternVLhttps://github.com/OpenGVLab/InternVL
internvl.readthedocs.io/en/latest/https://internvl.readthedocs.io/en/latest/
MIT license https://github.com/OpenGVLab/InternVL/blob/main/LICENSE
9.7k stars https://github.com/OpenGVLab/InternVL/stargazers
753 forks https://github.com/OpenGVLab/InternVL/forks
Branches https://github.com/OpenGVLab/InternVL/branches
Tags https://github.com/OpenGVLab/InternVL/tags
Activity https://github.com/OpenGVLab/InternVL/activity
Code https://github.com/OpenGVLab/InternVL
Issues 289 https://github.com/OpenGVLab/InternVL/issues
Pull requests 6 https://github.com/OpenGVLab/InternVL/pulls
Actions https://github.com/OpenGVLab/InternVL/actions
Projects 0 https://github.com/OpenGVLab/InternVL/projects
Security https://github.com/OpenGVLab/InternVL/security
Insights https://github.com/OpenGVLab/InternVL/pulse
261 Commitshttps://github.com/OpenGVLab/InternVL/commits/main/
.githubhttps://github.com/OpenGVLab/InternVL/tree/main/.github
classificationhttps://github.com/OpenGVLab/InternVL/tree/main/classification
clip_benchmarkhttps://github.com/OpenGVLab/InternVL/tree/main/clip_benchmark
internvl_chathttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat
internvl_chat_gpt_osshttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat_gpt_oss
internvl_chat_llavahttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat_llava
internvl_ghttps://github.com/OpenGVLab/InternVL/tree/main/internvl_g
requirementshttps://github.com/OpenGVLab/InternVL/tree/main/requirements
segmentationhttps://github.com/OpenGVLab/InternVL/tree/main/segmentation
streamlit_demohttps://github.com/OpenGVLab/InternVL/tree/main/streamlit_demo
video_retrievalhttps://github.com/OpenGVLab/InternVL/tree/main/video_retrieval
.flake8https://github.com/OpenGVLab/InternVL/blob/main/.flake8
.gitignorehttps://github.com/OpenGVLab/InternVL/blob/main/.gitignore
.isort.cfghttps://github.com/OpenGVLab/InternVL/blob/main/.isort.cfg
.pre-commit-config.yamlhttps://github.com/OpenGVLab/InternVL/blob/main/.pre-commit-config.yaml
INSTALLATION.mdhttps://github.com/OpenGVLab/InternVL/blob/main/INSTALLATION.md
LICENSEhttps://github.com/OpenGVLab/InternVL/blob/main/LICENSE
README.mdhttps://github.com/OpenGVLab/InternVL/blob/main/README.md
README_zh.mdhttps://github.com/OpenGVLab/InternVL/blob/main/README_zh.md
requirements.txthttps://github.com/OpenGVLab/InternVL/blob/main/requirements.txt
READMEhttps://github.com/OpenGVLab/InternVL
Contributinghttps://github.com/OpenGVLab/InternVL
MIT licensehttps://github.com/OpenGVLab/InternVL
https://github.com/OpenGVLab/InternVL#internvl-family-closing-the-gap-to-commercial-multimodal-models-with-open-source-suites--a-pioneering-open-source-alternative-to-gpt-5
[🆕 Blog]https://internvl.github.io/blog/
[🤔 FAQs]https://internvl.readthedocs.io/en/latest/tutorials/faqs.html
[🗨️ Chat Demo]https://chat.intern-ai.org.cn/
[📖 Document]https://internvl.readthedocs.io/en/latest/
[🌐 API]https://internlm.intern-ai.org.cn/api/document
[🚀 Quick Start]https://github.com/OpenGVLab/InternVL#quick-start-with-huggingface
[🔥 InternVL3.5 Report]https://huggingface.co/papers/2508.18265
[📜 InternVL3.0 Report]https://huggingface.co/papers/2504.10479
[📜 InternVL2.5 MPO]https://huggingface.co/papers/2411.10442
[📜 InternVL2.5 Report]https://huggingface.co/papers/2412.05271
[📜 Mini-InternVL Paper]https://arxiv.org/abs/2410.16261
[📜 InternVL2 Blog]https://internvl.github.io/blog/2024-07-02-InternVL-2.0/
[📜 InternVL 1.5 Paper]https://huggingface.co/papers/2404.16821
[📜 InternVL 1.0 Paper]https://huggingface.co/papers/2312.14238
[📖 2.0 中文解读]https://zhuanlan.zhihu.com/p/706547971
[📖 1.5 中文解读]https://zhuanlan.zhihu.com/p/699439759
[📖 1.0 中文解读]https://zhuanlan.zhihu.com/p/702946079
Switch to the Chinese version (切换至中文版)https://github.com/OpenGVLab/InternVL/blob/main/README_zh.md
https://trendshift.io/repositories/9803
https://camo.githubusercontent.com/db8e13e4a4bf2da539d9ae8e3d2314d3d1486a5fb074824e083dd71b2d350b85/68747470733a2f2f68756767696e67666163652e636f2f4f70656e47564c61622f496e7465726e564c335f352d323431422d413238422f7265736f6c76652f6d61696e2f696d616765732f706572666f726d616e63652e6a7067
https://github.com/OpenGVLab/InternVL#news-
InternVL3_5-GPT-OSS-20B-A4Bhttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat_gpt_oss
offline RL stagehttps://github.com/OpenGVLab/InternVL/blob/main/internvl_chat_gpt_oss/shell/internvl3_5_gpt_oss/internvl3_5_gpt_oss_20b_stage3_mpo.sh
online RL stagehttps://github.com/Weiyun1025/verl-internvl
MMPR-v1.2https://huggingface.co/datasets/OpenGVLab/MMPR-v1.2
MMPR-Tinyhttps://huggingface.co/datasets/OpenGVLab/MMPR-Tiny
InternVL3.5https://huggingface.co/papers/2508.18265
InternVL3.5-241B-A28Bhttps://huggingface.co/OpenGVLab/InternVL3_5-241B-A28B
InternVL3_5-GPT-OSS-20B-A4Bhttps://huggingface.co/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview
the GitHub formathttps://huggingface.co/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview#github-format
the HF formathttps://huggingface.co/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview#huggingface-format
data construction pipelinehttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat/tools/reasoning_data_pipeline
training scriptshttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat/shell/internvl3.0/mpo
MPOhttps://huggingface.co/papers/2411.10442
VisualPRMhttps://huggingface.co/papers/2503.10291
MPOhttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat/shell/internvl3.0/mpo_data_construction
VisualPRMhttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat/shell/internvl3.0/visualprm_data_construction
InternVL3https://huggingface.co/collections/OpenGVLab/internvl3-67f7f690be79c2fe9d74fe9d
perceptionhttps://rank.opencompass.org.cn/leaderboard-multimodal/?m=REALTIME
reasoning performancehttps://rank.opencompass.org.cn/leaderboard-multimodal-reasoning/?m=REALTIME
Variable Visual Position Encodinghttps://huggingface.co/papers/2412.09616
Native Multimodal Pre-Traininghttps://huggingface.co/papers/2504.10479
Mixed Preference Optimizationhttps://huggingface.co/papers/2411.10442
Multimodal Test-Time Scalinghttps://huggingface.co/papers/2503.10291
VisualPRMhttps://huggingface.co/OpenGVLab/VisualPRM-8B
VisualPRM400Khttps://huggingface.co/datasets/OpenGVLab/VisualPRM400K
paperhttps://huggingface.co/papers/2503.10291
project pagehttps://internvl.github.io/blog/2025-03-13-VisualPRM/
InternVL2.5-MPOhttps://internvl.github.io/blog/2024-12-20-InternVL-2.5-MPO/
Mixed Preference Optimizationhttps://huggingface.co/papers/2411.10442
MMPR-v1.1https://huggingface.co/datasets/OpenGVLab/MMPR-v1.1
HF linkhttps://huggingface.co/collections/OpenGVLab/internvl25-mpo-6753fed98cd828219b12f849
InternVL2/2.5https://github.com/PaddlePaddle/PaddleMIX/tree/develop/paddlemix/examples/internvl2
PaddleMIXhttps://github.com/PaddlePaddle/PaddleMIX
InternVL2.5https://huggingface.co/collections/OpenGVLab/internvl-25-673e1019b66e2218f68d7c1c
InternVL2_5-78Bhttps://huggingface.co/OpenGVLab/InternVL2_5-78B
HF linkhttps://huggingface.co/collections/OpenGVLab/internvl-25-673e1019b66e2218f68d7c1c
MMPRhttps://huggingface.co/datasets/OpenGVLab/MMPR
MPOhttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat/shell/internvl2.0_mpo
InternVL2-8B-MPOhttps://huggingface.co/OpenGVLab/InternVL2-8B-MPO
paperhttps://arxiv.org/abs/2411.10442
project pagehttps://internvl.github.io/blog/2024-11-14-InternVL-2.0-MPO/
documenthttps://internvl.readthedocs.io/en/latest/internvl2.0/preference_optimization.html
project pagehttps://github.com/OpenGVLab/InternVL/tree/main/internvl_chat/shell/mini_internvl
documenthttps://internvl.readthedocs.io/en/latest/internvl2.0/domain_adaptation.html
Chartmimichttps://chartmimic.github.io/
CharXivhttps://charxiv.github.io/#leaderboard
MLVUhttps://github.com/JUNJIE99/MLVU
InternVL2 serieshttps://huggingface.co/collections/OpenGVLab/internvl-20-667d3961ab5eb12c7ed1463e
Video-MMEhttps://github.com/BradyFU/Video-MME
DocVQAhttps://rrc.cvc.uab.es/?ch=17&com=evaluation&task=1
InfoVQAhttps://rrc.cvc.uab.es/?ch=17&com=evaluation&task=3
MM-NIAHhttps://github.com/OpenGVLab/MM-NIAH
ShareGPT-4ohttps://sharegpt4o.github.io/
lmdeployhttps://github.com/InternLM/lmdeploy
OpenGVLab/InternVL-Chat-V1-5-AWQhttps://huggingface.co/OpenGVLab/InternVL-Chat-V1-5-AWQ
text encoderhttps://huggingface.co/OpenGVLab/InternVL-14B-224px
MuLanhttps://github.com/mulanai/MuLan
HF linkhttps://huggingface.co/OpenGVLab/InternVL-Chat-V1-5
InternVL-Chat-V1-2-Plushttps://huggingface.co/OpenGVLab/InternVL-Chat-V1-2-Plus
bloghttps://internvl.github.io/blog/2024-02-21-InternVL-1.2/
bloghttps://internvl.github.io/blog/2024-02-21-InternVL-1.2/
SFT datahttps://github.com/OpenGVLab/InternVL/blob/main/internvl_chat#prepare-training-datasets
HuggingFacehttps://huggingface.co/OpenGVLab/InternVL-Chat-V1-2
herehttps://huggingface.co/OpenGVLab/InternVL-Chat-V1-1
customized mmcv/mmsegmentation/mmdetection codehttps://github.com/OpenGVLab/InternVL-MMDetSeg
https://github.com/OpenGVLab/InternVL#documents
https://github.com/OpenGVLab/InternVL#-get-started
Installation Guidehttps://internvl.readthedocs.io/en/latest/get_started/installation.html
requirements.txthttps://github.com/OpenGVLab/InternVL/blob/main/requirements.txt
Meta Filehttps://internvl.readthedocs.io/en/latest/get_started/chat_data_format.html#meta-file
Texthttps://internvl.readthedocs.io/en/latest/get_started/chat_data_format.html#pure-text-data
Single-Imagehttps://internvl.readthedocs.io/en/latest/get_started/chat_data_format.html#single-image-data
Multi-Imagehttps://internvl.readthedocs.io/en/latest/get_started/chat_data_format.html#multi-image-data
Videohttps://internvl.readthedocs.io/en/latest/get_started/chat_data_format.html#video-data
Streamlit Demohttps://internvl.readthedocs.io/en/latest/get_started/local_chat_demo.html#streamlit-demo
InternVL2.5 APIhttps://internlm.intern-ai.org.cn/api/document
Enhancing InternVL2 on COCO Caption Using LoRA Fine-Tuninghttps://internvl.readthedocs.io/en/latest/tutorials/coco_caption_finetune.html
https://github.com/OpenGVLab/InternVL#-internvl-family
Introhttps://internvl.readthedocs.io/en/latest/internvl3.0/introduction.html
Quick Starthttps://internvl.readthedocs.io/en/latest/internvl3.0/quick_start.html
Finetunehttps://internvl.readthedocs.io/en/latest/internvl3.0/finetune.html
Evaluatehttps://internvl.readthedocs.io/en/latest/internvl3.0/evaluation.html
Deployhttps://internvl.readthedocs.io/en/latest/internvl3.0/deployment.html
MPOhttps://internvl.readthedocs.io/en/latest/internvl3.0/preference_optimization.html
Introhttps://internvl.readthedocs.io/en/latest/internvl2.5/introduction.html
Quick Starthttps://internvl.readthedocs.io/en/latest/internvl2.5/quick_start.html
Finetunehttps://internvl.readthedocs.io/en/latest/internvl2.5/finetune.html
Evaluatehttps://internvl.readthedocs.io/en/latest/internvl2.5/evaluation.html
Deployhttps://internvl.readthedocs.io/en/latest/internvl2.5/deployment.html
MPOhttps://internvl.readthedocs.io/en/latest/internvl2.5/preference_optimization.html
Introhttps://internvl.readthedocs.io/en/latest/internvl2.0/introduction.html
Quick Starthttps://internvl.readthedocs.io/en/latest/internvl2.0/quick_start.html
Finetunehttps://internvl.readthedocs.io/en/latest/internvl2.0/finetune.html
Evaluatehttps://internvl.readthedocs.io/en/latest/internvl2.0/evaluation.html
Deployhttps://internvl.readthedocs.io/en/latest/internvl2.0/deployment.html
MPOhttps://internvl.readthedocs.io/en/latest/internvl2.0/preference_optimization.html
Introhttps://internvl.readthedocs.io/en/latest/internvl1.5/introduction.html
Quick Starthttps://internvl.readthedocs.io/en/latest/internvl1.5/quick_start.html
Finetunehttps://internvl.readthedocs.io/en/latest/internvl1.5/finetune.html
Evaluatehttps://internvl.readthedocs.io/en/latest/internvl1.5/evaluation.html
Deployhttps://internvl.readthedocs.io/en/latest/internvl1.5/deployment.html
Introhttps://internvl.readthedocs.io/en/latest/internvl1.2/introduction.html
Quick Starthttps://internvl.readthedocs.io/en/latest/internvl1.2/quick_start.html
Finetunehttps://internvl.readthedocs.io/en/latest/internvl1.2/finetune.html
Evaluatehttps://internvl.readthedocs.io/en/latest/internvl1.2/evaluation.html
Introhttps://internvl.readthedocs.io/en/latest/internvl1.1/introduction.html
Quick Starthttps://internvl.readthedocs.io/en/latest/internvl1.1/quick_start.html
Evaluationhttps://internvl.readthedocs.io/en/latest/internvl1.1/evaluation.html
Classificationhttps://internvl.readthedocs.io/en/latest/internvl1.0/classification.html
CLIP-Benchmarkhttps://internvl.readthedocs.io/en/latest/internvl1.0/clip_benchmark.html
Segmentationhttps://internvl.readthedocs.io/en/latest/internvl1.0/segmentation.html
Chat-LLaVAhttps://internvl.readthedocs.io/en/latest/internvl1.0/internvl_chat_llava.html
InternVL-Ghttps://internvl.readthedocs.io/en/latest/internvl1.0/internvl_g.html
https://github.com/OpenGVLab/InternVL#model-zoo
https://github.com/OpenGVLab/InternVL#multimodal-large-language-model-internvl-35
the GitHub formathttps://huggingface.co/OpenGVLab/InternVL3_5-241B-A28B
the HF formathttps://huggingface.co/OpenGVLab/InternVL3_5-241B-A28B-HF
custom2hfhttps://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/tools/internvl_custom2hf.py
hf2customhttps://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/tools/internvl_hf2custom.py
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-1B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-1B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-2B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-2B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-4B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-4B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-8B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-8B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-14B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-14B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-38B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-38B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-30B-A3B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-30B-A3B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-241B-A28B
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-241B-A28B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-1B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-1B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-2B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-2B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-4B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-4B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-8B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-8B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-14B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-14B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-38B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-38B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-GPT-OSS-20B-A4B-Preview-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-30B-A3B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-30B-A3B-HF
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3_5-241B-A28B-HF
🤖 linkhttps://www.modelscope.cn/models/OpenGVLab/InternVL3_5-241B-A28B-HF
https://github.com/OpenGVLab/InternVL#multimodal-large-language-model-internvl-30
InternViT‑300M‑448px‑V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
Qwen2.5‑0.5Bhttps://huggingface.co/Qwen/Qwen2.5-0.5B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-1B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-1B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
Qwen2.5-1.5Bhttps://huggingface.co/Qwen/Qwen2.5-1.5B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-2B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-2B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
Qwen2.5-7Bhttps://huggingface.co/Qwen/Qwen2.5-7B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-8B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-8B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
internlm3-8b-instructhttps://huggingface.co/internlm/internlm3-8b-instruct
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-9B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-9B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
Qwen2.5-14Bhttps://huggingface.co/Qwen/Qwen2.5-14B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-14B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-14B
InternViT-6B-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5
Qwen2.5-32Bhttps://huggingface.co/Qwen/Qwen2.5-32B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-38B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-38B
InternViT-6B-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5
Qwen2.5-72Bhttps://huggingface.co/Qwen/Qwen2.5-72B
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL3-78B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL3-78B
https://github.com/OpenGVLab/InternVL#multimodal-large-language-model-internvl-25
InternViT‑300M‑448px‑V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
Qwen2.5‑0.5B‑Instructhttps://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-1B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-1B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
internlm2_5-1_8b-chathttps://huggingface.co/internlm/internlm2_5-1_8b-chat
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-2B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-2B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
Qwen2.5-3B-Instructhttps://huggingface.co/Qwen/Qwen2.5-3B-Instruct
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-4B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-4B
InternViT-300M-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5
internlm2_5-7b-chathttps://huggingface.co/internlm/internlm2_5-7b-chat
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-8B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-8B
InternViT-6B-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5
internlm2_5-20b-chathttps://huggingface.co/internlm/internlm2_5-20b-chat
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-26B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-26B
InternViT-6B-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5
Qwen2.5-32B-Instructhttps://huggingface.co/Qwen/Qwen2.5-32B-Instruct
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-38B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-38B
InternViT-6B-448px-V2_5https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5
Qwen2.5-72B-Instructhttps://huggingface.co/Qwen/Qwen2.5-72B-Instruct
🤗 linkhttps://huggingface.co/OpenGVLab/InternVL2_5-78B
🤖 linkhttps://modelscope.cn/models/OpenGVLab/InternVL2_5-78B
| Model | Vision Part | Language Part | HF Link | MS Link |
| --- | --- | --- | --- | --- |
| InternVL2_5-1B-MPO | [InternViT-300M-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5) | [Qwen2.5-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-1B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-1B-MPO) |
| InternVL2_5-2B-MPO | [InternViT-300M-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5) | [internlm2_5-1_8b-chat](https://huggingface.co/internlm/internlm2_5-1_8b-chat) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-2B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-2B-MPO) |
| InternVL2_5-4B-MPO | [InternViT-300M-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5) | [Qwen2.5-3B-Instruct](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-4B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-4B-MPO) |
| InternVL2_5-8B-MPO | [InternViT-300M-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5) | [internlm2_5-7b-chat](https://huggingface.co/internlm/internlm2_5-7b-chat) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-8B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-8B-MPO) |
| InternVL2_5-26B-MPO | [InternViT-6B-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5) | [internlm2_5-20b-chat](https://huggingface.co/internlm/internlm2_5-20b-chat) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-26B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-26B-MPO) |
| InternVL2_5-38B-MPO | [InternViT-6B-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5) | [Qwen2.5-32B-Instruct](https://huggingface.co/Qwen/Qwen2.5-32B-Instruct) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-38B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-38B-MPO) |
| InternVL2_5-78B-MPO | [InternViT-6B-448px-V2_5](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5) | [Qwen2.5-72B-Instruct](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2_5-78B-MPO) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2_5-78B-MPO) |
### [Multimodal Large Language Model (InternVL 2.0)](https://github.com/OpenGVLab/InternVL#multimodal-large-language-model-internvl-20)
| Model | Vision Part | Language Part | HF Link | MS Link |
| --- | --- | --- | --- | --- |
| InternVL2-1B | [InternViT-300M-448px](https://huggingface.co/OpenGVLab/InternViT-300M-448px) | [Qwen2-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2-0.5B-Instruct) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-1B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-1B) |
| InternVL2-2B | [InternViT-300M-448px](https://huggingface.co/OpenGVLab/InternViT-300M-448px) | [internlm2-chat-1-8b](https://huggingface.co/internlm/internlm2-chat-1_8b) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-2B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-2B) |
| InternVL2-4B | [InternViT-300M-448px](https://huggingface.co/OpenGVLab/InternViT-300M-448px) | [Phi-3-mini-128k-instruct](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-4B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-4B) |
| InternVL2-8B | [InternViT-300M-448px](https://huggingface.co/OpenGVLab/InternViT-300M-448px) | [internlm2_5-7b-chat](https://huggingface.co/internlm/internlm2_5-7b-chat) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-8B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-8B) |
| InternVL2-26B | [InternViT-6B-448px-V1-5](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-5) | [internlm2-chat-20b](https://huggingface.co/internlm/internlm2-chat-20b) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-26B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-26B) |
| InternVL2-40B | [InternViT-6B-448px-V1-5](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-5) | [Nous-Hermes-2-Yi-34B](https://huggingface.co/NousResearch/Nous-Hermes-2-Yi-34B) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-40B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-40B) |
| InternVL2-Llama3-76B | [InternViT-6B-448px-V1-5](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-5) | [Hermes-2-Theta-Llama-3-70B](https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-70B) | [🤗 link](https://huggingface.co/OpenGVLab/InternVL2-Llama3-76B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL2-Llama3-76B) |
### [Multimodal Large Language Model (InternVL 1.0–1.5)](https://github.com/OpenGVLab/InternVL#multimodal-large-language-model-internvl-10-15)
| Model | HF Link | MS Link |
| --- | --- | --- |
| Mini-InternVL-Chat-4B-V1-5 | [🤗 link](https://huggingface.co/OpenGVLab/Mini-InternVL-Chat-4B-V1-5) | [🤖 link](https://modelscope.cn/models/OpenGVLab/Mini-InternVL-Chat-4B-V1-5) |
| Mini-InternVL-Chat-2B-V1-5 | [🤗 link](https://huggingface.co/OpenGVLab/Mini-InternVL-Chat-2B-V1-5) | [🤖 link](https://modelscope.cn/models/OpenGVLab/Mini-InternVL-Chat-2B-V1-5) |
| InternVL-Chat-V1-5 | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-Chat-V1-5) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-Chat-V1-5) |
| InternVL-Chat-V1-2-Plus | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-Chat-V1-2-Plus) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-Chat-V1-2-Plus) |
| InternVL-Chat-V1-2 | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-Chat-V1-2) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-Chat-V1-2) |
| InternVL-Chat-V1-1 | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-Chat-V1-1) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-Chat-V1-1) |
| InternVL-Chat-ViT-6B-Vicuna-13B | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-Chat-ViT-6B-Vicuna-13B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-Chat-ViT-6B-Vicuna-13B) |
| InternVL-Chat-ViT-6B-Vicuna-7B | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-Chat-ViT-6B-Vicuna-7B) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-Chat-ViT-6B-Vicuna-7B) |
### [CLIP-Like Model (InternVL 1.0–2.5)](https://github.com/OpenGVLab/InternVL#clip-like-model-internvl-10-25)
| Model | HF Link | MS Link |
| --- | --- | --- |
| InternViT-300M-448px-V2_5 | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-300M-448px-V2_5) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-300M-448px-V2_5) |
| InternViT-6B-448px-V2_5 | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V2_5) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-6B-448px-V2_5) |
| InternViT-300M-448px | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-300M-448px) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-300M-448px) |
| InternViT-6B-448px-V1-5 | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-5) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-6B-448px-V1-5) |
| InternViT-6B-448px-V1-2 | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-2) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-6B-448px-V1-2) |
| InternViT-6B-448px-V1-0 | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-0) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-6B-448px-V1-0) |
| InternViT-6B-224px | [🤗 link](https://huggingface.co/OpenGVLab/InternViT-6B-224px) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternViT-6B-224px) |
### [Vision-Language Foundation Model (InternVL 1.0)](https://github.com/OpenGVLab/InternVL#vision-language-foundation-model-internvl-10)
| Model | HF Link | MS Link |
| --- | --- | --- |
| InternVL-14B-224px | [🤗 link](https://huggingface.co/OpenGVLab/InternVL-14B-224px) | [🤖 link](https://modelscope.cn/models/OpenGVLab/InternVL-14B-224px) |
## [TODO List](https://github.com/OpenGVLab/InternVL#todo-list)
## [What can InternVL do?](https://github.com/OpenGVLab/InternVL#what-can-internvl-do)
- Image classification: [see details](https://github.com/OpenGVLab/InternVL/blob/main/classification#-evaluation)
- Semantic segmentation: [see details](https://github.com/OpenGVLab/InternVL/blob/main/segmentation#-evaluation)
- ImageNet variants and ObjectNet: [see details](https://github.com/OpenGVLab/InternVL/blob/main/clip_benchmark#imagenet-variants-and-objectnet)
- Multilingual ImageNet-1K: [see details](https://github.com/OpenGVLab/InternVL/blob/main/clip_benchmark#multilingual-imagenet-1k)
- Flickr30K & COCO retrieval: [see details](https://github.com/OpenGVLab/InternVL/blob/main/clip_benchmark#flickr30k--coco)
- Flickr30K-CN & COCO-CN retrieval: [see details](https://github.com/OpenGVLab/InternVL/blob/main/clip_benchmark#flickr30k-cn--coco-cn)
- XTD: [see details](https://github.com/OpenGVLab/InternVL/blob/main/clip_benchmark#xtd)
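The retrieval benchmarks linked above rank images and captions by the similarity of their embeddings; in CLIP-style models such as InternVL's contrastive variants, this is cosine similarity. As a minimal illustration of that scoring step (the three-dimensional toy vectors and names below are made up, standing in for real InternVL features):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy embeddings standing in for InternVL image/text features.
image = [0.2, 0.9, 0.1]
captions = {
    "a photo of a cat": [0.1, 0.95, 0.05],
    "a photo of a car": [0.9, 0.1, 0.2],
}

# Text retrieval = pick the caption whose embedding is closest to the image's.
best = max(captions, key=lambda c: cosine(image, captions[c]))
print(best)  # -> a photo of a cat
```

Benchmark scores like recall@1 on Flickr30K/COCO count how often the true caption wins this ranking.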
## [Quick Start with HuggingFace](https://github.com/OpenGVLab/InternVL#quick-start-with-huggingface)
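The linked quick-start covers running these checkpoints with `transformers`. One detail worth understanding first: the InternVL2/2.5 models feed their 448px InternViT encoders dynamically tiled crops, choosing a tile grid whose aspect ratio matches the input image. The sketch below is a simplified, hypothetical reimplementation of that grid selection, not the repository's code; the function name and the tie-breaking rule are my own.

```python
from itertools import product

TILE = 448  # input resolution of the InternViT encoders listed above

def tile_grid(width, height, min_tiles=1, max_tiles=12):
    """Pick a (cols, rows) grid whose aspect ratio best matches the image.

    Simplified sketch of InternVL-style dynamic tiling: the image would be
    resized to cols*TILE x rows*TILE and cut into cols*rows tiles of TILE^2.
    """
    aspect = width / height
    candidates = [
        (c, r) for c, r in product(range(1, max_tiles + 1), repeat=2)
        if min_tiles <= c * r <= max_tiles
    ]
    # Closest aspect ratio wins; on ties, prefer the coarsest grid (fewest tiles).
    return min(candidates, key=lambda cr: (abs(aspect - cr[0] / cr[1]), cr[0] * cr[1]))

cols, rows = tile_grid(1344, 448)  # a 3:1 panorama
print(cols, rows, cols * rows)     # -> 3 1 3 (three 448x448 tiles)
```

The real preprocessing additionally appends a thumbnail tile of the whole image when more than one tile is used; see the repository's quick-start for the exact code.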
## [License](https://github.com/OpenGVLab/InternVL#license)

This project is released under the [MIT license](https://github.com/OpenGVLab/InternVL/blob/main/LICENSE).

## [Citation](https://github.com/OpenGVLab/InternVL#citation)

## [Acknowledgement](https://github.com/OpenGVLab/InternVL#acknowledgement)
InternVL is built with reference to the code of the following projects:

- [OpenAI CLIP](https://github.com/openai/CLIP)
- [Open CLIP](https://github.com/mlfoundations/open_clip)
- [CLIP Benchmark](https://github.com/LAION-AI/CLIP_benchmark)
- [EVA](https://github.com/baaivision/EVA/tree/master)
- [InternImage](https://github.com/OpenGVLab/InternImage)
- [ViT-Adapter](https://github.com/czczup/ViT-Adapter)
- [MMSegmentation](https://github.com/open-mmlab/mmsegmentation)
- [Transformers](https://github.com/huggingface/transformers)
- [DINOv2](https://github.com/facebookresearch/dinov2)
- [BLIP-2](https://github.com/salesforce/LAVIS/tree/main/projects/blip2)
- [Qwen-VL](https://github.com/QwenLM/Qwen-VL/tree/master/eval_mm)
- [LLaVA-1.5](https://github.com/haotian-liu/LLaVA)
Documentation: [internvl.readthedocs.io/en/latest/](https://internvl.readthedocs.io/en/latest/)
[9.7k stars](https://github.com/OpenGVLab/InternVL/stargazers) · [65 watching](https://github.com/OpenGVLab/InternVL/watchers) · [753 forks](https://github.com/OpenGVLab/InternVL/forks)
Latest release: [InternVL-Chat-V1.5.0](https://github.com/OpenGVLab/InternVL/releases/tag/v1.5.0) (May 8, 2024) · [all 6 releases](https://github.com/OpenGVLab/InternVL/releases)