Updated: 2023-12-05 13:47:16
I have an API based on PHP (Lumen) and an e-commerce site based on React. Both of them work fine. The problem comes when I try to make them work through Docker. I'd like to deploy the whole app by running just a single command.
The problem is that the React app doesn't connect to the API.
I tried @Suman Kharel's answer on this post:
Proxying API Requests in Docker Container running react app
But it doesn't work. Does anyone know how I can sort it out?
Here is my repo on bitbucket.
https://bitbucket.org/mariogarciait/ecommerce-submodule/src/master/
Hopefully someone knows what I am doing wrong.
Thanks
If you want to start all of your apps with a single docker command, you could use docker-compose.
Using docker-compose is only appropriate for test purposes or a very limited production infrastructure. The best approach is to keep each of your artifacts on a separate host.
Please read these to understand some points:
When you use docker-compose, all the services are deployed on the same machine, but each one runs in its own container, and just one process runs inside each container.
So if you enter a container (for example, a Node.js web container) and list the processes, you will see something like this:
nodejs .... 3001
And into another container like a database postgres:
postgres .... 5432
So if the Node.js web app needs to connect to the database from inside its container, it must use the IP of the Postgres container instead of localhost, because inside the Node.js container only one process is listening on localhost:
localhost 3001
So using localhost:5432 won't work inside the Node.js container. The solution is to use the IP of the Postgres container instead of localhost, e.g. 10.10.100.101:5432.
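As an illustration, the web service's environment in a compose file would point at the database container's address rather than localhost (the service name and IP below are hypothetical):

```yaml
web:
  environment:
    # Would NOT work: inside this container, nothing listens on localhost:5432
    # - DATABASE_HOST=localhost
    # Works: the Postgres container's address on the Docker network
    - DATABASE_HOST=10.10.100.101
    - DATABASE_PORT=5432
```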
When we have several containers (docker-compose) with dependencies between them, Docker proposes the following:
In summary, with these features Docker creates a kind of "special network" in which all your containers live in peace, without the complications of IPs!
For testing, quick deployments, or a very limited production environment, you can use a new feature available in the latest versions of docker-compose (1.29.2) and Docker.
Add this at the end of your docker-compose file:
networks:
mynetwork:
driver: bridge
And add this to all of your containers:
networks:
- mynetwork
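Putting both fragments together, a minimal sketch of a complete compose file might look like this (service names and images are only placeholders):

```yaml
version: "3.9"
services:
  web:
    image: node:18          # placeholder image
    networks:
      - mynetwork
  db:
    image: postgres:13      # placeholder image
    networks:
      - mynetwork
networks:
  mynetwork:
    driver: bridge
```

On a user-defined bridge network like this, Docker's embedded DNS also lets containers reach each other by service name (e.g. `db:5432`), which avoids tracking container IPs by hand.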
And if some container needs the host's IP, use host.docker.internal instead of the IP:
environment:
- DATABASE_HOST=host.docker.internal
- API_BASE_URL=host.docker.internal:8020/api
Finally, in the containers that use host.docker.internal, add this:
extra_hosts:
- "host.docker.internal:host-gateway"
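Combined, a single service that needs the host's IP would look roughly like this (the service name is illustrative):

```yaml
api-php:
  environment:
    - DATABASE_HOST=host.docker.internal
  extra_hosts:
    - "host.docker.internal:host-gateway"
  networks:
    - mynetwork
```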
Note: This was tested on Ubuntu, not on Mac or Windows, because nobody deploys real applications on those operating systems.
In my opinion, Docker links and networks are a kind of illusion or deceit, because they only work on one machine (development or staging). They hide dependencies and other complex topics from us that matter once your apps leave your laptop and go to your real servers, ready to be used by your users.
Anyway, if you are going to use docker-compose for development or real purposes, these steps will help you manage the IPs between your containers:
Example:
db:
  image: mysql:5.7.22
  container_name: db_ecommerce
  ports:
    - "5003:3306"
  environment:
    MYSQL_DATABASE: lumen
    MYSQL_ROOT_PASSWORD: ${DATABASE_PASSWORD}
api-php:
  container_name: api_ecommerce
  ports:
    - "8020:80"
    - "445:443"
  environment:
    - DATABASE_HOST=$MACHINE_HOST
    - DATABASE_USER=$DATABASE_USER
    - DATABASE_PASSWORD=$DATABASE_PASSWORD
    - ETC=$ETC
web-react:
  container_name: react_ecommerce
  ports:
    - 3001:3000
  environment:
    - API_BASE_URL=$MACHINE_HOST:8020/api
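The compose file above expects several variables to be present in the shell that runs docker-compose; they could be exported first (all values below are made-up examples):

```shell
# Illustrative values; use your host's real LAN IP and credentials
export MACHINE_HOST=192.168.1.10
export DATABASE_USER=root
export DATABASE_PASSWORD=secret
export ETC=some-other-setting   # placeholder variable from the example
```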
docker-compose up -d
Also, in your React app, read the URL of your API using an environment variable instead of the proxy setting in package.json:
process.env.REACT_APP_API_BASE_URL
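Note that Create React App only exposes variables prefixed with REACT_APP_, and it inlines them at build time, so the variable has to be set in the environment that runs npm (the compose example above names it API_BASE_URL; it would need the prefix to reach the browser bundle). A sketch with a made-up URL:

```shell
# Illustrative value; in the compose setup it would come from MACHINE_HOST
export REACT_APP_API_BASE_URL="http://192.168.1.10:8020/api"
# then e.g.: npm start   (or npm run build)
echo "$REACT_APP_API_BASE_URL"
```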
Check this to learn how to read environment variables from a React app.
Here you can find more detailed steps on how to use the MACHINE_HOST variable: