Building Container Images: The Hadoop Base Image

Published 2026/1/20 16:58:18 · Source: https://www.cnblogs.com/ysxz/p/19507645

Overview

The Hadoop base image is used to build the Hadoop image. If you have security-hardening or other special requirements, you can substitute a custom base image of your own when building the Hadoop image.

Directory structure

.
├── Dockerfile.ubuntu25   # Based on Ubuntu 25.10, Python 3, JDK 11
├── README.md
└── scripts               # Adapted for Python 3
    ├── envtoconf.py
    ├── krb5.conf
    ├── starter.sh
    └── transformation.py

Building the base image

Refer to the official Hadoop instructions and run the build from this directory. Because the upstream Hadoop project has not updated its Dockerfile for a long time, some changes have been made here; compare against the official code yourself to decide whether further adjustments are needed.
Build the image in a networked environment; a US-based network is recommended to avoid software download failures.

Choosing a JDK version

According to the official Hadoop documentation, JDK 8 can build and run every current Hadoop release, while Hadoop 3.3 and later can also run on JDK 11.
This base image installs JDK 11 by default (see openjdk-11-jdk and JAVA_HOME in the Dockerfile); to use JDK 8 instead, edit the Dockerfile and change the JDK version.

Running the build

Command format

docker build -t hadoop:runner-v1 -f Dockerfile.ubuntu25 .

The contents of Dockerfile.ubuntu25 are as follows:

# 1. Base image: Ubuntu 25.10
FROM ubuntu:25.10

# Set environment variables
ENV DEBIAN_FRONTEND=noninteractive

# 2. Safely remove the built-in UID 1000 user
# The script logic ensures correct deletion even if the UID 1000 user is not named 'ubuntu'
RUN if getent passwd 1000; then \
        userdel -f $(getent passwd 1000 | cut -d: -f1); \
    fi && \
    if getent group 1000; then \
        groupdel $(getent group 1000 | cut -d: -f1); \
    fi

# 3. Update the package lists and install packages
RUN apt-get update && apt-get install -y \
    sudo \
    python3-pip \
    python3-venv \
    python-is-python3 \
    wget \
    curl \
    netcat-openbsd \
    jq \
    openjdk-11-jdk \
    krb5-user \
    ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# 4. Install Robot Framework
# python-is-python3 points /usr/bin/python at python3
RUN pip install --break-system-packages robotframework

# 5. Download dumb-init (updated to a newer version)
RUN wget -O /usr/local/bin/dumb-init https://github.com/Yelp/dumb-init/releases/download/v1.2.5/dumb-init_1.2.5_x86_64 && \
    chmod +x /usr/local/bin/dumb-init

# 6. Permissions and directory initialization
RUN mkdir -p /opt/security/keytabs && chmod -R a+wr /opt/security/keytabs
ADD https://repo.maven.apache.org/maven2/org/jboss/byteman/byteman/4.0.4/byteman-4.0.4.jar /opt/byteman.jar
RUN chmod o+r /opt/byteman.jar

# 7. Install async-profiler
RUN mkdir -p /opt/profiler && \
    cd /opt/profiler && \
    curl -L https://github.com/jvm-profiling-tools/async-profiler/releases/download/v1.5/async-profiler-1.5-linux-x64.tar.gz | tar xvz

# 8. Environment variables
ENV JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
ENV PATH=$PATH:/opt/hadoop/bin
ENV PYTHONPATH=/opt/scripts

# 9. Create the hadoop user (UID 1000)
RUN groupadd --gid 1000 hadoop && \
    useradd --uid 1000 hadoop --gid 1000 --home /opt/hadoop --create-home --shell /bin/bash

# Allow the hadoop user to use sudo (needed to modify krb5.conf)
RUN echo "hadoop ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers

# 10. Add the scripts
# Note: create the directory and set permissions first, then ADD, to avoid losing permissions
RUN mkdir -p /opt/scripts
ADD scripts /opt/scripts/
# Grant execute permission
RUN chmod +x /opt/scripts/*.sh /opt/scripts/*.py

# 11. Prepare directories
RUN mkdir -p /opt/hadoop /var/log/hadoop && \
    chmod 1777 /opt/hadoop /var/log/hadoop && \
    chown -R hadoop:hadoop /opt/hadoop
ENV HADOOP_LOG_DIR=/var/log/hadoop
ENV HADOOP_CONF_DIR=/opt/hadoop
WORKDIR /opt/hadoop
RUN mkdir /data && chmod 1777 /data && chown hadoop:hadoop /data

# 12. Runtime configuration
USER hadoop
# Make sure this points at the correct script path
ENTRYPOINT ["/usr/local/bin/dumb-init", "--", "/opt/scripts/starter.sh"]

Files under the scripts directory

The contents of bashrc are as follows:

#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
PS1="\u@\h: \w> "

The contents of envtoconf.py are as follows:

#!/usr/bin/python3
# -*- coding: utf-8 -*-
"""convert environment variables to config"""

import os
import re
import argparse
import sys

import transformation


class Simple(object):
    """Simple conversion"""

    def __init__(self, args):
        parser = argparse.ArgumentParser()
        parser.add_argument("--destination", help="Destination directory", required=True)
        self.args = parser.parse_args(args=args)
        self.known_formats = ['xml', 'properties', 'yaml', 'yml', 'env', "sh", "cfg", 'conf']
        self.output_dir = self.args.destination
        self.excluded_envs = ['HADOOP_CONF_DIR']
        self.configurables = {}

    def destination_file_path(self, name, extension):
        """destination file path"""
        return os.path.join(self.output_dir, "{}.{}".format(name, extension))

    def write_env_var(self, name, extension, key, value):
        """Write environment variables"""
        # Explicitly use utf-8 encoding
        file_path = self.destination_file_path(name, extension) + ".raw"
        with open(file_path, "a", encoding='utf-8') as myfile:
            myfile.write("{}: {}\n".format(key, value))

    def process_envs(self):
        """Process environment variables"""
        # Sort the environment variables so execution is predictable
        for key in sorted(os.environ.keys()):
            if key in self.excluded_envs:
                continue
            pattern = re.compile("[_\\.]")
            parts = pattern.split(key)
            if not parts:
                continue
            extension = None
            name = parts[0].lower()
            if len(parts) > 1:
                extension = parts[1].lower()
                # Default config key: everything after "<name>.<extension>_"
                config_key = key[len(name) + len(extension) + 2:].strip()
            if extension and "!" in extension:
                splitted = extension.split("!")
                extension = splitted[0]
                fmt = splitted[1]
                config_key = key[len(name) + len(extension) + len(fmt) + 3:].strip()
            else:
                fmt = extension
            if extension and extension in self.known_formats:
                if name not in self.configurables:
                    # Initialize the file
                    with open(self.destination_file_path(name, extension) + ".raw", "w", encoding='utf-8') as myfile:
                        myfile.write("")
                    self.configurables[name] = (extension, fmt)
                self.write_env_var(name, extension, config_key, os.environ[key])
            else:
                # Fixed logic: handle variables without a format marker that
                # match an already-defined configurable
                for configurable_name in self.configurables:
                    if key.lower().startswith(configurable_name.lower()):
                        ext, _ = self.configurables[configurable_name]  # fix: only the extension is needed
                        self.write_env_var(configurable_name,
                                           ext,
                                           key[len(configurable_name) + 1:],
                                           os.environ[key])

    def transform(self):
        """transform"""
        for configurable_name in sorted(self.configurables.keys()):
            name = configurable_name
            extension, fmt = self.configurables[name]
            destination_path = self.destination_file_path(name, extension)
            if not os.path.exists(destination_path + ".raw"):
                continue
            with open(destination_path + ".raw", "r", encoding='utf-8') as myfile:
                content = myfile.read()
            try:
                # Call the matching to_<fmt> function in transformation.py
                transformer_func = getattr(transformation, "to_" + fmt)
                content = transformer_func(content)
                with open(destination_path, "w", encoding='utf-8') as myfile:
                    myfile.write(content)
            except AttributeError:
                print("Error: No transformer found for format '{}'".format(fmt), file=sys.stderr)

    def main(self):
        self.process_envs()
        self.transform()


def main():
    Simple(sys.argv[1:]).main()


if __name__ == '__main__':
    main()
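To see how the naming convention works, consider an environment variable such as CORE-SITE.XML_fs.defaultFS. The snippet below is a minimal standalone sketch of the key-splitting step (a re-implementation for illustration, not the script itself): the first underscore/dot-separated token becomes the file name, the second the extension, and the remainder the configuration key.

```python
import re


def split_env_key(key):
    # Mirror envtoconf.py's parsing: NAME.EXT_config.key -> (name, ext, config key)
    parts = re.split(r"[_\.]", key)
    name = parts[0].lower()
    extension = parts[1].lower()
    # Skip "<name>.<extension>_" (the two separators account for the +2)
    config_key = key[len(name) + len(extension) + 2:].strip()
    return name, extension, config_key


# This variable ends up in core-site.xml as the key fs.defaultFS
print(split_env_key("CORE-SITE.XML_fs.defaultFS"))
# → ('core-site', 'xml', 'fs.defaultFS')
```

So setting CORE-SITE.XML_fs.defaultFS=hdfs://namenode:9000 in the container environment produces a core-site.xml entry, which is why starter.sh can generate a full Hadoop configuration from environment variables alone.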

The contents of krb5.conf are as follows:

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log

[libdefaults]
dns_canonicalize_hostname = false
dns_lookup_realm = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
rdns = false
default_realm = EXAMPLE.COM

[realms]
EXAMPLE.COM = {
  kdc = SERVER
  admin_server = SERVER
}

[domain_realm]
.example.com = EXAMPLE.COM
example.com = EXAMPLE.COM

The contents of starter.sh are as follows:

#!/usr/bin/env bash
set -e

# Get the directory this script lives in
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

# 1. Optional sleep
if [ -n "$SLEEP_SECONDS" ]; then
   echo "Sleeping for $SLEEP_SECONDS seconds"
   sleep "$SLEEP_SECONDS"
fi

# 2. Wait for a port (WAITFOR)
if [ -n "$WAITFOR" ]; then
   echo "Waiting for the service $WAITFOR"
   WAITFOR_HOST=$(printf "%s\n" "$WAITFOR" | cut -d : -f 1)
   WAITFOR_PORT=$(printf "%s\n" "$WAITFOR" | cut -d : -f 2)
   # Fixed the seq usage for portability
   for i in $(seq "${WAITFOR_TIMEOUT:-300}" -1 0); do
      set +e
      nc -z "$WAITFOR_HOST" "$WAITFOR_PORT" > /dev/null 2>&1
      result=$?
      set -e
      if [ $result -eq 0 ]; then
         break
      fi
      sleep 1
   done
   if [ "$i" -eq 0 ]; then
      echo "Waiting for service $WAITFOR is timed out." >&2
      exit 1
   fi
fi

# 3. Kerberos setup
if [ -n "$KERBEROS_ENABLED" ]; then
   echo "Setting up kerberos!!"
   KERBEROS_SERVER=${KERBEROS_SERVER:-krb5}
   ISSUER_SERVER=${ISSUER_SERVER:-$KERBEROS_SERVER:8081}
   echo "KDC ISSUER_SERVER => $ISSUER_SERVER"
   if [ -n "$SLEEP_SECONDS" ]; then
      # Fixed the earlier $(SLEEP_SECONDS) syntax error
      echo "Sleeping for ${SLEEP_SECONDS} seconds"
      sleep "${SLEEP_SECONDS}"
   fi
   KEYTAB_DIR=${KEYTAB_DIR:-/etc/security/keytabs}
   while true; do
      set +e
      STATUS=$(curl -s -o /dev/null -w '%{http_code}' http://"$ISSUER_SERVER"/keytab/test/test)
      set -e
      if [ "$STATUS" -eq 200 ]; then
         echo "Got 200, KDC service ready!!"
         break
      else
         echo "Got $STATUS :( KDC service not ready yet..."
      fi
      sleep 5
   done
   HOST_NAME=$(hostname -f)
   export HOST_NAME
   for NAME in ${KERBEROS_KEYTABS}; do
      echo "Download $NAME/$HOST_NAME@EXAMPLE.COM keytab file to $KEYTAB_DIR/$NAME.keytab"
      wget -q "http://$ISSUER_SERVER/keytab/$HOST_NAME/$NAME" -O "$KEYTAB_DIR/$NAME.keytab"
      klist -kt "$KEYTAB_DIR/$NAME.keytab"
   done
   # Adapt the config file path for Ubuntu
   sed "s/SERVER/$KERBEROS_SERVER/g" "$DIR"/krb5.conf | sudo tee /etc/krb5.conf > /dev/null
fi

# 4. Permission fix (for Docker mounted volumes)
sudo chmod o+rwx /data

# 5. Convert configuration with Python 3 (key change)
python3 "$DIR"/envtoconf.py --destination "${HADOOP_CONF_DIR:-/opt/hadoop/etc/hadoop}"

# 6. Hadoop/Ozone initialization (kept as-is, paths verified)
if [ -n "$ENSURE_NAMENODE_DIR" ]; then
   CLUSTERID_OPTS=""
   if [ -n "$ENSURE_NAMENODE_CLUSTERID" ]; then
      CLUSTERID_OPTS="-clusterid $ENSURE_NAMENODE_CLUSTERID"
   fi
   if [ ! -d "$ENSURE_NAMENODE_DIR" ]; then
      /opt/hadoop/bin/hdfs namenode -format -force $CLUSTERID_OPTS
   fi
fi

if [ -n "$ENSURE_STANDBY_NAMENODE_DIR" ]; then
   if [ ! -d "$ENSURE_STANDBY_NAMENODE_DIR" ]; then
      /opt/hadoop/bin/hdfs namenode -bootstrapStandby
   fi
fi

# Ozone initialization
if [ -n "$ENSURE_SCM_INITIALIZED" ]; then
   if [ ! -f "$ENSURE_SCM_INITIALIZED" ]; then
      /opt/hadoop/bin/ozone scm --init || /opt/hadoop/bin/ozone scm -init
   fi
fi

if [ -n "$ENSURE_OM_INITIALIZED" ]; then
   if [ ! -f "$ENSURE_OM_INITIALIZED" ]; then
      /opt/hadoop/bin/ozone om --init || /opt/hadoop/bin/ozone om -createObjectStore
   fi
fi

# 7. Byteman instrumentation
if [ -n "$BYTEMAN_SCRIPT" ] || [ -n "$BYTEMAN_SCRIPT_URL" ]; then
   # Make sure BYTEMAN_DIR is defined
   BYTEMAN_DIR=${BYTEMAN_DIR:-/opt/profiler}
   export PATH=$PATH:$BYTEMAN_DIR/bin
   if [ -n "$BYTEMAN_SCRIPT_URL" ]; then
      sudo wget -q "$BYTEMAN_SCRIPT_URL" -O /tmp/byteman.btm
      export BYTEMAN_SCRIPT=/tmp/byteman.btm
   fi
   if [ ! -f "$BYTEMAN_SCRIPT" ]; then
      echo "ERROR: The defined $BYTEMAN_SCRIPT does not exist!!!"
      exit 1
   fi
   AGENT_STRING="-javaagent:/opt/byteman.jar=script:$BYTEMAN_SCRIPT"
   export HADOOP_OPTS="$AGENT_STRING $HADOOP_OPTS"
   echo "Process is instrumented with $AGENT_STRING"
fi

# Run the command passed via CMD
exec "$@"

The contents of transformation.py are as follows:

#!/usr/bin/python3
# -*- coding: utf-8 -*-
"""This module transforms properties into different formats"""


def render_yaml(yaml_root, prefix=""):
    """render yaml"""
    result = ""
    if isinstance(yaml_root, dict):
        if prefix:
            result += "\n"
        # Py3 compatibility: sort dict keys so generated files are deterministic
        for key in sorted(yaml_root.keys()):
            result += "{}{}: {}".format(prefix, key, render_yaml(yaml_root[key], prefix + "   "))
    elif isinstance(yaml_root, list):
        result += "\n"
        for item in yaml_root:
            result += prefix + " - " + render_yaml(item, prefix + " ")
    else:
        result += "{}\n".format(yaml_root)
    return result


def to_yaml(content):
    """transform to yaml"""
    props = process_properties(content)
    keys = sorted(props.keys())  # sort for stable output
    yaml_props = {}
    for key in keys:
        parts = key.split(".")
        node = yaml_props
        prev_part = None
        parent_node = {}
        for part in parts[:-1]:
            if part.isdigit():
                idx = int(part)
                if isinstance(node, dict):
                    parent_node[prev_part] = []
                    node = parent_node[prev_part]
                while len(node) <= idx:
                    node.append({})
                parent_node = node
                node = node[idx]  # fixed the original int(node) bug
            else:
                if part not in node:
                    node[part] = {}
                parent_node = node
                node = node[part]
            prev_part = part
        last_part = parts[-1]
        if last_part.isdigit():
            idx = int(last_part)
            if isinstance(node, dict):
                parent_node[prev_part] = []
                node = parent_node[prev_part]
            node.append(props[key])
        else:
            node[last_part] = props[key]
    return render_yaml(yaml_props)


def to_yml(content):
    return to_yaml(content)


def to_properties(content):
    result = ""
    props = process_properties(content)
    for key, val in sorted(props.items()):  # fix: added .items() and sorting
        result += "{}: {}\n".format(key, val)
    return result


def to_env(content):
    result = ""
    props = process_properties(content)
    for key, val in sorted(props.items()):  # fix: added .items()
        result += "{}={}\n".format(key, val)
    return result


def to_sh(content):
    result = ""
    props = process_properties(content)
    for key, val in sorted(props.items()):  # fix: added .items()
        result += "export {}=\"{}\"\n".format(key, val)
    return result


def to_cfg(content):
    result = ""
    props = process_properties(content)
    for key, val in sorted(props.items()):  # fix: added .items()
        result += "{}={}\n".format(key, val)
    return result


def to_conf(content):
    result = ""
    props = process_properties(content)
    for key, val in sorted(props.items()):  # fix: added .items()
        result += "export {}={}\n".format(key, val)
    return result


def to_xml(content):
    result = "<configuration>\n"
    props = process_properties(content)
    for key in sorted(props.keys()):
        result += "<property><name>{0}</name><value>{1}</value></property>\n". \
            format(key, props[key])
    result += "</configuration>"
    return result


def process_properties(content, sep=': ', comment_char='#'):
    props = {}
    if not content:
        return props
    for line in content.split("\n"):
        sline = line.strip()
        if sline and not sline.startswith(comment_char):
            if sep in sline:
                key_value = sline.split(sep)
                key = key_value[0].strip()
                value = sep.join(key_value[1:]).strip().strip('"')
                props[key] = value
    return props
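As a quick illustration of the pipeline: the intermediate .raw files written by envtoconf.py contain `key: value` lines, and the to_* functions render them into their final format. The sketch below re-implements the parse-then-render round trip standalone (rather than importing the module), so the behavior can be checked in isolation:

```python
def process_properties(content, sep=': ', comment_char='#'):
    # Same parsing rules as transformation.py: 'key: value' lines, '#' comments skipped
    props = {}
    for line in (content or "").split("\n"):
        sline = line.strip()
        if sline and not sline.startswith(comment_char) and sep in sline:
            key, _, value = sline.partition(sep)
            props[key.strip()] = value.strip().strip('"')
    return props


def to_xml(content):
    # Render the parsed properties into Hadoop's *-site.xml shape
    result = "<configuration>\n"
    props = process_properties(content)
    for key in sorted(props):
        result += "<property><name>{0}</name><value>{1}</value></property>\n".format(key, props[key])
    return result + "</configuration>"


raw = "fs.defaultFS: hdfs://namenode:9000\n# comment lines are ignored\n"
print(to_xml(raw))
```

With the sample input above this prints a `<configuration>` block containing a single `fs.defaultFS` property, which is exactly the shape Hadoop expects for core-site.xml.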

