What is the *** method to get Azure Blob Storage?

Updated: 2023-02-15 15:49:30

  1. For Spark running locally, there is an official blog that introduces how to access Azure Blob Storage from Spark. The key is that you need to configure the Azure Storage account as HDFS-compatible storage in the core-site.xml file and add two JARs, hadoop-azure and azure-storage, to your classpath so that HDFS can be accessed via the wasb[s] protocol. You can refer to the official tutorial on HDFS-compatible storage with wasb, and the blog about HDInsight configuration, for more details. A minimal configuration sketch follows this list.
  2. For Spark running on Azure, the only difference is that HDFS is accessed with wasb; the other preparation has already been done by Azure when the HDInsight cluster is created with Spark. The methods for listing files are listFiles or wholeTextFiles of SparkContext, as sketched after this list.
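
A minimal sketch of the local setup from item 1, assuming the hadoop-azure and azure-storage JARs are already on the classpath; the account name `mystorageaccount`, the container `mycontainer`, the file path, and the key placeholder are illustrative, not from the original. Setting the account key on the Hadoop configuration at runtime is the programmatic equivalent of the core-site.xml entry the answer describes:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object BlobReadSketch {
  def main(args: Array[String]): Unit = {
    // Local Spark; the hadoop-azure and azure-storage JARs must be on the
    // classpath (e.g. via spark-submit --jars) for the wasb[s] scheme to resolve.
    val conf = new SparkConf().setAppName("azure-blob-sketch").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Programmatic equivalent of the core-site.xml property that registers the
    // storage account key; "mystorageaccount" and the key value are placeholders.
    sc.hadoopConfiguration.set(
      "fs.azure.account.key.mystorageaccount.blob.core.windows.net",
      "<storage-account-access-key>")

    // Read a blob over wasbs (wasb with SSL); container and path are placeholders.
    val lines = sc.textFile(
      "wasbs://mycontainer@mystorageaccount.blob.core.windows.net/data/sample.txt")
    println(s"line count: ${lines.count()}")

    sc.stop()
  }
}
```

Passing the two JARs with `spark-submit --jars hadoop-azure-<version>.jar,azure-storage-<version>.jar` is one way to satisfy the classpath requirement.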
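
A sketch of the listing approach from item 2, assuming an existing SparkContext `sc` (as above) and placeholder container and account names. On an HDInsight Spark cluster the wasb configuration is already in place, so only the path is needed:

```scala
// wholeTextFiles returns an RDD of (path, content) pairs, one pair per file,
// so collecting the keys enumerates the blobs under the given directory.
val files = sc.wholeTextFiles(
  "wasb://mycontainer@mystorageaccount.blob.core.windows.net/data/")

// Print only the paths to list the files.
files.keys.collect().foreach(println)
```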