Writing the interpreter path directly into conf/spark-defaults.conf, e.g.

spark.pyspark.python /home/XXX/.pyenv/versions/anaconda2-4.4.0/bin/python

works fine.
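Spark also honors the PYSPARK_PYTHON environment variable, so the same effect can be had from conf/spark-env.sh; a minimal sketch, reusing the pyenv path above:

```shell
# conf/spark-env.sh -- sourced by Spark's launch scripts.
# Python binary used by PySpark (driver and workers alike, unless
# PYSPARK_DRIVER_PYTHON is set separately).
export PYSPARK_PYTHON=/home/XXX/.pyenv/versions/anaconda2-4.4.0/bin/python
```

Note that if spark.pyspark.python is set in spark-defaults.conf as above, it takes precedence over the environment variable.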
Hidemotos-MacBook:rust nakada$ cargo new myProj --bin
Created binary (application) `myProj` project
$ ls myProj
Cargo.toml src
$ cat myProj/Cargo.toml
[package]
name = "myProj"
version = "0.1.0"
authors = ["nakada"]
[dependencies]
$ cat myProj/src/main.rs
fn main() {
    println!("Hello, world!");
}
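The generated src/main.rs can then be edited freely and run with cargo run. As a small sketch, a variant that greets a name given on the command line (the greet helper is my own illustration, not part of the generated template):

```rust
use std::env;

// Build the greeting string; split out into a function so it is easy to test.
fn greet(name: &str) -> String {
    format!("Hello, {}!", name)
}

fn main() {
    // Use the first command-line argument as the name, defaulting to "world".
    let name = env::args().nth(1).unwrap_or_else(|| "world".to_string());
    println!("{}", greet(&name));
}
```

Running cargo build or cargo run inside the myProj directory compiles and executes this.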
Install the toolchain via rustup:
$ curl https://sh.rustup.rs -sSf | sh
$ rustc --version
rustc 1.18.0 (03fc9d622 2017-06-06)
test.rs:
fn main() {
    println!("Hello, world!");
}
$ rustc test.rs
$ ./test
Hello, world!
$ ls -al test
-rwxr-xr-x 1 XXXXXX staff 400552 Jul 19 16:22 test