Fixing the PySpark error py4j.protocol.Py4JError: An error occurred while calling o84.__getstate__
After using a UDF in PySpark, the following error was thrown:
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\serializers.py", line 587, in dumps
return cloudpickle.dumps(obj, 2)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 863, in dumps
cp.dump(obj)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 260, in dump
return Pickler.dump(self, obj)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 409, in dump
self.save(obj)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 736, in save_tuple
save(element)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 400, in save_function
self.save_function_tuple(obj)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 549, in save_function_tuple
save(state)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 634, in save_reduce
save(state)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 852, in _batch_setitems
save(v)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 496, in save
rv = reduce(self.proto)
File "D:\ProgramData\Anaconda3\lib\site-packages\py4j\java_gateway.py", line 1257, in call
answer, self.gateway_client, self.target_id, self.name)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "D:\ProgramData\Anaconda3\lib\site-packages\py4j\protocol.py", line 332, in get_return_value
format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling o84.getstate. Trace:
py4j.Py4JException: Method getstate([]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
at py4j.Gateway.invoke(Gateway.java:274)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Traceback (most recent call last):
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\serializers.py", line 587, in dumps
return cloudpickle.dumps(obj, 2)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 863, in dumps
cp.dump(obj)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 260, in dump
return Pickler.dump(self, obj)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 409, in dump
self.save(obj)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 736, in save_tuple
save(element)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 400, in save_function
self.save_function_tuple(obj)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\cloudpickle.py", line 549, in save_function_tuple
save(state)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 634, in save_reduce
save(state)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 852, in _batch_setitems
save(v)
File "D:\ProgramData\Anaconda3\lib\pickle.py", line 496, in save
rv = reduce(self.proto)
File "D:\ProgramData\Anaconda3\lib\site-packages\py4j\java_gateway.py", line 1257, in call
answer, self.gateway_client, self.target_id, self.name)
File "D:\ProgramData\Anaconda3\lib\site-packages\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "D:\ProgramData\Anaconda3\lib\site-packages\py4j\protocol.py", line 332, in get_return_value
format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling o84.__getstate__. Trace:
py4j.Py4JException: Method __getstate__([]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
at py4j.Gateway.invoke(Gateway.java:274)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
At first I assumed something was wrong with the pyspark package itself, but after looking more closely, the real cause turned out to be the following:
This error typically occurs when registering a user-defined function (UDF) in PySpark whose body references objects or methods that cannot be serialized. The missing __getstate__ method in the error message means the pickler has hit a py4j proxy for a JVM object, which cannot be converted into a picklable Python value; this usually happens because the UDF references a Spark DataFrame/Column function or some other JVM-backed instance, and such objects are not serializable in the Py4J context.
It turned out the UDF was using the lag function, which is not allowed: a UDF must be picklable so it can be shipped to the executors, while DataFrame/window functions such as lag return JVM-backed Column objects that cannot be pickled.
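For reference, here is a minimal sketch (the SparkSession, DataFrame and column names are made up) of the kind of pattern that triggers this error: the UDF's closure captures a Column built from F.lag, and cloudpickle fails while trying to serialize the py4j JavaObject inside it:

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("udf-lag-broken").getOrCreate()
df = spark.createDataFrame([(1, 10.0), (2, 12.0), (3, 9.0)], ["id", "value"])

w = Window.orderBy("id")
prev_value = F.lag("value", 1).over(w)  # a Column, wrapping a py4j JavaObject

@F.udf(returnType=DoubleType())
def diff_with_previous(value):
    # The closure captures prev_value; when Spark pickles this function
    # with cloudpickle, it ends up calling __getstate__ on the JavaObject
    # inside the Column and fails with the Py4JError shown above.
    return value - prev_value

df.withColumn("diff", diff_with_previous(F.col("value"))).show()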
Since the lag logic was still needed, the fix was to drop the UDF and express it with the built-in window functions instead.
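A sketch of the non-UDF version, assuming the same hypothetical DataFrame: the "previous row" logic is expressed entirely with built-in window functions, so nothing has to be pickled and the error goes away:

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("lag-without-udf").getOrCreate()
df = spark.createDataFrame([(1, 10.0), (2, 12.0), (3, 9.0)], ["id", "value"])

# Compute "value minus previous value" with built-in window functions;
# in real code you would normally also add a partitionBy() to the window.
w = Window.orderBy("id")
result = df.withColumn("diff", F.col("value") - F.lag("value", 1).over(w))
result.show()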