I started doing some text analysis and eventually ran into the need to download corpora, using PyCharm 2019 as my IDE. I'm not sure what the traceback wants me to do, since I already used PyCharm's own package-import interface to enable the corpora. Why do I keep getting an error saying the corpora are not available to the code?
I imported TextBlob and tried a line like: from textblob import TextBlob... see the code below.
from textblob import TextBlob
TextBlob(train['tweet'][1]).words
print("\nPRINT TOKENIZATION") # own instruction to allow for knowing what code result delivers
print(TextBlob(train['tweet'][1]).words)
….
I also tried to install the data via nltk, with no luck: I get an error when downloading 'brown.tei'.
showing info https://raw.githubusercontent.com/nltk/nltk_data/gh-pages/index.xml
Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Users\jcst\AppData\Local\Programs\Python\Python37-32\lib\tkinter__init__.py", line 1705, in call
return self.func(*args)
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\nltk\downloader.py", line 1796, in _download
return self._download_threaded(*e)
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\nltk\downloader.py", line 2082, in _download_threaded
assert self._download_msg_queue == []
AssertionError
Traceback (most recent call last):
  File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\decorators.py", line 35, in decorated
    return func(*args, **kwargs)
  File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\tokenizers.py", line 57, in tokenize
    return nltk.tokenize.sent_tokenize(text)
  File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\nltk\tokenize\__init__.py", line 104, in sent_tokenize
    tokenizer = load('tokenizers/punkt/{0}.pickle'.format(language))
  File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\nltk\data.py", line 870, in load
    opened_resource = _open(resource_url)
  File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\nltk\data.py", line 995, in _open
    return find(path, path + ['']).open()
  File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\nltk\data.py", line 701, in find
    raise LookupError(resource_not_found)
LookupError:
Resource punkt not found.
Please use the NLTK Downloader to obtain the resource:
import nltk
nltk.download('punkt')
For more information see: https://www.nltk.org/data.html
Attempted to load tokenizers/punkt/english.pickle
Searched in:
- 'C:\Users\jcst/nltk_data'
- 'C:\Users\jcst\PycharmProjects\TextMining\venv\nltk_data'
- 'C:\Users\jcst\PycharmProjects\TextMining\venv\share\nltk_data'
- 'C:\Users\jcst\PycharmProjects\TextMining\venv\lib\nltk_data'
- 'C:\Users\jcst\AppData\Roaming\nltk_data'
- 'C:\nltk_data'
- 'D:\nltk_data'
- 'E:\nltk_data'
- ''
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:/Users/jcst/PycharmProjects/TextMining/ModuleImportAndTrainFileIntro.py", line 151, in
TextBlob(train['tweet'][1]).words
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\decorators.py", line 24, in get
value = obj.dict[self.func.name] = self.func(obj)
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\blob.py", line 649, in words
return WordList(word_tokenize(self.raw, include_punc=False))
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\tokenizers.py", line 73, in word_tokenize
for sentence in sent_tokenize(text))
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\base.py", line 64, in itokenize
return (t for t in self.tokenize(text, *args, **kwargs))
File "C:\Users\jcst\PycharmProjects\TextMining\venv\lib\site-packages\textblob\decorators.py", line 38, in decorated
raise MissingCorpusError()
textblob.exceptions.MissingCorpusError:
Looks like you are missing some required data for this feature.
To download the necessary data, simply run
python -m textblob.download_corpora
or use the NLTK downloader to download the missing data: http://nltk.org/data.html
If this doesn't fix the problem, file an issue at https://github.com/sloria/TextBlob/issues.
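For anyone hitting the same wall, here is a minimal sketch of how the missing data could be fetched without the Tkinter downloader GUI that raised the AssertionError above. 'punkt' is the resource named in the LookupError; the other package names are an assumption, based on what python -m textblob.download_corpora is documented to fetch.
# Sketch: download the NLTK data TextBlob needs, run once in the project's interpreter.
import nltk
# 'punkt' comes straight from the LookupError; the rest are assumed TextBlob corpora.
for package in ("punkt", "brown", "wordnet", "averaged_perceptron_tagger"):
    nltk.download(package)
Alternatively, running python -m textblob.download_corpora from the venv's terminal does the same job, as the MissingCorpusError message itself suggests.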
I'm trying to test a function that normalises text, from a tutorial on an AI chatbot that I'm following (https://medium.com/swlh/a-chatbot-in-python-using-nltk-938a37a9eacc), under the section 'Steps involved', but I keep getting KeyError: 'Context' when I copy this line from the tutorial into Spyder.
I've tried researching, going through the tutorial again, and carefully spell-checking my imports to see if I've missed anything, but I still haven't figured out why the key is missing, so I was hoping someone here could please help.
My code
import pandas as pd
import nltk
from nltk import pos_tag # for parts of speech
from nltk import word_tokenize # to create tokens
from nltk.stem import wordnet # to perform lemmatization
from nltk.corpus import stopwords # for stop words
import numpy as np
import re
from sklearn.metrics import pairwise_distances # to perform cosine similarity
from sklearn.feature_extraction.text import TfidfVectorizer # to perform tfidf
from sklearn.feature_extraction.text import CountVectorizer # to perform bow
df=pd.read_excel(r'C:\Users\mecha\Documents\Comp Sci - Year 3\ISYS30221 - Artificial Intel\New Try - AI with revisions\dialog_talk_agent.xlsx') # excel file of predetermined questions and answers
df.ffill(axis = 0, inplace=True) # fills all null values with previous value in dataset (NaN = null values)
df1 = df.head(10)
def step1(x):
    for i in x:
        a = str(i).lower()
        p = re.sub(r'[^a-z0-9]', ' ', a)
        print(p)
Code snippet I run in the console after running the earlier code
step1(df1['Context'])
Error feedback in the console
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\pandas\core\indexes\base.py", line 2646, in get_loc
return self._engine.get_loc(key)
File "pandas\_libs\index.pyx", line 111, in pandas._libs.index.IndexEngine.get_loc
File "pandas\_libs\index.pyx", line 138, in pandas._libs.index.IndexEngine.get_loc
File "pandas\_libs\hashtable_class_helper.pxi", line 1619, in pandas._libs.hashtable.PyObjectHashTable.get_item
File "pandas\_libs\hashtable_class_helper.pxi", line 1627, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'Context'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<ipython-input-8-6335e79211e5>", line 1, in <module>
step1(df1['Context'])
File "C:\ProgramData\Anaconda3\lib\site-packages\pandas\core\frame.py", line 2800, in __getitem__
indexer = self.columns.get_loc(key)
File "C:\ProgramData\Anaconda3\lib\site-packages\pandas\core\indexes\base.py", line 2648, in get_loc
return self._engine.get_loc(self._maybe_cast_indexer(key))
File "pandas\_libs\index.pyx", line 111, in pandas._libs.index.IndexEngine.get_loc
File "pandas\_libs\index.pyx", line 138, in pandas._libs.index.IndexEngine.get_loc
File "pandas\_libs\hashtable_class_helper.pxi", line 1619, in pandas._libs.hashtable.PyObjectHashTable.get_item
File "pandas\_libs\hashtable_class_helper.pxi", line 1627, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'Context'
I've researched on KnowledgeHut and I understand that the KeyError means my program can't find the 'Context' key, but I've been following a fairly recent tutorial closely, so I can't tell why I'm getting the error. Or maybe I'm missing some library?
I was hoping someone here could help me out with this while I learn some basics before starting my AI chatbot project for school.
If you look at the top of the tutorial page, the Excel file there has the column names 'Context' and 'Text Response', so you are probably missing those in your file or have spelled them differently; that's the only way this would fail.
step1(df1) on its own should work fine.
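As a quick diagnostic (a sketch, assuming the same Excel file as in the question), printing the actual column names shows whether 'Context' exists in your copy of the spreadsheet:
import pandas as pd
# Path taken from the question; point it at your own copy of the file.
df = pd.read_excel(r'C:\Users\mecha\Documents\Comp Sci - Year 3\ISYS30221 - Artificial Intel\New Try - AI with revisions\dialog_talk_agent.xlsx')
print(df.columns.tolist())  # if 'Context' is not in this list, df1['Context'] raises the KeyError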
I'm new to Python; that is the first problem.
Secondly, I am trying to automate the task of adding vector point layers from spreadsheets (xlsx files) with Python.
The task can be done manually with the plugin "add spreadsheet layer".
I have a folder with roughly 20 xlsx files that need to be added to the QGIS project as vector point layers.
I have tried the following code snippet to check whether the core task of adding a spreadsheet layer actually works.
The computer runs Windows 7. The Python in question is the one bundled with QGIS 3.4.
The plugin I want to control through Python is called "add spreadsheet layer".
from qgis.core import *
import processing
processing.run("qgis:createpointslayerfromtable",
{'INPUT':r'C:\Users\Desktop\PlayItAll\Test.xlsx',
'XFIELD':'X_Pos',
'YFIELD':'Y_Pos',
'ZFIELD':None,
'MFIELD':None,
'TARGET_CRS':QgsCoordinateReferenceSystem('EPSG:4326'),
'OUTPUT':r'memory'})
It produces this error:
File "C:/PROGRA1/QGIS31.4/apps/qgis/./python/plugins\processing\core\Processing.py", line 183, in runAlgorithm
raise QgsProcessingException(msg)
I have contacted the programmer of the plugin and he gave me this code to try:
import processing
processing.runAndLoadResults("qgis:createpointslayerfromtable",
{
'INPUT':r'C:\Users\username\Desktop\Delete\test.xlsx',
'XFIELD':'Longitude',
'YFIELD':'Latitude',
'ZFIELD':None,
'MFIELD':None,
'TARGET_CRS':QgsCoordinateReferenceSystem('EPSG:4326'),
'OUTPUT':'memory'
})
For him it worked, for me it didn't.
I got this on the processing tab:
2019-07-03T13:19:43 CRITICAL Traceback (most recent call last):
File "C:/PROGRA~1/QGIS3~1.4/apps/qgis/./python/plugins\processing\algs\qgis\PointsLayerFromTable.py", line 112, in processAlgorithm
fields, wkb_type, target_crs)
Exception: unknown
2019-07-03T13:19:43 CRITICAL Traceback (most recent call last):
File "C:/PROGRA~1/QGIS3~1.4/apps/qgis/./python/plugins\processing\algs\qgis\PointsLayerFromTable.py", line 112, in processAlgorithm
fields, wkb_type, target_crs)
Exception: unknown
2019-07-03T13:19:43 CRITICAL There were errors executing the algorithm.
The "python warnings" tab showed this:
2019-07-03T13:19:43 WARNING warning:__console__:1: ResourceWarning:
unclosed file
traceback: File "C:/PROGRA~1/QGIS3~1.4/apps/qgis/./python\console\console.py", line 575, in runScriptEditor
self.tabEditorWidget.currentWidget().newEditor.runScriptCode()
File "C:/PROGRA~1/QGIS3~1.4/apps/qgis/./python\console\console_editor.py", line 629, in runScriptCode
.format(filename.replace("\\", "/"), sys.getfilesystemencoding()))
File "C:/PROGRA~1/QGIS3~1.4/apps/qgis/./python\console\console_sci.py", line 635, in runCommand
more = self.runsource(src)
File "C:/PROGRA~1/QGIS3~1.4/apps/qgis/./python\console\console_sci.py", line 665, in runsource
return super(ShellScintilla, self).runsource(source, filename, symbol)
File "C:\PROGRA~1\QGIS3~1.4\apps\Python37\lib\code.py", line 74, in runsource
self.runcode(code)
File "C:\PROGRA~1\QGIS3~1.4\apps\Python37\lib\code.py", line 90, in runcode
exec(code, self.locals)
File "", line 1, in
I'm new to Django and using the Anaconda cloud environment. It had been working well for over three months, but as of 7-24-2018 it simply stopped launching the Navigator, whether from the launcher or through cmd. I can't manage environments or install packages. Using the Anaconda Prompt also gives me the same error on launch. It started when I wanted to install django-Oscar: not having its dependencies, I had to install the packages manually, which in turn needed cytoolz, which needs the Microsoft Visual C++ build tools, which I managed to install as well, but the error persists. Please help me!
Copy/Paste of CMD Traceback:
C:\Users\kaukau\Desktop>conda --version
conda 4.5.4

C:\Users\kaukau\Desktop>anaconda-navigator
Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\qtpy\__init__.py", line 169, in <module>
    from PySide import __version__ as PYSIDE_VERSION  # analysis:ignore
ModuleNotFoundError: No module named 'PySide'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\Scripts\anaconda-navigator-script.py", line 6, in <module>
    from anaconda_navigator.app.main import main
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\anaconda_navigator\app\main.py", line 22, in <module>
    from anaconda_navigator.utils.conda import is_conda_available
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\anaconda_navigator\utils\__init__.py", line 15, in <module>
    from qtpy.QtGui import QIcon
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\qtpy\__init__.py", line 175, in <module>
    raise PythonQtError('No Qt bindings could be found')
qtpy.PythonQtError: No Qt bindings could be found

C:\Users\kaukau\Desktop>conda install qt --force
Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 42, in <module>
    from cytoolz.dicttoolz import merge
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\cytoolz\__init__.py", line 1, in <module>
    from .itertoolz import *
ModuleNotFoundError: No module named 'cytoolz.itertoolz'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 819, in __call__
    return func(*args, **kwargs)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\cli\main.py", line 73, in _main
    from ..base.context import context
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\base\context.py", line 23, in <module>
    from ..common.configuration import (Configuration, LoadError, MapParameter, PrimitiveParameter,
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 47, in <module>
    from .._vendor.toolz.functoolz import excepts
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\_vendor\toolz\functoolz.py", line 501
    f.__name__ for f in reversed((self.first,) + self.funcs),
    ^
SyntaxError: Generator expression must be parenthesized

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 42, in <module>
    from cytoolz.dicttoolz import merge
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\cytoolz\__init__.py", line 1, in <module>
    from .itertoolz import *
ModuleNotFoundError: No module named 'cytoolz.itertoolz'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\Scripts\conda-script.py", line 10, in <module>
    sys.exit(main())
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\cli\main.py", line 113, in main
    return conda_exception_handler(_main, *args)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 1112, in conda_exception_handler
    return_value = exception_handler(func, *args, **kwargs)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 822, in __call__
    return self.handle_exception(exc_val, exc_tb)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 864, in handle_exception
    return self.handle_unexpected_exception(exc_val, exc_tb)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 876, in handle_unexpected_exception
    self.print_unexpected_error_report(error_report)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 932, in print_unexpected_error_report
    from .base.context import context
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\base\context.py", line 23, in <module>
    from ..common.configuration import (Configuration, LoadError, MapParameter, PrimitiveParameter,
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 47, in <module>
    from .._vendor.toolz.functoolz import excepts
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\_vendor\toolz\functoolz.py", line 501
    f.__name__ for f in reversed((self.first,) + self.funcs),
    ^
SyntaxError: Generator expression must be parenthesized

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 42, in <module>
    from cytoolz.dicttoolz import merge
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\cytoolz\__init__.py", line 1, in <module>
    from .itertoolz import *
ModuleNotFoundError: No module named 'cytoolz.itertoolz'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\cli\main.py", line 97, in main
    from ..activate import main as activator_main
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\activate.py", line 11, in <module>
    from .base.context import ROOT_ENV_NAME, context, locate_prefix_by_name
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\base\context.py", line 23, in <module>
    from ..common.configuration import (Configuration, LoadError, MapParameter, PrimitiveParameter,
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 47, in <module>
    from .._vendor.toolz.functoolz import excepts
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\_vendor\toolz\functoolz.py", line 501
    f.__name__ for f in reversed((self.first,) + self.funcs),
    ^
SyntaxError: Generator expression must be parenthesized

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 42, in <module>
    from cytoolz.dicttoolz import merge
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\cytoolz\__init__.py", line 1, in <module>
    from .itertoolz import *
ModuleNotFoundError: No module named 'cytoolz.itertoolz'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\kaukau\Anaconda3\Scripts\conda-script.py", line 10, in <module>
    sys.exit(main())
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\cli\main.py", line 110, in main
    return ExceptionHandler().handle_exception(exc_val, exc_tb)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 864, in handle_exception
    return self.handle_unexpected_exception(exc_val, exc_tb)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 876, in handle_unexpected_exception
    self.print_unexpected_error_report(error_report)
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\exceptions.py", line 932, in print_unexpected_error_report
    from .base.context import context
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\base\context.py", line 23, in <module>
    from ..common.configuration import (Configuration, LoadError, MapParameter, PrimitiveParameter,
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\common\configuration.py", line 47, in <module>
    from .._vendor.toolz.functoolz import excepts
  File "C:\Users\kaukau\Anaconda3\lib\site-packages\conda\_vendor\toolz\functoolz.py", line 501
    f.__name__ for f in reversed((self.first,) + self.funcs),
    ^
SyntaxError: Generator expression must be parenthesized
I've just found out that I can't use any conda commands at all except conda --version.
Please help me resolve this issue without completely uninstalling and reinstalling Anaconda.
Sorry to waste your time; I solved it by reinstalling Anaconda completely, and every project started working again. Thanks to those who read this, the effort is appreciated.
This usually happens when you download and install Anaconda directly from its website.
Updating Anaconda solved this issue for me on Manjaro and should work on Windows as well.
Use the conda update anaconda-navigator command to update Anaconda.
If it doesn't start, restart your PC and try to open it again.
I am struggling to set up pymunk on Ubuntu 16.04. I am using virtualenv, and I have Python 3.5.2, pymunk 5.3.0 and cffi 1.11.0 installed.
I tried a very simple piece of code first: I created an empty Space and called step on it, and everything worked smoothly. However, when I try to visualise it and create a DrawOptions instance, I get strange errors that I can't decipher. I also tried matplotlib_util and pygame_util, but both failed to create DrawOptions.
This is the code snippet I used:
import pymunk
import pyglet
import pymunk.pyglet_util
s = pymunk.Space()
options = pymunk.pyglet_util.DrawOptions()
s.debug_draw(options)
# s.step(0.02)
This is the output I get:
Loading chipmunk for Linux (64bit) [/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pymunk/libchipmunk.so]
Traceback (most recent call last):
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/api.py", line 167, in _typeof
result = self._parsed_types[cdecl]
KeyError: 'typedef void (*cpSpaceDebugDrawCircleImpl)(cpVect pos, cpFloat angle, cpFloat radius, cpSpaceDebugColor outlineColor, cpSpaceDebugColor fillColor, cpDataPointer data)'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/cparser.py", line 276, in _parse
ast = _get_parser().parse(fullcsource)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pycparser/c_parser.py", line 152, in parse
debug=debuglevel)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pycparser/ply/yacc.py", line 331, in parse
return self.parseopt_notrack(input, lexer, debug, tracking, tokenfunc)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pycparser/ply/yacc.py", line 1199, in parseopt_notrack
tok = call_errorfunc(self.errorfunc, errtoken, self)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pycparser/ply/yacc.py", line 193, in call_errorfunc
r = errorfunc(token)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pycparser/c_parser.py", line 1761, in p_error
column=self.clex.find_tok_column(p)))
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pycparser/plyparser.py", line 66, in _parse_error
raise ParseError("%s: %s" % (coord, msg))
pycparser.plyparser.ParseError: <cdef source string>:2:16: before: cpSpaceDebugDrawCircleImpl
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "pmtest2.py", line 5, in <module>
options = pymunk.pyglet_util.DrawOptions()
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pymunk/pyglet_util.py", line 89, in __init__
super(DrawOptions, self).__init__()
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/pymunk/space_debug_draw_options.py", line 51, in __init__
#ffi.callback("typedef void (*cpSpaceDebugDrawCircleImpl)"
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/api.py", line 375, in callback
cdecl = self._typeof(cdecl, consider_function_as_funcptr=True)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/api.py", line 170, in _typeof
result = self._typeof_locked(cdecl)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/api.py", line 155, in _typeof_locked
type = self._parser.parse_type(cdecl)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/cparser.py", line 476, in parse_type
return self.parse_type_and_quals(cdecl)[0]
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/cparser.py", line 479, in parse_type_and_quals
ast, macros = self._parse('void __dummy(\n%s\n);' % cdecl)[:2]
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/cparser.py", line 278, in _parse
self.convert_pycparser_error(e, csource)
File "/home/wm/.virtualenvs/cv/lib/python3.5/site-packages/cffi-1.11.0-py3.5-linux-x86_64.egg/cffi/cparser.py", line 307, in convert_pycparser_error
raise CDefError(msg)
cffi.error.CDefError: cannot parse "typedef void (*cpSpaceDebugDrawCircleImpl)(cpVect pos, cpFloat angle, cpFloat radius, cpSpaceDebugColor outlineColor, cpSpaceDebugColor fillColor, cpDataPointer data)"
<cdef source string>:2:16: before: cpSpaceDebugDrawCircleImpl
What do you think is causing this? Is it the Python version I use, or maybe a faulty cffi compilation?
This error happens because a new version of pycparser (which is used by cffi) was released, and that version breaks pymunk 5.3.0 and earlier. Yesterday I made a new release of Pymunk, 5.3.1, with a workaround for the problem. If you update your Pymunk version to 5.3.1 it should work.
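A quick way to confirm which Pymunk build the virtualenv is actually picking up after the upgrade (a minimal check; pymunk.version is the version string exposed by the 5.x releases):
import pymunk
print(pymunk.version)  # should report 5.3.1 or later once the upgrade has taken effect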
from tweepy import Stream
from tweepy import OAuthHandler
from tweepy.streaming import StreamListener
ckey=''
csecret=''
atoken=''
asecret=''
class listener(StreamListener):

    def on_data(self, data):
        print(data)
        return True

    def on_error(self, status):
        print(status)
auth = OAuthHandler(ckey,csecret)
auth.set_access_token(atoken, asecret)
twitterStream = Stream(auth, listener())
twitterStream.filter(track="cricket")
This code filters the Twitter stream based on the track keyword, but I am getting the following traceback after running it. Can somebody please help?
Traceback (most recent call last):
File "lab.py", line 23, in <module>
twitterStream.filter(track="car".strip())
File "C:\Python34\lib\site-packages\tweepy\streaming.py", line 430, in filter
self._start(async)
File "C:\Python34\lib\site-packages\tweepy\streaming.py", line 346, in _start
self._run()
File "C:\Python34\lib\site-packages\tweepy\streaming.py", line 286, in _run
raise exception
File "C:\Python34\lib\site-packages\tweepy\streaming.py", line 255, in _run
self._read_loop(resp)
File "C:\Python34\lib\site-packages\tweepy\streaming.py", line 298, in _read_loop
line = buf.read_line().strip()
File "C:\Python34\lib\site-packages\tweepy\streaming.py", line 171, in read_line
self._buffer += self._stream.read(self._chunk_size)
TypeError: Can't convert 'bytes' object to str implicitly
I'm assuming you're using tweepy 3.4.0. The issue you've raised is open on GitHub (https://github.com/tweepy/tweepy/issues/615).
Two work-arounds:
1)
In streaming.py:
I changed line 161 to
self._buffer += self._stream.read(read_len).decode('UTF-8', 'ignore')
and line 171 to
self._buffer += self._stream.read(self._chunk_size).decode('UTF-8', 'ignore')
and then reinstalled via python3 setup.py install on my local copy of tweepy.
2)
Remove the tweepy 3.4.0 module and install 3.3.0 with: pip install -I tweepy==3.3.0
Hope that helps,
-A
You can't do twitterStream.filter(track="car".strip()). Why are you adding the strip()? It serves no purpose there.
track must be a str type before you invoke a connection to Twitter's Streaming API, and tweepy is preventing that connection because you're trying to add strip().
If for some reason you need it, you can do track_word = 'car'.strip() and then track=track_word, but even that is unnecessary because:
>>> print('car'.strip())
car
Also, the error you're getting does not match the code you have listed; the code that's in your question should work fine.
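As an aside, the tweepy documentation examples pass the track terms as a list rather than a bare string, so a hedged rewrite of the questioner's call would look like the lines below (the decode work-around or the downgrade to 3.3.0 above is still needed for the bytes/str error itself):
# Sketch: pass the track terms as a list, as in tweepy's own streaming examples.
twitterStream = Stream(auth, listener())
twitterStream.filter(track=["cricket"])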