MIDIutil is not creating/writing files - python-3.x

I'm trying to use midiutil to create a program that converts text into a MIDI file. So far everything else has worked, but midiutil itself does not appear to be functioning correctly: the code runs to completion without errors, yet no file is written. I've tried copying and pasting the default example from the documentation, and the same thing happens. I have also tried creating the file manually first, in the hope that the library might write to the existing file, but that doesn't work either. For reference, here is the example code from the documentation that is NOT working when I run it:
from midiutil import MIDIFile

degrees = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers
track = 0
channel = 0
time = 0      # In beats
duration = 1  # In beats
tempo = 60    # In BPM
volume = 100  # 0-127, as per the MIDI standard

MyMIDI = MIDIFile(1)  # One track, defaults to format 1 (tempo track
                      # automatically created)
MyMIDI.addTempo(track, time, tempo)

for pitch in degrees:
    MyMIDI.addNote(track, channel, pitch, time, duration, volume)
    time = time + 1

with open("major-scale.mid", "wb") as output_file:
    MyMIDI.writeFile(output_file)
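
One quick diagnostic worth adding (an assumption on my part, not part of the documentation example): the file may simply be landing in a different working directory than the one being checked. Printing the absolute path rules that out:

import os

# Hypothetical check: where would "major-scale.mid" actually land?
print(os.getcwd())
print(os.path.abspath("major-scale.mid"))
print(os.path.exists("major-scale.mid"))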

Related

Vertex AI scheduled notebooks doesn't recognize existence of folders

I have a managed Jupyter notebook in Vertex AI that I want to schedule. The notebook works just fine as long as I start it manually, but as soon as it is scheduled, it fails. There are in fact many things that go wrong when it is scheduled; some of them are fixable. Before explaining my trouble, let me first give some details of the context.
The notebook gathers information from an API for several stores and saves the data in different folders before processing it, saving CSV files to store-specific folders and to BigQuery. So, in the location of the notebook, I have:
The notebook
Functions needed for the handling of data (as *.py files)
A series of folders, some of which have subfolders which also have subfolders
When I execute this manually, no problem: everything works well and all files end up exactly where they should, as well as in the different BigQuery tables.
However, when the execution of the notebook is scheduled, everything goes wrong. First, the *.py files cannot be read (as imports). No problem: I added the functions directly to the notebook.
Now, the following error is where I am at a loss, because I have no idea why it occurs or how to fix it. The code that leads to the error is the following:
import datetime
import json
import os
import pathlib
from datetime import timedelta

import numpy as np
import pandas as pd
import requests
from requests.auth import HTTPBasicAuth

internal = "https://api.************************"
df_descriptions = []
storess = internal
response_stores = requests.get(storess, auth=HTTPBasicAuth(userInternal, keyInternal))
pathlib.Path("stores/request_1.json").write_bytes(response_stores.content)

filepath = "stores"
files = os.listdir(filepath)
for file in files:
    with open(filepath + "/" + file) as json_string:
        jsonstr = json.load(json_string)
        information = pd.json_normalize(jsonstr)
        df_descriptions.append(information)

StoreINFO = pd.concat(df_descriptions)
StoreINFO = StoreINFO.dropna()
StoreINFO = StoreINFO[StoreINFO['storeIdMappings'].map(lambda d: len(d)) > 0]

cloud_store_ids = list(set(StoreINFO.cloudStoreId))
LastWeek = datetime.date.today() - timedelta(days=2)
LastWeek = np.datetime64(LastWeek)
and the error reported is:
FileNotFoundError Traceback (most recent call last)
/tmp/ipykernel_165/2970402631.py in <module>
5 storess = internal
6 response_stores = requests.get(storess,auth = HTTPBasicAuth(userInternal, keyInternal))
----> 7 pathlib.Path("stores/request_1.json").write_bytes(response_stores.content)
8
9 filepath = "stores"
/opt/conda/lib/python3.7/pathlib.py in write_bytes(self, data)
1228 # type-check for the buffer interface before truncating the file
1229 view = memoryview(data)
-> 1230 with self.open(mode='wb') as f:
1231 return f.write(view)
1232
/opt/conda/lib/python3.7/pathlib.py in open(self, mode, buffering, encoding, errors, newline)
1206 self._raise_closed()
1207 return io.open(self, mode, buffering, encoding, errors, newline,
-> 1208 opener=self._opener)
1209
1210 def read_bytes(self):
/opt/conda/lib/python3.7/pathlib.py in _opener(self, name, flags, mode)
1061 def _opener(self, name, flags, mode=0o666):
1062 # A stub for the opener argument to built-in open()
-> 1063 return self._accessor.open(self, flags, mode)
1064
1065 def _raw_open(self, flags, mode=0o777):
FileNotFoundError: [Errno 2] No such file or directory: 'stores/request_1.json'
I would gladly do this another way, for instance by using GCS buckets, but my issue is the existence of sub-folders. There are many stores, and I do not wish to do this operation manually, because some retailers I am doing this for have over 1000 stores. My Python code generates all these folders, and as I understand it, this is not feasible in GCS.
How can I solve this issue?
GCS uses a flat namespace, so folders don't actually exist, but they can be simulated as described in this documentation. For your requirement, you can either use an absolute path (starting with "/", not a relative one) or create the "stores" directory first (with "mkdir"). For more information you can check this blog.
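
For example, a minimal sketch of the second option (reusing the response_stores object from the question):

import pathlib

# Create the "stores" directory (and any missing parents) before writing;
# exist_ok=True makes this safe to re-run.
out_dir = pathlib.Path("stores")
out_dir.mkdir(parents=True, exist_ok=True)
(out_dir / "request_1.json").write_bytes(response_stores.content)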

Django file object always 0 bytes when uploaded from Python requests

I have been trying to upload a file to Django REST using Python requests.
I PUT the file, and some other data, to the server:
r = self.session.put(
    f"{hello_url}/shadow_pbem/savefile_api/",
    files=test_files,
    data={"hash": test_file_hash, "leader": 78},
    headers=good_token_header,
)
I get a 200 response, and the model saves all the data correctly as expected, including a correctly named save file in /media, except that this file is always 0 bytes.
This is how I create the file object...
with open(testfile_path, "rb") as testfile:
...and verify the length, which is not 0.
testfile.seek(0, os.SEEK_END)
filesize = testfile.tell()
I create the files object for upload...
test_files = {
    "file": ("testfile.zip", testfile, "application/zip")
}
I put some code in the view to verify, and the file object does reach the view, but it is 0 bytes.
Here is the relevant part of the view. It seems to work fine, but all files are 0 bytes.
class SaveFileUploadView(APIView):
    parser_class = (FileUploadParser,)

    def put(self, request):
        if "file" not in request.data:
            raise ParseError("Empty content")
        f = request.data["file"]
        print(f"file {f} size:{f.size}")
        # prints file testfile.zip size:0
        # rest of view works fine...
I have tried with various files and formats, and also using POST; the files are always 0 bytes.
Any help appreciated, I am going crazy....
If you do

testfile.seek(0, os.SEEK_END)
filesize = testfile.tell()

as you say, you'll also need to rewind back to the start; otherwise there are indeed zero bytes left for Requests to read:
testfile.seek(0)
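
Putting it together, a minimal sketch (session, upload_url and testfile_path stand in for the asker's actual names):

import os

with open(testfile_path, "rb") as testfile:
    # Measure the size by seeking to the end of the file...
    testfile.seek(0, os.SEEK_END)
    filesize = testfile.tell()
    # ...then rewind to the start so Requests has bytes to read.
    testfile.seek(0)
    test_files = {"file": ("testfile.zip", testfile, "application/zip")}
    r = session.put(upload_url, files=test_files)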

How to modify TIF file's EXIF data

I am trying to modify existing metadata in Python 3. More specifically, I have GPS coordinates and altitude in my metadata, and I need to modify them.
I'm using the piexif module, and I encounter two problems.
First, I managed to change the altitude using
exif_dict['GPS'][piexif.GPSIFD.GPSAltitude] = (140, 1)
and it works.
But I can't work out how to change the latitude and longitude, as they each consist of three fields, like ((53, 1), (291191, 10000), (0, 1)).
The second problem occurs when I try to save the TIFF file with modified metadata. If I save it as TIFF:
img.save(fname_2, 'tiff', exif=exif_bytes)
the fname_2 file is created, but its metadata isn't changed. If I save as JPEG:
img.save(fname_2, 'jpeg', exif=exif_bytes)
the metadata changes, but the file is compressed from 289 MB to 15 MB, which makes it unusable for my purposes.
Has anyone managed to do this? It sounds like it would be very simple, but I can't seem to work it out.
import piexif
from PIL import Image

Image.MAX_IMAGE_PIXELS = 1000000000

fname_1 = r'D:\EZG\Codding\photo\iiq/eee.tif'
fname_2 = r'D:\EZG\Codding\photo\iiq/eee_change.tif'

img = Image.open(fname_1)
exif_dict = piexif.load(fname_1)
latitude = exif_dict['GPS'][piexif.GPSIFD.GPSLatitude]
longitude = exif_dict['GPS'][piexif.GPSIFD.GPSLongitude]
altitude = exif_dict['GPS'][piexif.GPSIFD.GPSAltitude]
print(latitude)
print(longitude)
print(altitude)

exif_dict['GPS'][piexif.GPSIFD.GPSAltitude] = (140, 1)
exif_bytes = piexif.dump(exif_dict)
img.save(fname_2, 'tiff', exif=exif_bytes)
Again, the fname_2 file is created, but its metadata isn't changed.
Based on other questions and answers on SO, it seems that the values are encoded as (degrees, minutes, seconds) fractions:
((53, 1), (291191, 10000), (0, 1))
is 53 degrees, 291191/10000 = 29.1191 minutes, 0 seconds. The hemisphere (N or S) is not part of this tuple; it is stored separately, in the GPSLatitudeRef tag.
You may also want to check this answer, as there is a better package to edit GPS coordinates in photo metadata.
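
For the first problem, a minimal sketch of writing a new latitude with piexif (the coordinate values are simply reused from the question; substitute your own):

import piexif

# Latitude is a triple of (numerator, denominator) rationals giving
# degrees, minutes and seconds; the hemisphere goes in GPSLatitudeRef.
exif_dict['GPS'][piexif.GPSIFD.GPSLatitudeRef] = b'N'
exif_dict['GPS'][piexif.GPSIFD.GPSLatitude] = ((53, 1), (291191, 10000), (0, 1))
exif_bytes = piexif.dump(exif_dict)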

Timing issue in Windows rather than Linux

I have the following function from a colleague who previously worked for the company, and the comments are self-explanatory. The problem is that I'm now using Windows, and there are issues with the synchronization with the device.
Does anyone know of a solution on Windows for syncing with a device?
def sync_time(self):
    """Sync time on SmartScan."""
    # On a SmartScan, time can be set only with a precision of seconds,
    # so we need to wait for the next full second until we can send
    # the packet on its way to the scanner.
    # It's not perfect, but the error should be more or less constant.
    message = Maint()
    message.state = message.OP_NO_CHANGE
    now = datetime.datetime.utcnow()
    epoch = datetime.datetime(1970, 1, 1)
    # int and datetime objects
    seconds = int((now - epoch).total_seconds()) + 1  # + sync second
    utctime = datetime.datetime.utcfromtimestamp(seconds)
    # Wait until the next full second.
    # Works only on Linux with good accuracy;
    # Windows needs another approach.
    time.sleep((utctime - datetime.datetime.utcnow()).total_seconds())
    command = MaintRfc()
    command.command = command.SET_CLOCK
    command.data = (seconds,)
    message.add_message(command)
    self._handler.sendto(message)
    LOG.debug("Time set to: %d = %s", seconds, utctime)
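
On Windows, the default timer resolution is about 15.6 ms, so a bare time.sleep cannot hit a second boundary accurately. One common workaround (a sketch of mine, not the original colleague's code) is to sleep through most of the interval and busy-wait the final stretch:

import time

def sleep_until(deadline, spin=0.02):
    """Sleep until time.monotonic() reaches deadline, busy-waiting the
    last `spin` seconds to sidestep Windows' coarse sleep granularity."""
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            return
        if remaining > spin:
            time.sleep(remaining - spin)
        # else: keep looping (busy-wait) until the deadline passes

# In sync_time, the plain time.sleep call could then become:
# sleep_until(time.monotonic() + (utctime - datetime.datetime.utcnow()).total_seconds())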

Lua string from file

I'm trying to make a system which backs up and restores points for a game server, so it can safely restart without losing anything.
I have made a script to do just this; the actual backing-up part works fine, but the restore part does not.
This is the script that runs if 'Backup(read)' is used (Backup(write) works perfectly, as it is designed to do):
if (source and read) then
    System.LogAlways("[System] Restoring serverdata from file 'backup.CHK'");
    for line in source:lines() do
        Backup = {};
        Backup.Date = (Date or line:match("File Last Modified: (.-)"));
        Backup.Time = (Time or line:match("time: (.-)"));
        US = tonumber((US or line:match("us: (.-)")));
        NK = tonumber((NK or line:match("nk: (.-)")));
        local params = {class = "Player";
            position = {x = 1, y = 1, z = -1000};
            Respawn = { bRespawn = 0; nTimer = 0; bUnique = 1; };
            bUsable = 0;
            orientation = {0, 90, 135};
            name = "BackupEntity"; };
        local ent = System.SpawnEntity(params);
        g_gameRules.game:SetTeam(1, ent.id);
        g_gameRules.game:SetSynchedEntityValue(playerId, 100, (NK/3));
        g_gameRules.game:SetTeam(2, ent.id);
        g_gameRules.game:SetSynchedEntityValue(playerId, 100, (US/3));
        System.RemoveEntity(params);
    end
    source:close();
    return;
end
I'm not sure what I'm doing wrong, and most sites that I have looked at don't help much. The problem is that it's not reading any values from the file.
Any help will be appreciated :).
Edit:
The reason that we have to divide the score by 3 is that the server multiplies all scores by 3. If we did not divide by 3, the score would be 3 times larger after each restore.
Example contents of the backup.CHK file:
The server is dependent on this file, and writes to it every hour. Please do not edit.
File Last Modified: 11/07/2013
This file was generated by the servers' autobackup system.
--------------------------
time: 22:51
us: 453445
nk: 454567
A couple of ideas about what might be causing the problem:
Use of (.-) lazy matching, which matches the shortest pattern possible; this can include an empty string. Usually, you want to make the pattern as specific as possible while still matching all the required inputs. E.g. (%d+) looks like an appropriate fit for us and nk.
The for line in source:lines() do loop reads one line at a time, which necessarily means not all the variables are going to be set on any one iteration. Yet everything from local params down uses those variables as if they were. It seems to me that section of code shouldn't even be in the loop.
Lastly, have you considered saving the backup file as just another Lua file? Doing so means you can let Lua do the heavy lifting for you, and you won't have to bother parsing the file yourself. That also minimizes the risk of error.