I followed the MMIO Peripherals page from the Chipyard documentation to learn about adding modules to rocket-chip within the Chipyard framework, and all that seems to have worked pretty well. I summed up my experiences and wrote them up at a slower pace on the pages of the Chisel Learning Journey (mentioning that only in case whoever answers wants to take a look and confirm I've got everything working correctly). In other words, I added the MMIO peripheral within the example package of Chipyard, and it compiles, generates a simulator, responds properly to a toy benchmark I devised, and I even see the corresponding waveforms in GTKWave.
Now, the next step I would like to take is to separate this dummy design (it literally just reads from a memory-mapped register that holds a hardcoded value) from the chipyard/rocket-chip infrastructure, in the sense that it is housed in a separate repo that will become a submodule of my Chipyard. To do that, I started from this page and took all the steps as given there:
a new repo was created, called my-chip
into my-chip I added a build.sbt with the following content:
organization := "My Chip"
version := "1.0"
name := "my-chip"
scalaVersion := "2.12.10"
into my-chip I added src/main/scala/JustRead.scala, the one built through the mentioned pages of the Learning Journey
within JustRead.scala I only replaced the package line, so that it now reads:
package my-chip
then, I added the my-chip repo as a submodule at the path chipyard/generators/my-chip
finally I added appropriate lines to chipyard/generators/chipyard/src/main/scala/DigitalTop.scala as follows:
package chipyard
import chisel3._
import freechips.rocketchip.subsystem._
import freechips.rocketchip.system._
import freechips.rocketchip.config.Parameters
import freechips.rocketchip.devices.tilelink._
// ------------------------------------
// BOOM and/or Rocket Top Level Systems
// ------------------------------------
// DOC include start: DigitalTop
class DigitalTop(implicit p: Parameters) extends System
with testchipip.CanHaveTraceIO // Enables optionally adding trace IO
with testchipip.CanHaveBackingScratchpad // Enables optionally adding a backing scratchpad
with testchipip.CanHavePeripheryBlockDevice // Enables optionally adding the block device
with testchipip.CanHavePeripherySerial // Enables optionally adding the TSI serial-adapter and port
with sifive.blocks.devices.uart.HasPeripheryUART // Enables optionally adding the sifive UART
with sifive.blocks.devices.gpio.HasPeripheryGPIO // Enables optionally adding the sifive GPIOs
with sifive.blocks.devices.spi.HasPeripherySPIFlash // Enables optionally adding the sifive SPI flash controller
with icenet.CanHavePeripheryIceNIC // Enables optionally adding the IceNIC for FireSim
with chipyard.example.CanHavePeripheryInitZero // Enables optionally adding the initzero example widget
with chipyard.example.CanHavePeripheryGCD // Enables optionally adding the GCD example widget
with my-chip.CanHavePeripheryJustRead // <=== ADDED THIS LINE
with chipyard.example.CanHavePeripheryStreamingFIR // Enables optionally adding the DSPTools FIR example widget
with chipyard.example.CanHavePeripheryStreamingPassthrough // Enables optionally adding the DSPTools streaming-passthrough example widget
with nvidia.blocks.dla.CanHavePeripheryNVDLA // Enables optionally having an NVDLA
{
override lazy val module = new DigitalTopModule(this)
}
class DigitalTopModule[+L <: DigitalTop](l: L) extends SystemModule(l)
with testchipip.CanHaveTraceIOModuleImp
with testchipip.CanHavePeripheryBlockDeviceModuleImp
with testchipip.CanHavePeripherySerialModuleImp
with sifive.blocks.devices.uart.HasPeripheryUARTModuleImp
with sifive.blocks.devices.gpio.HasPeripheryGPIOModuleImp
with sifive.blocks.devices.spi.HasPeripherySPIFlashModuleImp
with icenet.CanHavePeripheryIceNICModuleImp
with chipyard.example.CanHavePeripheryGCDModuleImp
with my-chip.CanHavePeripheryJustReadTopModuleImp // <=== AND THIS LINE
with freechips.rocketchip.util.DontTouch
// DOC include end: DigitalTop
Then, I created a config in chipyard/generators/chipyard/src/main/scala/configs/RocketConfig.scala as follows:
class JustReadTLRocketConfig extends Config(
new chipyard.iobinders.WithUARTAdapter ++
new chipyard.iobinders.WithTieOffInterrupts ++
new chipyard.iobinders.WithBlackBoxSimMem ++
new chipyard.iobinders.WithTiedOffDebug ++
new chipyard.iobinders.WithSimSerial ++
new testchipip.WithTSI ++
new chipyard.config.WithUART ++
new chipyard.config.WithBootROM ++
new chipyard.config.WithL2TLBs(1024) ++
new my-chip.WithJustRead ++ // <=== THIS LINE ADDED
new freechips.rocketchip.subsystem.WithNoMMIOPort ++
new freechips.rocketchip.subsystem.WithNoSlavePort ++
new freechips.rocketchip.subsystem.WithInclusiveCache ++
new freechips.rocketchip.subsystem.WithNExtTopInterrupts(0) ++
new freechips.rocketchip.subsystem.WithNBigCores(1) ++
new freechips.rocketchip.subsystem.WithCoherentBusTopology ++
new freechips.rocketchip.system.BaseConfig)
Finally, I changed the main build.sbt, the one in the Chipyard root, to additionally hold these lines:
lazy val my-chip = (project in file("generators/my-chip"))
.dependsOn(rocketchip, chisel_testers, midasTargetUtils)
.settings(commonSettings)
and
lazy val chipyard = conditionalDependsOn(project in file("generators/chipyard"))
.dependsOn(boom, hwacha, sifive_blocks, sifive_cache, utilities, iocell,
sha3, // On separate line to allow for cleaner tutorial-setup patches
my-chip, // <=== ADDED THIS
dsptools, `rocket-dsptools`,
gemmini, icenet, tracegen, ariane, nvdla)
.settings(commonSettings)
Now, when running make CONFIG=JustReadTLRocketConfig, I get the following error:
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
Picked up _JAVA_OPTIONS: -Xmx2048m
[info] Loading settings for project chipyard-build from plugins.sbt ...
[info] Loading project definition from /home/apaj/chipyard/project
[error] [/home/apaj/chipyard/build.sbt]:166: Pattern matching in val statements is not supported
Looking around didn't really help, as I lack any Scala/software-build skills, so I couldn't make much of this, for example...
My spider sense tells me I messed up something in the paths within the internal organization of Chipyard, so... please, some help from an experienced user would be much appreciated. Thanks.
EDIT: typos and adding point 8.
The error comes from the - in lazy val my-chip and package my-chip. If you want to use a - in a Scala name, you can wrap the name in backticks, like `my-chip`.
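For illustration, here is a minimal sketch of how the backtick form would look in the snippets from the question (everything else left as it was):

// build.sbt in the chipyard root: escape the hyphenated name with backticks
lazy val `my-chip` = (project in file("generators/my-chip"))
  .dependsOn(rocketchip, chisel_testers, midasTargetUtils)
  .settings(commonSettings)

// ... and in chipyard's dependsOn list:
  `my-chip`, // backticks here as well

// JustRead.scala: a package name can be escaped the same way
package `my-chip`

// DigitalTop.scala and the config: every reference needs the same escaping
  with `my-chip`.CanHavePeripheryJustRead
  new `my-chip`.WithJustRead ++

The simpler alternative is to drop the hyphen entirely, e.g. name the project and package mychip, so no backticks are needed anywhere.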
I have a GDScript file, and I would like to be able to run a block of code whenever the script is loaded. I know _init will run whenever an instance is constructed and _ready will run when it's added to the scene tree. I want to run code before either of those events happens; when the preload or load that brings it into memory first happens.
In Java, this would be a static initializer block, like so
public class Example {
    public static ArrayList<Integer> example = new ArrayList<Integer>();
    static {
        // Pretend this is a complicated algorithm to compute the example array ...
        example.add(1);
        example.add(2);
        example.add(3);
    }
}
I want to do the same thing with a GDScript file in Godot, where some top-level constant or other configuration data is calculated when the script is first loaded, something like (I realize this is not valid GDScript)
extends Node
class_name Example
const EXAMPLE = []
static:
    EXAMPLE.push_back(1)
    EXAMPLE.push_back(2)
    EXAMPLE.push_back(3)
Is there some way to get a feature like this or even to approximate it?
Background Information: The reason I care (aside from a general academic interest in problems like these) is that I'm working on a scripting language that uses GDScript as a compile target, and in my language there are certain initialization and registration procedures that need to happen to keep my language's runtime happy, and this needs to happen whenever a new script from my language is loaded.
In reality the thing I want to do is basically going to be
static:
MyLanguageRuntime.register_class_file(this_file, ...)
where MyLanguageRuntime is a singleton node that exists at the top of the scene tree to keep everything running smoothly.
The Android Gradle plugin generates tons of .rawproto files in the build/android-profile directory. What are they used for? Is there a way to disable this madness or automatically delete them?
I've been bugged by it for a long time, and now that I've noticed there are gigabytes of this hogging my smallish SSD, I've decided to figure out a way to disable it. Before it started occupying too much space, the most annoying thing for me was gradlew clean leaving a build folder behind.
Only tested with com.android.tools.build:gradle:3.0.1, so YMMV.
TL;DR
For global application read the last section; for per-project use, put this in the rootProject's build.gradle:
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
new com.android.build.gradle.internal.profile.RecordingBuildListener(
com.android.builder.profile.ProcessProfileWriter.get());
// and then `gradlew --stop && gradlew clean` to verify no build folder is left behind
Investigation
Thanks to https://stackoverflow.com/a/43910084/253468 linked by @JeffRichards mentioning ProcessProfileWriterFactory.java, I put a breakpoint there and checked who's calling it by running gradlew -Dorg.gradle.debug=true --info (not to be confused with --debug) and attaching a remote debugger.
I followed the trail and found that ProcessProfileWriter.finishAndMaybeWrite creates the folder and writes. Backtracing on method calls I found that ProfilerInitializer.recordingBuildListener controls whether it's called ... and that is initialized directly by BasePlugin (apply plugin: 'com.android.*').
So in order to prevent anything from happening, I opted to try to disable the guard by pre-initializing that static field. Thankfully Groovy (and hence Gradle) doesn't give a * about JVM visibility modifiers, so without reflection, here's the magic line:
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
new com.android.build.gradle.internal.profile.RecordingBuildListener(
com.android.builder.profile.ProcessProfileWriter.get());
I know, it's a bit verbose, but it works, and if you import stuff it looks better:
ProfilerInitializer.recordingBuildListener = new RecordingBuildListener(ProcessProfileWriter.get());
Applying the magic
In a single-project build (one build.gradle) you must apply this before
apply plugin: 'com.android.application'
In multi-project builds (most template projects: app folder, settings.gradle, and many build.gradles) I suggest you apply it around the buildscript block:
buildscript {
    // ...
    dependencies {
        classpath 'com.android.tools.build:gradle:3.0.1'
    }
}
// magic line here
Make sure it's before any apply plugin:s, and not inside a buildscript block.
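Put together, a sketch of a typical multi-project root build.gradle with the actual line in place (the repository blocks are just placeholders here) would look roughly like this:

buildscript {
    repositories { google(); jcenter() }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.0.1'
    }
}

// the magic line: outside buildscript, before any `apply plugin:`
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
        new com.android.build.gradle.internal.profile.RecordingBuildListener(
                com.android.builder.profile.ProcessProfileWriter.get());

allprojects {
    repositories { google(); jcenter() }
}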
Applying the magic globally
Obviously if it bothers us in one project, it will in all of them, so following Gradle's manual, create a file at ~/.gradle/init.gradle or %USERPROFILE%\.gradle\init.gradle (mind you, this folder can be relocated with GRADLE_USER_HOME) with the following contents:
// NB: any changes to this script require a new daemon (`gradlew --stop` or `gradlew --no-daemon <tasks>`)
rootProject { rootProject -> // see https://stackoverflow.com/a/48087543/253468
    // listen for lifecycle events on the project's plugins
    rootProject.plugins.whenPluginAdded { plugin ->
        // check if any Android plugin is being applied (not necessarily just 'com.android.application')
        // this plugin is actually exactly for this purpose: to get notified
        if (plugin.class.name == 'com.android.build.gradle.api.AndroidBasePlugin') {
            logger.info 'Turning off `build/android-profile/profile-*.(rawproto|json)` generation.'
            // execute the hack in the context of the buildscript, not in this initscript
            new GroovyShell(plugin.class.classLoader).evaluate("""
                com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
                    new com.android.build.gradle.internal.profile.RecordingBuildListener(
                        com.android.builder.profile.ProcessProfileWriter.get());
            """)
        }
    }
}
Or, the goal: How can I take a single package from Nix unstable in a declarative manner?
I'm new to NixOS and currently trying to install a newer version of Consul than the default 0.5.2 of my NixOS version (latest stable). I'm attempting this by overriding the derivation in my /etc/nixos/configuration.nix.
I'd like to keep running stable, but I found unstable had the version of Consul that I wanted (0.7.0) already, and so I decided to use this package's attributes as a starting point to override https://github.com/NixOS/nixpkgs/blob/master/pkgs/servers/consul/default.nix
I copied it for the most part into my configuration.nix; here are the relevant sections:
nixpkgs.config.packageOverrides = pkgs: rec {
  consul = pkgs.lib.overrideDerivation pkgs.consul (attrs: rec {
    version = "0.7.0";
    name = "consul-${version}";
    rev = "v${version}";
    goPackagePath = "github.com/hashicorp/consul";
    src = pkgs.fetchFromGitHub {
      owner = "hashicorp";
      repo = "consul";
      inherit rev;
      sha256 = "04h5y5vixjh9np9lsrk02ypbqwcq855h7l1jlnl1vmfq3sfqjds7";
    };
    # Keep consul.ui for backward compatability
    passthru.ui = pkgs.consul-ui;
  });
};

environment.systemPackages = with pkgs; [
  vim
  which
  telnet
  consul-ui
  consul-alerts
  consul-template
  consul
];
I'm running nix-build (Nix) 1.11.2 which throws:
$ nixos-rebuild switch
building Nix...
building the system configuration...
error: cannot coerce a set to a string, at /etc/nixos/configuration.nix:19:7
(use ‘--show-trace’ to show detailed location information)
When I look at line 19 it's where name is set to "consul-${version}".
Why is there type coercion going on here? Any tips will be greatly appreciated!
I'm also wondering if there is a better way to run just a single package from unstable, declaratively from configuration.nix rather than imperatively?
To add to what Rok said:
That should point you to the passthru line as where the error actually happens. If you comment it out, it will probably build. I'm assuming some recursive calls are at play here and the error occurs when it tries to evaluate the consul/consul-ui packages.
If you're just starting out, you can safely ignore what follows and perhaps come back to it if/when you're curious about the nitty-gritty.
The problem here is that overrideDerivation is a kind of low-level approach to overriding things. Behind stdenv.mkDerivation, we have a much smaller primitive function called derivation. The derivation function takes some attributes and (more or less -- see the docs for the finer details) just passes those attributes as environment variables during the build. The stdenv.mkDerivation function, on the other hand, has a whole bunch of smarts layered on top that massages the attributes given to it before passing them onto derivation -- and in some cases, as is the case with passthru, it doesn't pass the attribute to derivation at all.
Back to overrideDerivation: it takes the final, tweaked attributes that stdenv.mkDerivation would pass to derivation, and just before that happens it allows you to override those attributes with the function you give it (e.g. that implies that, at that point, passthru has already been removed). When your function adds a passthru, that makes its way into derivation, which then wants to coerce the value of passthru into a string so it can make passthru an environment variable during the build; however, because passthru now points at an attribute set, and such coercion isn't supported, Nix then complains.
So this sort of puts us in an odd situation. To illustrate, I'll copy the source for the consul package here:
{ stdenv, lib, buildGoPackage, consul-ui, fetchFromGitHub }:
buildGoPackage rec {
  name = "consul-${version}";
  version = "0.6.4";
  rev = "v${version}";
  goPackagePath = "github.com/hashicorp/consul";
  src = fetchFromGitHub {
    owner = "hashicorp";
    repo = "consul";
    inherit rev;
    sha256 = "0p6m2rl0d30w418n4fzc4vymqs3vzfa468czmy4znkjmxdl5vp5a";
  };
  # Keep consul.ui for backward compatability
  passthru.ui = consul-ui;
}
(Note that buildGoPackage is a wrapper around stdenv.mkDerivation.)
You might be familiar with e.g. consul.override, which allows you to supply different inputs (e.g. maybe a different version of consul-ui, or buildGoPackage), but it doesn't allow you to override things that aren't inputs (e.g. src, passthru, etc). Meanwhile, overrideDerivation allows you to modify the attrs given to derivation, but not the ones given to stdenv.mkDerivation. Ideally there would be something in-between, that would allow for manipulating the attrs given to stdenv.mkDerivation, and it so happens that there's a PR open to address this:
https://github.com/NixOS/nixpkgs/pull/18660
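To make the contrast concrete, here is a rough sketch (my-consul-ui and my-src are hypothetical values, not real attributes) of what each mechanism lets you touch:

# .override swaps the inputs of the package function (the arguments on the
# first line of the source above), but cannot change attrs like src or passthru:
consul-custom = pkgs.consul.override {
  consul-ui = my-consul-ui;   # hypothetical replacement input
};

# overrideDerivation rewrites the final attrs handed to `derivation`, which is
# why adding passthru there triggers the string-coercion error seen above:
consul-new-src = pkgs.lib.overrideDerivation pkgs.consul (attrs: {
  src = my-src;               # hypothetical new source
});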
Welcome to Nix/NixOS :)
Whenever you need to know more about an error you can use --show-trace, which will give you a more verbose error. In your case you would see something like
error: while evaluating the attribute ‘passthru’ of the derivation ‘consul-0.7.0’ at /home/rok/tmp/consul.nix:6:3:
cannot coerce a set to a string, at /home/rok/tmp/consul.nix:6:3
That should point you to the passthru line as where the error actually happens. If you comment it out, it will probably build. I'm assuming some recursive calls are at play here and the error occurs when it tries to evaluate the consul/consul-ui packages.
As for overriding only one package from the unstable channel, something like this is needed:
let
  unstable_pkgs = import ./path/to/unstable/nixpkgs {};
  # or
  # unstable_pkgs = import (pkgs.fetchFromGitHub {...}) {};
in {
  ...
  nixpkgs.config.packageOverrides = pkgs: rec {
    consul = unstable_pkgs.consul;
  };
  ...
}
I haven't tried the above, but I'm assuming it will work.
I'd like a (platform-independent) way to list all classes from a package.
A possible way would be to get a list of all classes known by Haxe, then filter through it.
I made a macro to help with just this. It's in the compiletime haxelib.
haxelib install compiletime
And then add it to your build (hxml file):
-lib compiletime
And then use it to import your classes:
// All classes in this package, including sub packages.
var testcases = CompileTime.getAllClasses("my.package");
// All classes in this package, not including sub packages.
var testcases = CompileTime.getAllClasses("my.package",false);
// All classes that extend (or implement) TestCase
var testcases = CompileTime.getAllClasses(TestCase);
// All classes in a package that extend TestCase
var testcases = CompileTime.getAllClasses("my.package",TestCase);
And then you can iterate over them:
for ( testClass in testcases ) {
    var test = Type.createInstance( testClass, [] );
}
Please note, if you never have "import some.package.SomeClass" in your code, that class will never be included, because of dead code elimination. So if you want to make sure it gets included, even if you never explicitly call it in your code, you can do something like this:
CompileTime.importPackage( "mygame.levels" );
CompileTime.getAllClasses( "mygame.levels", GameLevel );
How it works
CompileTime.getAllClasses is a macro, and what it does is:
Wait until compilation is finished, when we know all of the types/classes in our app.
Go through each one and see if it is in the specified package.
Also check whether it extends (or implements) the specified class/interface.
If it does, add the class name to some special metadata: @classLists(...) metadata on the CompileTimeClassList file, containing the names of all the matching classes.
At runtime, use the metadata together with Type.resolveClass(...) to create a list of all matching types (a rough sketch of this runtime step follows below).
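A rough illustration of that last runtime step (this is not the library's actual code; the class names are made up):

// Names the macro would have stored at compile time (made-up examples):
var names = ["mygame.levels.Level1", "mygame.levels.Level2"];
// Resolve each stored name back into a class and instantiate it:
for (name in names) {
    var cls = Type.resolveClass(name);
    var instance = Type.createInstance(cls, []);
}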
This is one way to store and retrieve the information: https://gist.github.com/back2dos/c9410ed3ed476ecc1007
Beyond that you could use haxe -xml to get the type information you want, then transform it as needed (use the parser from haxe.rtti to handle the data) and embed the JSON encoded result with -resource theinfo.json (accessed through haxe.Resource).
As a side note: there's a chance you'll be better off not having any automation and just adding the classes to an array manually. Imagine you have somepackage.ClassA, somepackage.ClassB, ...; then you can do
import somepackage.*;
//...
static var CLASSES:Array<Class<Dynamic>> = [ClassA, ClassB, ...];
It gives you more flexibility: whatever you want to do, you can always add 3rd-party classes, which may not necessarily be in the same package, and you can also choose not to use a class without having to delete it.
I am facing an issue calling a function in a block which I have added. I am calling the function by including a file which is placed in the theme. I have also tried placing it outside of the theme folder. The function is already being used on the front page, but when I use the same function in that block, the screen goes blank and nothing displays. Part of my block code is written below. Please help me.
<?php
global $base_url;
include($_SERVER['DOCUMENT_ROOT']."/travellar/geoiploc.php"); // included file
$ip = "203.189.25.0"; // Australia IP test
$country_code = getCountryFromIP($ip, "code");
I've had no problems loading functions from modules into custom blocks, but I've never tried loading one from a theme before. It's not clear to me whether or not theme functions are loaded before the page content is loaded.
You might have to create a custom module or include file to hold the function. Check out the module_load_include() function for how to load a specific include file.
A custom module would be a good approach as it is loaded before the theme layer and can be accessed from almost anywhere in Drupal except other modules with a lower weight than the custom module. It is also likely to come in handy for hooks and other overrides.
However, if you must have it in the theme layer, another option is adding it to your theme's template.php, which should make it available within page.tpl.php and such, but I don't believe it will be available in blocks.
/sites/all/modules/mymodule/mymodule.info
name = My Module
package = !
description = It is MY module, not yours!
core = 6.x
The package "!" will make this module appear at the top of the modules page
/sites/all/modules/mymodule/mymodule.module
<?php
// Load mymodule.morePHP.inc
module_load_include('inc', 'mymodule', 'mymodule.morePHP');
// A custom function
function mymodule_my_custom_function($args) {
  /* do custom stuff here */
  return 'output';
}
/sites/all/modules/mymodule/mymodule.morePHP.inc
<?php
// An included custom function
function mymodule_other_custom_stuff() {
  /* more custom stuff here */
}
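For completeness, a rough sketch of how the block code could then call the module's functions (no include() needed, since modules are loaded before the theme layer):

<?php
// In the custom block: the module's functions are already available.
$output = mymodule_my_custom_function('some argument');
mymodule_other_custom_stuff();
print $output;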