Ambiguous "Error: NEXUS__UNKNOWN__TYPE was already defined and imported as a type" error in nexus graphql - node.js

I'm getting the following error when using nexus to define a graphql schema with apollo-server.
Error: NEXUS__UNKNOWN__TYPE was already defined and imported as a type
The stack trace doesn't give much information as to where the issue is occurring or what the problem is. The project has 20+ models and dozens of resolvers, so it's quite hard to debug.
Error: NEXUS__UNKNOWN__TYPE was already defined and imported as a type, check the docs for extending types
at extendError (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1744:2)
at SchemaBuilder.addType (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:603:8)
at SchemaBuilder.missingType (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1212:5)
at SchemaBuilder.getOrBuildType (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1540:4)
at SchemaBuilder.getOutputType (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1471:10)
at SchemaBuilder.buildOutputField (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1349:52)
at /Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1307:7
at Array.forEach (<anonymous>)
at SchemaBuilder.buildOutputFields (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1306:7)
at fields (/Users/username/Documents/folder/folder/graphq-nexus-prisma-api/node_modules/nexus/src/builder.ts:1009:33)
Any help appreciated.

I got the same error, but it was because I removed some exports. I'm not exactly sure what caused it, but I basically have a file that exports all of my graphql modules. E.g.
// graphql/modules/index.ts
export * from './file-a';
export * from './file-b';
When I removed the second export line, I started getting the error. I'm probably using some of the types defined in file-b somewhere else, and that's somehow causing the error. Anyway, adding the line back in fixed it (I had removed it by accident anyway).
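For context, here is roughly how such a barrel file usually feeds the schema. This is only a simplified sketch; the file names and the makeSchema call are placeholders for a typical Nexus setup, not code from my project. Anything removed from the barrel also drops out of the types array, and any field still referencing it surfaces as NEXUS__UNKNOWN__TYPE.
// graphql/schema.ts (sketch; paths are placeholders)
import { makeSchema } from 'nexus';
import * as types from './modules'; // the barrel file above

export const schema = makeSchema({
  types, // anything dropped from the barrel disappears from here too
});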
UPDATE
I also got this by referencing arg types that didn't exist (a typo). For example:
args: {
  where: arg({ type: 'ConversationsWhereInput' }),
  orderBy: arg({ type: 'ConversationsOrderByInput', list: true }),
},
The s in Conversations shouldn't be there. It should be:
args: {
  where: arg({ type: 'ConversationWhereInput' }),
  orderBy: arg({ type: 'ConversationOrderByInput', list: true }),
},
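For reference, those args sit inside a field definition along these lines. This is only a sketch: the field name, the Conversation return type and the Prisma call are placeholders for illustration, not my actual code; the two arg lines are the corrected ones from above.
import { arg, extendType } from 'nexus';

export const ConversationsQuery = extendType({
  type: 'Query',
  definition(t) {
    t.list.field('conversations', {
      type: 'Conversation', // must match the name of a defined type exactly
      args: {
        where: arg({ type: 'ConversationWhereInput' }),
        orderBy: arg({ type: 'ConversationOrderByInput', list: true }),
      },
      // placeholder resolver; assumes a Prisma client on the context
      resolve: (_root, _args, ctx) => ctx.prisma.conversation.findMany(),
    });
  },
});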

It would have been much more helpful if you had elaborated on what you did before this error happened.
I ran into exactly the same error, and for me it was because I had accidentally changed the name of an objectType typeDef.
For instance, the FollowUserResult type was originally named FollowResult, and after I changed the name, all the mutation resolvers referencing this objectType broke.
import { objectType } from "nexus";

export const FollowUserResult = objectType({
  name: "FollowUserResult", // It was originally "FollowResult"
  definition(t) {
    t.nonNull.boolean("ok");
    t.string("error");
  },
});
Check for this in your own code. Once the names are correct, delete the generated schema.graphql file and regenerate it.
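If you're unsure which file that is, it's whatever the outputs option of makeSchema points at. A rough sketch, with placeholder paths:
import { makeSchema } from 'nexus';
import * as types from './graphql/modules';

export const schema = makeSchema({
  types,
  outputs: {
    schema: __dirname + '/../schema.graphql',      // the generated SDL file to delete
    typegen: __dirname + '/../nexus-typegen.d.ts', // regenerated on the next run as well
  },
});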

I got this error because I was using the incorrect Node version. My project didn't have a .nvmrc file (yet), so I was using Node 10 on a project that uses Node 14. After switching to the correct Node version, this error went away.
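As a follow-up, assuming nvm is what you use to manage Node versions: adding a .nvmrc file to the project root containing just the line 14 lets anyone run nvm use and get the expected Node version before installing dependencies and starting the server.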

I got this error when I incorrectly specified an unknown type in the type attribute of an extension to the Query type. More specifically:
export const CoursesQuery = extendType({
  type: "Query",
  definition(t) {
    t.field("myQuery", {
      type: InvalidType, // <--- Changing InvalidType to the correct type fixed the error
      async resolve(_parent, _args, ctx) {
        return ...
      },
    });
  },
});
The error disappeared after I changed InvalidType to the correct type and restarted the server.
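For comparison, here is what the fixed version looks like as a sketch; the Course type and the resolver body are made up for illustration, not taken from my project:
import { extendType, objectType } from "nexus";

// Hypothetical type, defined (and exported) so Nexus can resolve it
export const Course = objectType({
  name: "Course",
  definition(t) {
    t.nonNull.string("id");
    t.string("title");
  },
});

export const CoursesQuery = extendType({
  type: "Query",
  definition(t) {
    t.field("myQuery", {
      type: "Course", // must match the name of a defined type exactly
      async resolve(_parent, _args, _ctx) {
        return { id: "1", title: "Intro" }; // placeholder data for the sketch
      },
    });
  },
});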

Related

Jest error with <Trans>: You forgot to export your component from the file it's defined in, or you might have mixed up default and named imports

Error: Uncaught [Error: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: undefined. You likely forgot to export your component from the file it's defined in, or you might have mixed up default and named imports.
This is the error I was getting while running tests in Jest. The React component being tested uses <Trans> from react-i18next. When I commented out that portion of code, the tests worked as expected.
The error shown is very misleading.
In my case it was a missing mock for <Trans>. I did have a mock for react-i18next, but since I had many components to cover with tests, some using <Trans> and some not, I copy/pasted test files and completely forgot to check the mock. It took me a few hours to notice, and only after I replaced <Trans> with plain text in a <Typography> from Material-UI...
jest.mock('react-i18next', () => ({
  withTranslation: () => (Component: any) => {
    Component.defaultProps = { ...Component.defaultProps, t: (children: any) => children };
    return Component;
  },
  Trans: ({ children }: any) => children, // this line was missing (() => jest.fn() might also work)
}));
Hope it will save some time for some of you :)
I faced the same issue; to resolve it, I mocked the Trans component like this:
jest.mock("react-i18next", () => ({
Trans: ({ i18nKey }: { i18nKey: string }) => i18nKey,
}));
Instead of passing the node, we can simply pass the i18nKey.
In my case, I am only checking the key value. Hope it helps!
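To show what the mock does in practice, here is a small test sketch; the Greeting component and the translation key are invented for the example:
import React from 'react';
import { render } from '@testing-library/react';
import { Trans } from 'react-i18next';

jest.mock('react-i18next', () => ({
  Trans: ({ i18nKey }: { i18nKey: string }) => i18nKey,
}));

// Hypothetical component under test
const Greeting = () => <p><Trans i18nKey="greeting.hello" /></p>;

test('renders the translation key instead of the real translation', () => {
  const { container } = render(<Greeting />);
  expect(container.textContent).toBe('greeting.hello');
});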

Error while evaluating a Resource Statement, Unknown resource type: '::coldfusion::site'

So I am creating a module to manage some coldfusion servers.
In my init.pp I am trying to define a default site.
::coldfusion::site { 'default':
  site_number => $site_number,
}
The resource is defined in manifests/site.pp as
define coldfusion::site (
  $site_number = undef,
) {
  include coldfusion
  include coldfusion::params
}
When I run pdk test unit to run the unit tests for my module, I get the error "Unknown resource type: '::coldfusion::site'".
I checked the spec fixtures modules and there is a symlink to the project files, so it should be able to resolve the class.
I'm not sure where the problem resides.
Thanks again to Matt for a basic but perhaps overly terse message. I spent some time looking over the docs again, and after changing the resource definition from a class to a define and then removing the site spec (since it is no longer a class), everything is working.
::coldfusion::site { 'default': site_number => $site_number,}
change above to
coldfusion::site { 'default': site_number => $site_number, }
The leading :: can be used with top-scope resources; either drop it here, or change your define so that its name starts with ::.

Puppet nested resources create_resources, can't convert string into hash

I'm trying to build a DNS server with this module: ref. But I'm getting this error:
Error: Could not retrieve catalog from remote server: Error 500 on SERVER: Server Error: Evaluation Error: Error while evaluating a Function Call, can't convert String into Hash.
I have nested YAML, but I'm not sure whether it's correctly formatted or whether the problem is somewhere else in my code.
This is my dns profile dns.pp:
class profile::bind {
  validate_hash($conf)
  $conf = hiera_hash('bind::zone', undef)
  create_resources('profile::bind::make::zone', $conf)
}
This is how I define my zone with make_zone.pp:
define profile::bind::make::zone (
  $hash_data,
  $zone,
  $ensure,
  $zone_contact,
  $zone_ns,
  $zone_serial,
  $zone_ttl,
  $zone_origin,
) {
  validate_hash($hash_data)
  bind::zone { $zone:
    ensure       => $ensure,
    zone_contact => $zone_contact,
    zone_ns      => [$zone_ns],
    zone_serial  => $zone_serial,
    zone_ttl     => $zone_ttl,
    zone_origin  => $zone_origin,
  }
}
This is my host1.yaml data:
---
version: 5
bind::zone:
  zone: test.ltd
  ensure: present
  zone_contact: 'contact.test.ltd'
  zone_ns:
    -'ns0.test.ltd'
    -'ns1.test.ltd'
  zone_serial: '2018010101'
  zone_ttl: '767200'
  zone_origin: 'test.ltd'
  hash_data:
    "newyork":
      owner: "11.22.33.44"
    "tokyo":
      owner: "22.33.44.55"
    "london":
      owner: "33.44.55.66"
bind::cname:
  ensure: present
  record_type: master
There are a number of mistakes and misunderstandings in the code. I fixed them up so that the code at least compiles and ended up with this.
Changes to profile::bind:
class profile::bind {
  include bind
  $conf = lookup('bind::zone')
  create_resources(profile::bind::make::zone, $conf)
}
Changes to profile::bind::make::zone:
define profile::bind::make::zone (
  Enum['present','absent'] $ensure,
  String $zone_contact,
  Array[String] $zone_ns,
  String $zone_serial,
  String $zone_ttl,
  String $zone_origin,
  Hash[String, Hash[String, String]] $hash_data,
) {
  bind::zone { $name:
    ensure       => $ensure,
    zone_contact => $zone_contact,
    zone_ns      => $zone_ns,
    zone_serial  => $zone_serial,
    zone_ttl     => $zone_ttl,
    zone_origin  => $zone_origin,
  }
}
Changes to host1.yaml:
---
bind::zone:
  'test.ltd':
    ensure: present
    zone_contact: 'contact.test.ltd'
    zone_ns:
      - 'ns0.test.ltd'
      - 'ns1.test.ltd'
    zone_serial: '2018010101'
    zone_ttl: '767200'
    zone_origin: 'test.ltd'
    hash_data:
      "newyork":
        owner: "11.22.33.44"
      "tokyo":
        owner: "22.33.44.55"
      "london":
        owner: "33.44.55.66"
Some explanation:
immediate problem:
Error: Could not retrieve catalog from remote server: Error 500 on SERVER: Server Error: Evaluation Error: Error while evaluating a Function Call, can't convert String into Hash.
This error occurred because your Hiera data was not structured as a Hash[String, Hash[String, String]]. Notice that in the corrected YAML I removed your "zone" key and created a nested hash keyed by the zone name instead.
must include the bind class
The camptocamp BIND module requires the bind class to also be declared. See its documentation.
validate_hash function is legacy and in the wrong place
As John Bollinger mentioned in the comments, you had the validate_hash call on the wrong line (before $conf is even assigned). I think that was a cut/paste issue, because you would have got a different error message if that was really your code. Anyway, since you're using Puppet 5 (I'm guessing that from the version: 5 in your Hiera data), don't use the legacy validate functions; use Puppet's data type validation. So I just deleted that line.
use lookup() instead of hiera_hash()
Again, since you're using Puppet 5, use the lookup() function instead of the deprecated hiera_hash() function.
version 5 belongs in hiera.yaml, not host1.yaml
It won't cause you any problems, but the line version: 5 won't do anything here, and it belongs in your hiera.yaml file. I used a hiera.yaml file as follows for testing:
---
version: 5
defaults:
  datadir: data
  data_hash: yaml_data
hierarchy:
  - name: "Host 1"
    paths:
      - host1.yaml
zone_ns type confusion
You had two problems with zone_ns: firstly, a typo in your YAML (no space after the -); and secondly, you passed in an array of zone NS entries and then wrapped that array in another array ([$zone_ns]) in your defined type.
zone parameter should be the name var
Notice I had to delete the $zone parameter in your defined type, and I used the special $name variable instead, to get the name from the title.
refactored to use data type validation
Notice how I used Puppet's data type validation on your inputs in the defined type, and then I had no further need for the legacy validate_hash function and other related validate functions. Read more about that here.
I think that's all. Hope that helps!

How to make 'testPattern' mandatory while updating snapshots in Jest?

Snapshot testing comes in handy for testing UI components. If your UI component changes, you are expected to update the snapshot as well to reflect the change. We can specify 'testNamePattern' to update snapshots for a specific test.
jest --updateSnapshot --testNamePattern abc.test.js
Is it possible to make 'testNamePattern' mandatory while updating snapshots? This would help avoid updating other failing snapshots by mistake. I understand that this is expected to be caught during code review. However, I want to ensure that snapshots are only ever updated for a specific pattern.
As of now, there isn't any CLI option for doing this per the docs. I have added a small snippet to my testFrameworkScriptFile to ensure that testNamePattern is passed while updating snapshots.
import yargs from 'yargs';

const mandateTestNamePattern = () => {
  const args = yargs.option('testNamePattern', {
    type: 'string'
  }).option('t', {
    type: 'string'
  }).argv;
  if (args.updateSnapshot || args.u) {
    if (args.testNamePattern || args.t) {
      // valid case
    } else {
      throw new Error('TestNamePattern is mandatory while updating snapshots');
    }
  }
};

mandateTestNamePattern();
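With this loaded through the test framework setup file, running jest -u on its own throws immediately, while something like jest -u -t "renders correctly" (the test name here is just an example) goes through and updates only the matching snapshots.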

Property 'ensure' does not exist on type 'NodeRequire'

I'm trying webpack 2 code splitting.
According to this doc: https://webpack.js.org/guides/code-splitting-require/
the following code should include some.css in a new chunk named 'something':
require.ensure([], function(require) {
  require('some.css');
}, 'something');
but when I run it, I get this error:
ERROR in ./src/index.ts
(4,9): error TS2339: Property 'ensure' does not exist on type 'NodeRequire'.
Any idea about how to fix it?
Thanks
The way I solved this was by creating my own interface - WebpackRequire - which extends NodeRequire with ensure1.
interface WebpackRequire extends NodeRequire {
  ensure(
    dependencies: string[],
    callback: (require: WebpackRequire) => void,
    errorCallback?: (error: Error) => void,
    chunkName?: string
  ): void;
}
If you've only got a single instance of require.ensure, you can then type cast it to a WebpackRequire using (require as WebpackRequire).ensure, but since I used it multiple times in a module, I created a local require at the top scope of the module, type cast as WebpackRequire, like this:
const require: WebpackRequire = (window as any).require;
1 I got the types of ensure from the Webpack docs.
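For completeness, the question's require.ensure call rewritten against that interface would look roughly like this (same chunk name; only the cast is new):
(require as WebpackRequire).ensure(
  [],
  function (require) {
    require('some.css');
  },
  undefined,   // no error callback
  'something'  // chunk name
);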
I required a JavaScript file which then did the require. Not exactly the nicest solution, but it did work.
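That approach might look roughly like this (the file name is made up): keep the require.ensure call in a plain JavaScript file that the TypeScript compiler never type-checks, and require that helper from the TypeScript entry point.
// loadStyles.js - plain JavaScript, so tsc never sees require.ensure
module.exports = function loadStyles() {
  require.ensure([], function (require) {
    require('some.css');
  }, 'something');
};
Then, from index.ts, something like const loadStyles = require('./loadStyles'); loadStyles(); pulls it in without tripping over the NodeRequire typings.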
