Ecto join query causing (Poison.EncodeError) unable to encode value

I have 3 tables:
categories
subcategories
categorySubcategories
One category has zero to many subcategories.
I have this function, which gets all entries from the categories table and only the entries of the subcategories table that belong to the category with an id of 1:
def getCategories(conn) do
  categories = all Api.Category
  groceryItemSubcategories = from s in Api.Subcategory,
    join: cs in Api.CategorySubcategory, on: cs.c_id == 1,
    select: %{name: s.name, foo: cs.c_id}

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, Poison.encode!(%{categories: categories, groceryItemSubcategories: groceryItemSubcategories}))
end
giving this error:
23:10:27.169 [error] #PID<0.339.0> running Api.Router terminated
Server: localhost:4000 (http)
Request: GET /categories
** (exit) an exception was raised:
** (Poison.EncodeError) unable to encode value: {"subcategories", Api.Subcategory}
(poison) lib/poison/encoder.ex:383: Poison.Encoder.Any.encode/2
(poison) lib/poison/encoder.ex:227: anonymous fn/4 in Poison.Encoder.Map.encode/3
(poison) lib/poison/encoder.ex:228: Poison.Encoder.Map."-encode/3-lists^foldl/2-0-"/3
(poison) lib/poison/encoder.ex:228: Poison.Encoder.Map.encode/3
(poison) lib/poison/encoder.ex:227: anonymous fn/4 in Poison.Encoder.Map.encode/3
(poison) lib/poison/encoder.ex:228: Poison.Encoder.Map."-encode/3-lists^foldl/2-0-"/3
(poison) lib/poison/encoder.ex:228: Poison.Encoder.Map.encode/3
(poison) lib/poison.ex:41: Poison.encode!/2
The final answer is this:
def getCategories(conn) do
  categories = all Api.Category
  groceryItemSubcategories = Api.Repo.all(from s in Api.Subcategory,
    join: cs in Api.CategorySubcategory, on: cs.s_id == s.id,
    join: c in Api.Category, on: c.id == cs.c_id,
    where: c.id == 1,
    select: %{name: s.name, foo: cs.c_id}
  )

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, Poison.encode!(%{categories: categories, groceryItemSubcategories: groceryItemSubcategories}))
end
The part that makes the error go away is wrapping the statement in Api.Repo.all(). Dogbert was really the one who answered it, so I'm not claiming credit for this answer.

There are two main issues with the original code:
You forgot to call Repo.all on the second query, so you were encoding the Ecto.Query struct itself rather than its results.
You're selecting a tuple in the query and then encoding it to JSON. Poison does not handle encoding tuples to JSON. You can select a list or a map instead, depending on what data structure you want. Here's how to select a map:
groceryItemSubcategories = Api.Repo.all(from s in Api.Subcategory,
  join: cs in Api.CategorySubcategory, on: cs.s_id == s.id,
  join: c in Api.Category, on: c.id == cs.c_id,
  where: c.id == 1,
  select: %{name: s.name, c_id: cs.c_id})
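Since a list also works, the same query can presumably use a list select instead (an untested variation of the query above):

groceryItemSubcategories = Api.Repo.all(from s in Api.Subcategory,
  join: cs in Api.CategorySubcategory, on: cs.s_id == s.id,
  join: c in Api.Category, on: c.id == cs.c_id,
  where: c.id == 1,
  # each row becomes a two-element list instead of a map
  select: [s.name, cs.c_id])

Poison then encodes each row as a two-element JSON array rather than an object, so the map form is usually easier to consume on the client.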

Related

adding limit to my sequelize query is throwing an error

This works properly:
let data = await db.company.findAll({
  include: { model: db.user, as: 'users' },
  where: { '$users.id$': 7 }
});
but as soon as I add limit below, it throws an error:
let data = await db.company.findAll({
  limit: 10,
  include: { model: db.user, as: 'users' },
  where: { '$users.id$': 7 }
});
Here's the error when I execute the one with limit:
{
name: "SequelizeDatabaseError",
parent: {
message: "The multi-part identifier "users.id" could not be bound.",
code: "EREQUEST",
number: 4104,
state: 1,
class: 16,
serverName: "LAPTOP-5ED24571",
procName: "",
lineNumber: 1,
sql: "SELECT [company].*, [users].[id] AS [users.id], [users].[username] AS [users.username], [users].[name] AS [users.name], [users].[address] AS [users.address], [users].[active] AS [users.active], [users].[email] AS [users.email], [users].[phone] AS [users.phone], [users].[job_role] AS [users.job_role], [users].[time_zone] AS [users.time_zone], [users].[country] AS [users.country], [users].[currency] AS [users.currency], [users].[induction] AS [users.induction], [users].[createdAt] AS [users.createdAt], [users].[updatedAt] AS [users.updatedAt], [users].[userRoleId] AS [users.userRoleId], [users->user_company_mm].[id] AS [users.user_company_mm.id], [users->user_company_mm].[userId] AS [users.user_company_mm.userId], [users->user_company_mm].[companyId] AS [users.user_company_mm.companyId], [users->user_company_mm].[createdAt] AS [users.user_company_mm.createdAt], [users->user_company_mm].[updatedAt] AS [users.user_company_mm.updatedAt] FROM (SELECT [company].[id], [company].[name], [company].[address], [company].[city], [company].[country], [company].[state_region], [company].[logo], [company].[numberOfLocations], [company].[numberOfEmployees], [company].[audience_builder], [company].[creator_user_id], [company].[tags], [company].[createdAt], [company].[updatedAt] FROM [company] AS [company] WHERE [users].[id] = 7 ORDER BY [company].[id] OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY) AS [company] LEFT OUTER JOIN ( [user_company_mm] AS [users->user_company_mm] INNER JOIN [user] AS [users] ON [users].[id] = [users->user_company_mm].[userId]) ON [company].[id] = [users->user_company_mm].[companyId];"
}
Any idea why this is happening? Thank you.
I'm not particularly familiar with Sequelize.js, but looking at your query statement, you have a WHERE condition in your subquery limiting it to [users].[id] = 7, yet the [users] alias doesn't exist in that part of the query's context:
...
FROM
(SELECT [company].[id],
[company].[name],
[company].[address],
[company].[city],
[company].[country],
[company].[state_region],
[company].[logo],
[company].[numberOfLocations],
[company].[numberOfEmployees],
[company].[audience_builder],
[company].[creator_user_id],
[company].[tags],
[company].[createdAt],
[company].[updatedAt] FROM [company] AS [company]
/* HERE --> */ WHERE [users].[id] = 7 ORDER BY [company].[id] OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY
)
AS [company] LEFT OUTER JOIN
( [user_company_mm] AS [users->user_company_mm] INNER JOIN [user]
AS [users] ON [users].[id] = [users->user_company_mm].[userId]) ON [company].[id] = [users->user_company_mm].[companyId]
But why the query is structured that way, I can't tell. That would come down to how you've configured it / how Sequelize works.

Groovy Prepared Statement with Named Parameters

I used the approach below to do named parameters with a JDBC PreparedStatement. Any suggestions to improve this?
import java.sql.*;

def sqlQuery = "select * from table where col1=:col1 and col2=:col2"
def namedParameters = [
    ['ColumnName': 'col1', 'Value': 'test', 'DataType': 'int'],
    ['ColumnName': 'col2', 'Value': 'testsdfdf', 'DataType': 'string'],
];

PreparedStatement stmt;

namedParameters.eachWithIndex { k, v ->
    println "Index: " + v
    println "Name: " + k.ColumnName
    println "Value: " + k.Value
    // To replace named parameters with ?
    sqlQuery = sqlQuery.replace(":" + k.ColumnName, "?")
    println sqlQuery
    println "DataType: " + k.DataType
    switch (k.DataType.toLowerCase()) {
        case ('int'):
            stmt.setInt(v + 1, k.Value)
            break;
        case ('string'):
            stmt.setString(v + 1, k.Value)
            break;
        default:
            stmt.setObject(v + 1, k.Value)
    }
};
println "End"
I am doing a string replace to replace each named parameter with ?, and, based on the provided map, I identify the data type and set the value on the PreparedStatement accordingly.
You can use the groovy.sql.Sql class and its execute(Map params, String query, Closure processResults) method. Consider the following example script:
sql.groovy:
@Grab(group='com.h2database', module='h2', version='1.4.197')

import groovy.sql.Sql

// (1) Configure JDBC connection (use H2 in-memory DB in this example)
def config = [
    url: 'jdbc:h2:mem:test',
    user: 'sa',
    password: '',
    driver: 'org.h2.Driver'
]

// (2) Connect to the database
def sql = Sql.newInstance(config.url, config.user, config.password, config.driver)

// (3) Create table for testing
sql.execute '''
    create table test (
        id integer not null,
        name varchar(50)
    )
'''

// (4) Insert some test data
def query = 'insert into test (id, name) values (?,?)'
sql.withBatch(3, query) { stmt ->
    stmt.addBatch(1, 'test 1')
    stmt.addBatch(2, 'test 2')
    stmt.addBatch(3, 'test 3')
}

// (5) Execute SELECT query
sql.execute([id: 1, name: 'test 1'], 'select * from test where id >= :id and name != :name', { _, result ->
    result.each { row ->
        println "id: ${row.ID}, name: ${row.NAME}"
    }
})
The last part shows how you can use a prepared statement with named parameters. In this example we want to list all rows where id is greater than or equal to 1 and where name does not equal test 1.
The first parameter of the sql.execute() method is a map that holds your named parameters (each key maps to :key in the SQL query). The second parameter is your SQL query, where the :key format is used for named parameters. And the third parameter is a closure that defines the result-processing logic - result holds a list of maps (e.g. [[ID: 2, NAME: 'test 2'], [ID: 3, NAME: 'test 3']] in this case) and you define how to process this result.
Console output
id: 2, name: test 2
id: 3, name: test 3
sql.eachRow() alternative
Alternatively you can use sql.eachRow(String sql, Map params, Closure closure) instead:
sql.eachRow('select * from test where id > :id', [id: 1], { row ->
    println "id: ${row.ID}, name: ${row.NAME}"
})
It will produce the same output.
With Groovy SQL you can even use a GString as the SQL query.
Example
// Define bind variables
def keyX = 1
def keyY = 'row1'
Query
groovyCon.eachRow("select x,y from mytab where x = ${keyX} and y = ${keyY}") {println it}
This sends the following query to the DB:
select x,y from mytab where x = :1 and y = :2
Groovy SQL is very handy and useful, except for some special cases (e.g. when you need to reuse a PreparedStatement in a loop) where you must fall back to plain JDBC.

Ecto - Update a Record - undefined function __changeset__/0

I'm getting this error when trying to update a record with a changeset:
14:36:29.972 [error] #PID<0.341.0> running Api.Router terminated
Server: 192.168.20.3:4000 (http)
Request: PUT /products/?p_id=11&s_id=11
** (exit) an exception was raised:
** (UndefinedFunctionError) function Ecto.Query.__changeset__/0 is undefined or private
(ecto) Ecto.Query.__changeset__()
(ecto) lib/ecto/changeset.ex:422: Ecto.Changeset.do_cast/4
(api) lib/api/product_shop.ex:17: Api.ProductShop.changeset/2
(api) lib/api/router.ex:168: anonymous fn/1 in Api.Router.do_match/4
(api) lib/api/router.ex:1: Api.Router.plug_builder_call/2
(api) lib/plug/debugger.ex:123: Api.Router.call/2
(plug) lib/plug/adapters/cowboy/handler.ex:15: Plug.Adapters.Cowboy.Handler.upgrade/4
(cowboy) /Users/Ben/Development/Projects/vepo/api/deps/cowboy/src/cowboy_protocol.erl:442: :cowboy_protocol.execute/4
code:
pid = conn.query_params["p_id"]
sid = conn.query_params["s_id"]
price = conn.query_params["price"]

query = ProductShop |> Ecto.Query.where(p_id: ^pid)
product_shop = query |> Ecto.Query.where(s_id: ^sid)
changeset2 = Api.ProductShop.changeset(product_shop, %{price: price})

case Api.Repo.update(changeset2) do
  {:ok, product_shop} ->
    errors = Tuple.append(errors, "Price updated")
  {:error, changeset2} ->
    errors = Tuple.append(errors, "Price not updated")
end
This is the ProductShop which I want to update:
14:38:56.658 [debug] QUERY OK source="product_shops" db=1.7ms
SELECT p0."id", p0."s_id", p0."p_id", p0."not_in_shop_count", p0."price" FROM "product_shops" AS p0 []
[%Api.ProductShop{__meta__: #Ecto.Schema.Metadata<:loaded, "product_shops">,
id: 11, not_in_shop_count: 0, p_id: 11, price: 5.99, s_id: 11}]
Why am I getting the error?
My ProductShop file with the changeset:
defmodule Api.ProductShop do
  use Ecto.Schema
  import Ecto.Changeset
  import Api.Repo
  import Ecto.Query

  @derive {Poison.Encoder, only: [:s_id, :p_id]}
  schema "product_shops" do
    field :s_id, :integer
    field :p_id, :integer
    field :not_in_shop_count, :integer
    field :price, :float
  end

  def changeset(product_shop, params \\ %{}) do
    product_shop
    |> cast(params, [:s_id, :p_id])
    |> validate_required([:s_id, :p_id])
    |> unique_constraint(:s_id, name: :unique_product_shop)
  end

  def insert_product_shop(conn, product_id, shop_id, price) do
    changeset = Api.ProductShop.changeset(%Api.ProductShop{p_id: product_id, s_id: shop_id, not_in_shop_count: 0, price: price})
    errors = changeset.errors
    valid = changeset.valid?
    case insert(changeset) do
      {:ok, product_shop} ->
        {:ok, product_shop}
      {:error, changeset} ->
        {:error, :failure}
    end
  end

  def delete_all_from_product_shops do
    from(Api.ProductShop) |> delete_all
  end

  def get_product_shops do
    Api.ProductShop |> all
  end
end
In router.ex:
put "/products" do
  errors = {}
  IO.inspect(conn.body_params)

  product = Api.Product |> Api.Repo.get(conn.query_params["p_id"])
  shop = Api.Shop |> Api.Repo.get(conn.query_params["s_id"])
  params = for key <- ~w(image description), value = conn.body_params[key], into: %{}, do: {key, value}
  changeset = Api.Product.changeset(product, params)

  case Api.Repo.update(changeset) do
    {:ok, product} ->
      errors = Tuple.append(errors, "Product updated")
    {:error, changeset} ->
      errors = Tuple.append(errors, "Product not updated")
  end

  pid = conn.query_params["p_id"]
  sid = conn.query_params["s_id"]
  price = conn.query_params["price"]

  query = ProductShop |> Ecto.Query.where(p_id: ^pid)
  product_shop = query |> Ecto.Query.where(s_id: ^sid)
  changeset2 = Api.ProductShop.changeset(product_shop, %{price: price})

  case Api.Repo.update(changeset2) do
    {:ok, product_shop} ->
      errors = Tuple.append(errors, "Price updated")
    {:error, changeset2} ->
      errors = Tuple.append(errors, "Price not updated")
  end

  IO.inspect(errors)

  conn
  |> put_resp_content_type("application/json")
  |> send_resp(200, Poison.encode!(%{
    successs: "success",
    errors: Tuple.to_list(errors)
  }))
end
The changeset function you write in a schema module works, by default, on Ecto.Schema structs, i.e. on modules with a schema defined in them (and, after cast, it deals with an Ecto.Changeset struct).
Your code instead passes an Ecto.Query to the changeset function, namely here:
product_shop = query |> Ecto.Query.where(s_id: ^sid)
You should pipe into Repo.one() at the end to get a valid ProductShop struct, which you can then pass to the ProductShop.changeset function.
Also consider rewriting how you retrieve this product_shop; use Repo.get_by:
Repo.get_by(ProductShop, s_id: s_id, p_id: p_id)
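Putting that together, a minimal sketch of the corrected update (untested, assuming pid, sid, and price are read from conn.query_params as in the router code above):

# Load the struct first, then build the changeset and update it
product_shop = Api.Repo.get_by(Api.ProductShop, p_id: pid, s_id: sid)
changeset2 = Api.ProductShop.changeset(product_shop, %{price: price})

case Api.Repo.update(changeset2) do
  {:ok, _product_shop} -> IO.puts("Price updated")
  {:error, _changeset} -> IO.puts("Price not updated")
end

Note that changeset/2 as shown only casts :s_id and :p_id, so :price would also need to be added to the cast list for the price change to actually be applied.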

JSON encode Geo.Point from the geo library in human-readable form

I have this schema, which has a Geo.Point from the geo library:
defmodule Api.Shop do
  use Ecto.Schema
  import Ecto.Changeset
  import Api.Repo
  import Ecto.Query

  @derive {Poison.Encoder, only: [:name, :place_id, :geo_json, :distance]}
  schema "shops" do
    field :name, :string
    field :place_id, :string
    field :point, Geo.Point
    field :geo_json, :string, virtual: true
    field :distance, :float, virtual: true
    timestamps()
  end

  def encode_model(shop) do
    %Api.Shop{shop | geo_json: Geo.JSON.encode(shop.point)}
  end

  defimpl Poison.Encoder, for: Api.Shop do
    def encode(shop, options) do
      shop = Api.Shop.encode_model(shop)
      Poison.Encoder.Map.encode(Map.take(shop, [:id, :name, :geo_json]), options)
    end
  end

  def changeset(shop, params \\ %{}) do
    shop
    |> cast(params, [:name, :place_id, :point])
    |> validate_required([:name, :place_id, :point])
    |> unique_constraint(:place_id)
  end

  # ...
end
And when I return the shop.point field in a query:
def create_query_no_keyword(categories, shop_ids) do
  products_shops_categories = from p in Product,
    distinct: p.id,
    join: ps in ProductShop, on: p.id == ps.p_id,
    join: s in Shop, on: s.id == ps.s_id,
    join: pc in ProductCategory, on: p.id == pc.p_id,
    join: c in Subcategory, on: c.id == pc.c_id,
    where: c.id in ^categories,
    where: s.id in ^shop_ids,
    group_by: [p.id, p.name, p.brand],
    select: %{product: p, categories: fragment("json_agg( DISTINCT (?, ?)) AS category", c.id, c.name), shops: fragment("json_agg( DISTINCT (?, ?, ?)) AS shop", s.id, s.name, s.point)}
end
What gets returned is actually 0101000020E6100000A3BDB0EB0DD9654030AC2C1BE76D42C0, which is the wrong format (WKB). I'm looking to encode it as WKT, which has readable coordinates.
How do I get s.point into WKT format, and thus readable coordinates, when it is returned by the query?
I found this Stack Exchange GIS answer to be the solution. Use this for a point object:
SELECT ST_AsText(the_geom)
FROM myTable;
and this for viewing X, Y and the geom object:
SELECT ST_X(the_geom), ST_Y(the_geom), ST_AsText(the_geom)
FROM myTable;
The Geo library uses PostGIS and the solution was PostGIS-specific: you need to select the column using PostGIS's ST_AsText, or ST_X and ST_Y.
My select statement changed to this:
select: %{product: p, categories: fragment("json_agg( DISTINCT (?, ?)) AS category", c.id, c.name), shops: fragment("json_agg( DISTINCT (?, ?, ST_X(?), ST_Y(?))) AS shop", s.id, s.name, s.point, s.point)}
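If a single WKT string per shop were preferred over separate X/Y values, the same fragment could presumably use ST_AsText instead (untested sketch):

select: %{product: p,
          categories: fragment("json_agg( DISTINCT (?, ?)) AS category", c.id, c.name),
          # ST_AsText renders the point as WKT, e.g. "POINT(174.78 -36.86)"
          shops: fragment("json_agg( DISTINCT (?, ?, ST_AsText(?))) AS shop", s.id, s.name, s.point)}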

Ecto - select all fields from all tables in a 5 table join

I have a products table, a categories table, and a shops table, as well as the joining tables product_shops and product_categories.
I want to select all products that have a category that is in conn.query_params and that are in shops within a certain distance, which is also specified in conn.query_params. And I want to return pretty much every field of the product, category, and shop that was selected.
I have this:
get "/products" do
  query = conn.query_params
  point = %Geo.Point{coordinates: {String.to_float(query["longitude"]), String.to_float(query["latitude"])}, srid: 4326}
  shops = within(Shop, point, String.to_float(query["distanceFromPlaceValue"]) * 1000) |> order_by_nearest(point) |> select_with_distance(point) |> Api.Repo.all
  categories = Enum.map(query["categories"], fn(x) -> String.to_integer(x) end)
  shop_ids = Enum.map(shops, fn(x) -> x.id end)

  query1 = from p in Product,
    join: ps in ProductShop, on: p.id == ps.p_id,
    join: s in Shop, on: s.id == ps.s_id,
    join: pc in ProductCategory, on: p.id == pc.p_id,
    join: c in Category, on: c.id == pc.c_id,
    where: Enum.member?(categories, c.id),
    where: Enum.member?(shop_ids, s.id),
    select: p, c, s,
    group_by s,
    order_by s.distance
In the code above, the shops variable holds all the shops that are within the specified distance.
This is inside shop.ex, to get all shops within the distance, ordered from the closest shop:
def within(query, point, radius_in_m) do
  {lng, lat} = point.coordinates
  from(shop in query, where: fragment("ST_DWithin(?::geography, ST_SetSRID(ST_MakePoint(?, ?), ?), ?)", shop.point, ^lng, ^lat, ^point.srid, ^radius_in_m))
end

def order_by_nearest(query, point) do
  {lng, lat} = point.coordinates
  from(shop in query, order_by: fragment("? <-> ST_SetSRID(ST_MakePoint(?,?), ?)", shop.point, ^lng, ^lat, ^point.srid))
end

def select_with_distance(query, point) do
  {lng, lat} = point.coordinates
  from(shop in query, select: %{shop | distance: fragment("ST_Distance_Sphere(?, ST_SetSRID(ST_MakePoint(?,?), ?))", shop.point, ^lng, ^lat, ^point.srid)})
end
This is the shop schema; the distance field gets populated when select_with_distance is called.
@derive {Poison.Encoder, only: [:name, :place_id, :point]}
schema "shops" do
  field :name, :string
  field :place_id, :string
  field :point, Geo.Point
  field :distance, :float, virtual: true
  timestamps()
end
My current error is in the select: p, c, s line as I'm unsure how to select the whole lot:
== Compilation error on file lib/api/router.ex ==
** (SyntaxError) lib/api/router.ex:153: syntax error before: c
(elixir) lib/kernel/parallel_compiler.ex:117: anonymous fn/4 in Kernel.ParallelCompiler.spawn_compilers/1
I'm also unsure whether the group_by should be the shop. I just know I want to return all fields, only unique ones, ordered by the proximity of the shop. I feel I'm most of the way there, but I'm a bit stuck on the select, group_by, and order_by.
EDIT: Thanks Dogbert for fixing those two issues in the comments.
The code is currently:
products_shops_categories = from p in Product,
  join: ps in ProductShop, on: p.id == ps.p_id,
  join: s in Shop, on: s.id == ps.s_id,
  join: pc in ProductCategory, on: p.id == pc.p_id,
  join: c in Category, on: c.id == pc.c_id,
  where: c.id in ^categories,
  where: s.id in ^shop_ids,
  select: {p, c, s}
  group_by s,
  order_by s.distance
I have this error now:
** (Ecto.Query.CompileError) `order_by(s.distance())` is not a valid query expression.
* If you intended to call a database function, please check the documentation
for Ecto.Query to see the supported database expressions
* If you intended to call an Elixir function or introduce a value,
you need to explicitly interpolate it with ^
expanding macro: Ecto.Query.group_by/2
lib/api/router.ex:155: Api.Router.do_match/4
(elixir) lib/kernel/parallel_compiler.ex:117: anonymous fn/4 in Kernel.ParallelCompiler.spawn_compilers/1
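No answer is recorded here, but the error points at the last two lines: group_by and order_by are written as bare calls instead of keyword options of from. A hedged sketch of the keyword syntax (untested; the group_by columns are an assumption):

products_shops_categories = from p in Product,
  join: ps in ProductShop, on: p.id == ps.p_id,
  join: s in Shop, on: s.id == ps.s_id,
  join: pc in ProductCategory, on: p.id == pc.p_id,
  join: c in Category, on: c.id == pc.c_id,
  where: c.id in ^categories,
  where: s.id in ^shop_ids,
  # grouping by the primary keys is an assumed choice, not from the question
  group_by: [p.id, c.id, s.id],
  select: {p, c, s}

Also note that distance is a virtual field populated only by select_with_distance, so ordering this query by s.distance would need either the same ST_SetSRID fragment used in order_by_nearest or sorting of the loaded results in Elixir.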
