.NET Core on Linux: ODBC connection to Hive

I'm trying to connect to Hive from a .NET Core (v. 1.0.1) app on Linux (Red Hat 7.3, 64-bit).
Program.cs code:
using System;
using System.Data.Odbc;
using System.Collections;
using System.Collections.Generic;
namespace app
{
class Program
{
static void Main(string[] args)
{
var connectionString = "DSN=Hadoop Hive";
var createTableCommandText = "CREATE TABLE Searches(searchTerm STRING, userid BIGINT, userIp STRING) " +
"COMMENT 'Stores all searches for data' " +
"PARTITIONED BY(searchTime DATE) " +
"STORED AS SEQUENCEFILE;";
using (var connection = new OdbcConnection(connectionString))
{
using (var command = new OdbcCommand(createTableCommandText, connection))
{
try
{
connection.Open();
// Create a table.
command.ExecuteNonQuery();
// Insert row of data.
command.CommandText = "INSERT INTO TABLE Searches PARTITION (searchTime = '2015-02-08') " +
"VALUES ('search term', 1, '127.0.0.1')";
command.ExecuteNonQuery();
// Reading data from Hadoop.
command.CommandText = "SELECT * FROM Searches";
using (var reader = command.ExecuteReader())
{
while (reader.Read())
{
for (var i = 0; i < reader.FieldCount; i++)
{
Console.WriteLine(reader[i]);
}
}
}
}
catch (OdbcException ex)
{
Console.WriteLine(ex.Message);
throw;
}
finally
{
// Drop the table. Guard on the connection state so a failed Open() does not
// mask the original exception with a new one thrown here.
if (connection.State == System.Data.ConnectionState.Open)
{
command.CommandText = "DROP TABLE Searches";
command.ExecuteNonQuery();
}
}
}
}
}
}
}
My project settings file:
app.csproj
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>netcoreapp1.1</TargetFramework>
<PackageId>HadoopLibrary</PackageId>
<NetStandardImplicitPackageVersion>1.6.0</NetStandardImplicitPackageVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Data.Common" Version="4.3.0" />
<PackageReference Include="System.Threading.Thread" Version="4.3.0" />
<PackageReference Include="System.Collections.NonGeneric" Version="4.0.1" />
<PackageReference Include="MSA.NetCore.ODBC" Version="1.0.3" />
</ItemGroup>
</Project>
When running "dotnet run" I receive the following error:
Unhandled Exception: System.DllNotFoundException: Unable to load DLL 'odbc32.dll': The specified module could not be found.
(Exception from HRESULT: 0x8007007E)
at System.Data.Odbc.libodbc.SQLAllocHandle(OdbcHandleType HandleType, IntPtr InputHandle, IntPtr& OutputHandlePtr)
at System.Data.Odbc.OdbcConnection.Open()
at hwapp.Program.Main(String[] args)
Can anyone help me fix this? The driver (Hortonworks ODBC for Hive) works on this server, so the issue seems to be with the library used for the ODBC connection.

It looks like you are building and loading this as a 32-bit application, which is something the Hortonworks ODBC driver for Hive may not support. Check the Hortonworks documentation to confirm.
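Before chasing the bitness question, it may be worth confirming that the native ODBC driver manager can be loaded at all. Below is a minimal diagnostic sketch (an assumption on my part: it presumes unixODBC is installed and exposes libodbc.so.2, the usual soname on Red Hat):
using System;
using System.Runtime.InteropServices;
class OdbcProbe
{
// dlopen lives in libdl on Linux; RTLD_NOW = 2 resolves all symbols immediately.
[DllImport("libdl.so.2", EntryPoint = "dlopen")]
private static extern IntPtr DlOpen(string fileName, int flags);
static void Main()
{
// Try to load the unixODBC driver manager; the exact soname can differ per distro.
IntPtr handle = DlOpen("libodbc.so.2", 2);
Console.WriteLine(handle == IntPtr.Zero
? "libodbc.so.2 could not be loaded - check the unixODBC installation"
: "libodbc.so.2 loaded successfully");
}
}
If this probe succeeds but the app still asks for 'odbc32.dll', the managed ODBC package being referenced is resolving the Windows driver-manager name rather than the Linux one.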

Related

Unable to share an image in WhatsApp, with error "File format not supported", on Android 11

The code works fine on a phone with Android Q, but when testing on a phone with Android 11 or above it does not work.
Android Studio: 3.6.3
Gradle version: 3.6.4
In the manifest, I added the following:
<queries>
<package android:name="com.whatsapp" />
<package android:name="com.whatsapp.w4b" />
<intent>
<action android:name="android.intent.action.SEND"/>
<data android:mimeType="image/jpeg" />
</intent>
</queries>
<uses-permission android:name="android.permission.QUERY_ALL_PACKAGES" tools:ignore="QueryAllPackagesPermission"/>
In code (bitmapCard already holds the image):
GCbtnSocialMedia.setOnClickListener {
val intent = Intent()
intent.action = Intent.ACTION_SEND
intent.type = "image/*"
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
val cache = this.getExternalCacheDir()
val sharefile = File(cache, "toshare.png")
val out = FileOutputStream(sharefile)
bitmapCard!!.compress(Bitmap.CompressFormat.PNG, 100, out)
out.flush()
out.close()
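// Note: the next line builds a raw content:// URI straight from a file path; on
// Android 10+ share targets generally expect a URI issued by a FileProvider
// (FileProvider.getUriForFile), which may be why WhatsApp rejects the file.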
intent.putExtra(Intent.EXTRA_STREAM, Uri.parse("content://" + sharefile))
startActivity(Intent.createChooser(intent, "Image Sharing"))
} else {
//***** for Android versions below Q
intent.putExtra(Intent.EXTRA_STREAM,getImageUrl(this, bitmapCard!!))
startActivity(Intent.createChooser(intent,"Image Sharing"))
}
}
private fun getImageUrl(inContext: Context, inImage: Bitmap): Uri? {
val bytes = ByteArrayOutputStream()
inImage.compress(Bitmap.CompressFormat.JPEG, 100, bytes)
val path = MediaStore.Images.Media.insertImage(inContext.contentResolver,inImage,"Social Media", null)
return Uri.parse(path)
}

How to use the latest SharePoint PnP Core Online in an Azure Function v1

I just followed the instructions here:
https://github.com/Azure/azure-functions-vs-build-sdk
I tried to add a new version of Newtonsoft.Json and then installed the latest SharePointPnPCoreOnline.
It works well in my project, and I could also unit test my Event Grid trigger locally.
But after I deploy to Azure, an error occurs. It seems the function did not load the proper DLL:
Method not found: 'Newtonsoft.Json.Linq.JObject Microsoft.Azure.WebJobs.Extensions.EventGrid.EventGridEvent.get_Data()'.
The error is thrown when this code executes:
[FunctionName("ProcessRequest")]
[Obsolete]
public static void Run([EventGridTrigger] string eventGridEvent, TraceWriter log)
{
EventGridEvent eventGridEventData = JsonConvert.DeserializeObject<EventGridEvent>(eventGridEvent);
var siteCreationInfo = eventGridEventData.Data.ToObject<SiteRequest>();
}
I am very confused about this issue and have tried everything I could think of, but could not find a fix.
Under this condition, if we have to use both of these libraries, it seems we cannot convert the payload to an EventGridEvent object directly:
EventGridEvent eventGridEventData = eventGridEvent.ToObject<EventGridEvent>();
Because of the library conflict, we cannot use this call directly.
We should instead get the keys and values separately:
JObject eventGridData = JObject.Parse(eventGridEvent);
var eventId = eventGridData["id"];
var siteData = eventGridData["data"];
We should do the data conversion in this simple way.
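For example, a minimal sketch (SiteRequest is the type from the question; the shape of the "data" element is an assumption) that materializes only the data element without touching the conflicting EventGridEvent type:
// Parse the raw Event Grid payload ourselves to sidestep the binding conflict.
JObject eventGridData = JObject.Parse(eventGridEvent);
// Convert only the "data" element to the application's own type.
SiteRequest siteCreationInfo = eventGridData["data"].ToObject<SiteRequest>();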
The solution to this issue is to first manually install a newer version of Newtonsoft.Json via NuGet.
Then check the references.
My test project has warnings, but the code runs successfully:
string webTitle = string.Empty;
JObject jObject = JObject.Parse(@"{
'CPU': 'Intel',
'Drives': [
'DVD read/writer',
'500 gigabyte hard drive'
]
}");
try
{
//Create the client context
using (var clientContext = authenticationManager.GetSharePointOnlineAuthenticatedContextTenant(authArray[0], authArray[1], authArray[2]))
{
var web = clientContext.Web;
clientContext.Load(web);
clientContext.ExecuteQuery();
Console.WriteLine(web.Title);
webTitle = web.Title;
}
}
catch (Exception ex)
{
Console.WriteLine("Exception : " + ex.Message);
webTitle = ex.Message;
}
return req.CreateResponse(HttpStatusCode.OK, "Hello " + webTitle + jObject["CPU"]);

Azure SQL Sync is a resource drain on laptop despite frequency setting

We have a SQL Server instance running on a development laptop, and we'd like to deploy multiple laptops and use Azure SQL Data Sync to distribute changes to each user. Performance is too slow with a remote SQL database for our current application, but that's a separate issue. Speed isn't critical, and I don't anticipate collisions between users. I set the sync frequency to 12 hours and conflict resolution to member win.
Everything seems to be working as intended, except that the Microsoft SQL Data Sync 2.0 Windows service process continuously consumes 2%-5% of the CPU and streams data at about 80 kbps. I'm worried this will drain the batteries when the laptops are in the field. Is there a better way to do this?
Here's the resource utilization from the Azure database. I stopped the sync at one point to see whether the frequency setting would automatically restart it (it does not).
You can write your own sync application based on your needs. This and this resource provide good guidance. Below is a sample application.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.SqlClient;
using Microsoft.Synchronization.Data.SqlServer;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization;
namespace SQLAzureDataSync
{
class Program
{
public static string sqlazureConnectionString = "Server=[Your SQL Azure Server].database.windows.net;Database=AdventureWorksLTSQLAzure;User ID=[Your SQL Azure User Name]#[Your SQL Azure Server];Password=[Your SQL Azure Password];Trusted_Connection=False;Encrypt=True;";
public static string sqllocalConnectionString = "Server=(local);Database=AdventureWorksLT2008;Trusted_Connection=True";
public static readonly string scopeName = "alltablesyncgroup";
static void Main(string[] args)
{
// Test if input arguments were supplied:
if (args.Length == 0)
{
System.Console.WriteLine("Please enter an argument.");
System.Console.WriteLine("Usage: SyncTest.exe -setup");
System.Console.WriteLine(" SyncTest.exe -sync");
}
else if (args[0] == "-setup")
Setup();
else if (args[0] == "-sync")
Sync();
}
public static void Setup()
{
try
{
SqlConnection sqlServerConn = new SqlConnection(sqllocalConnectionString);
SqlConnection sqlAzureConn = new SqlConnection(sqlazureConnectionString);
DbSyncScopeDescription myScope = new DbSyncScopeDescription(scopeName);
DbSyncTableDescription Customer = SqlSyncDescriptionBuilder.GetDescriptionForTable("Customer", sqlServerConn);
DbSyncTableDescription Product = SqlSyncDescriptionBuilder.GetDescriptionForTable("Product", sqlServerConn);
// Add the tables from above to the scope
myScope.Tables.Add(Customer);
myScope.Tables.Add(Product);
// Setup SQL Server for sync
SqlSyncScopeProvisioning sqlServerProv = new SqlSyncScopeProvisioning(sqlServerConn, myScope);
if (!sqlServerProv.ScopeExists(scopeName))
{
// Apply the scope provisioning.
Console.WriteLine("Provisioning SQL Server for sync " + DateTime.Now);
sqlServerProv.Apply();
Console.WriteLine("Done Provisioning SQL Server for sync " + DateTime.Now);
}
else
Console.WriteLine("SQL Server Database server already provisioned for sync " + DateTime.Now);
// Setup SQL Azure for sync
SqlSyncScopeProvisioning sqlAzureProv = new SqlSyncScopeProvisioning(sqlAzureConn, myScope);
if (!sqlAzureProv.ScopeExists(scopeName))
{
// Apply the scope provisioning.
Console.WriteLine("Provisioning SQL Azure for sync " + DateTime.Now);
sqlAzureProv.Apply();
Console.WriteLine("Done Provisioning SQL Azure for sync " + DateTime.Now);
}
else
Console.WriteLine("SQL Azure Database server already provisioned for sync " + DateTime.Now);
sqlAzureConn.Close();
sqlServerConn.Close();
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
public static void Sync()
{
try
{
SqlConnection sqlServerConn = new SqlConnection(sqllocalConnectionString);
SqlConnection sqlAzureConn = new SqlConnection(sqlazureConnectionString);
SyncOrchestrator orch = new SyncOrchestrator
{
LocalProvider = new SqlSyncProvider(scopeName, sqlAzureConn),
RemoteProvider = new SqlSyncProvider(scopeName, sqlServerConn),
Direction = SyncDirectionOrder.UploadAndDownload
};
Console.WriteLine("ScopeName={0} ", scopeName.ToUpper());
Console.WriteLine("Starting Sync " + DateTime.Now);
ShowStatistics(orch.Synchronize());
sqlAzureConn.Close();
sqlServerConn.Close();
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
public static void ShowStatistics(SyncOperationStatistics syncStats)
{
string message;
message = "\tSync Start Time :" + syncStats.SyncStartTime.ToString();
Console.WriteLine(message);
message = "\tSync End Time :" + syncStats.SyncEndTime.ToString();
Console.WriteLine(message);
message = "\tUpload Changes Applied :" + syncStats.UploadChangesApplied.ToString();
Console.WriteLine(message);
message = "\tUpload Changes Failed :" + syncStats.UploadChangesFailed.ToString();
Console.WriteLine(message);
message = "\tUpload Changes Total :" + syncStats.UploadChangesTotal.ToString();
Console.WriteLine(message);
message = "\tDownload Changes Applied :" + syncStats.DownloadChangesApplied.ToString();
Console.WriteLine(message);
message = "\tDownload Changes Failed :" + syncStats.DownloadChangesFailed.ToString();
Console.WriteLine(message);
message = "\tDownload Changes Total :" + syncStats.DownloadChangesTotal.ToString();
Console.WriteLine(message);
}
}
}
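Since the underlying concern is battery drain from the always-on sync service, one option this sample enables (a suggestion, with placeholder task name and path) is to run the console application from a scheduled task instead of a resident service, matching the 12-hour frequency from the question:
schtasks /create /tn "AzureDataSync" /tr "C:\Sync\SyncTest.exe -sync" /sc hourly /mo 12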

Saving a stream in TempData MVC5 C#

Hi, I am uploading a large file to the server using ASP.NET MVC5, and in the controller I am saving the stream into TempData. On my local machine it works fine and successfully uploads large files, but on the hosted server it fails. I have tried increasing the MaxAllowedContent and execution-timeout limits, but neither worked. Can anyone help me with this?
Here is my controller action
public ActionResult UploadTransactionPDF()
{
// Declared as locals; accessibility modifiers are not valid inside a method body.
var _PdfImage = new List<MemoryStream>();
var _PdfImageNames = new List<string>();
try
{
Gateway.Instance.Logger.LogInfo(string.Format(_formatProvider,"[TransactionController -> UploadTransactionPDF ]: method started : Save pdf to memory."));
if (CheckFileExist())
{
if (TempData[TransactionRef.PdfFiles.ToString()] != null)
{
_PdfImage.Clear();
_PdfImage.AddRange(TempData[TransactionRef.PdfFiles.ToString()] as List<MemoryStream>);
}
if (TempData[TransactionRef.PdfFileName.ToString()] != null && !string.IsNullOrEmpty(TempData[TransactionRef.PdfFileName.ToString()].ToString()))
{
_PdfImageNames.Clear();
_PdfImageNames.AddRange(TempData[TransactionRef.PdfFileName.ToString()] as List<string>);
}
for (int a = 0; a < Request.Files.Count; a++)
{
if (!_PdfImageNames.Contains(Request.Files[a].FileName))
{
MemoryStream ms = new MemoryStream();
Request.Files[a].InputStream.CopyTo(ms);
_PdfImage.Add(ms);
_PdfImageNames.Add(Request.Files[a].FileName);
}
}
TempData[TransactionRef.PdfFiles.ToString()] = _PdfImage;
TempData[TransactionRef.PdfFileName.ToString()] = _PdfImageNames;
ViewBag.Info = "pdf";
ViewBag.ImageUploadInfo = Language.ImageSavedMessage.ToString();
}
else
{
ViewBag.ImageUploadInfo = Language.NoFileMessage.ToString();
}
ViewBag.FileNames = _PdfImageNames;
Gateway.Instance.Logger.LogInfo(string.Format(_formatProvider, "[TransactionController -> UploadTransactionPDF ] : method exited ."));
return View("_PDFFiles");
}
catch (Exception ex)
{
string errorInfo = string.Format(_formatProvider, "[TransactionController -> UploadTransactionPDF ] : Error : '{0}' occurred while saving pdf to memory.",ex.Message);
Gateway.Instance.Logger.LogError(errorInfo,ex);
throw new Exception(errorInfo, ex);
}
}
Thanks!
Can you check whether the following exists in your web.config in the dev environment versus on the server? ddddd should be the maximum your request should allow. Don't make it too high, or you open yourself up to DoS. The request length is not just the file size but the entire request from the client to the server.
<configuration>
<system.web>
<httpRuntime targetFramework="4.5" maxRequestLength="ddddd" />
</system.web>
</configuration>
You can localize it to the one controller action by adding a location element between the configuration and system.web elements, as in the sketch below.
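For example (a minimal sketch; the path is inferred from the controller and action names in the question, and ddddd is still the placeholder limit):
<configuration>
<location path="Transaction/UploadTransactionPDF">
<system.web>
<httpRuntime targetFramework="4.5" maxRequestLength="ddddd" />
</system.web>
</location>
</configuration>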

GetDeleteCommand DataAdapter with a temporary table in Sybase

I am porting an MSSQL application to Sybase (ASE 15.0) and experiencing a problem when I call GetDeleteCommand.
The error reported is:
Dynamic SQL generation for the DeleteCommand is not supported against
a SelectCommand that does not return any key column information.
The problem only occurs for a temporary table; an identical non-temporary table works fine.
The table contains a primary key.
The problem is reproduced by the test program below.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.OleDb;
using System.Data;
namespace DataAdapterTempTable
{
class Program
{
static void Main(string[] args)
{
String ConnectionString = "Provider=ASEOLEDB;Data Source=devsun3:5003;Initial Catalog=ctc;User ID=aigtac12;Password=aigtac12;"; // sybase connection string
//String ConnectionString = "Provider=SQLOLEDB;Data Source=fiji;Persist Security Info=False;Initial Catalog=nxgn0811;Integrated Security=SSPI"; // mssql connection string
String TableName = "#alex_temporary_table_test"; // does not work for sybase
//String TableName = "alex_real_table_test"; // works for sybase + mssql
String CreateStatement = "create table " + TableName + " (currency_id varchar(4) primary key, rate decimal(25,6), format char(1))";
String SelectStatement = "select * from " + TableName;
try
{
OleDbConnection con = null;
con = new OleDbConnection(ConnectionString);
con.Open();
OleDbCommand cmd = con.CreateCommand();
cmd.CommandType = CommandType.Text;
cmd.CommandText = CreateStatement;
int count = cmd.ExecuteNonQuery();
OleDbCommand cm1 = con.CreateCommand();
cm1.CommandType = CommandType.Text;
cm1.CommandText = SelectStatement;
OleDbDataAdapter DA2 = new OleDbDataAdapter(cm1);
DataTable DT2 = new DataTable();
DA2.FillSchema(DT2, SchemaType.Mapped);
OleDbCommandBuilder cmdbldr = new OleDbCommandBuilder(DA2);
DA2.InsertCommand = cmdbldr.GetInsertCommand();
DA2.DeleteCommand = cmdbldr.GetDeleteCommand(); // this line fails in sybase for temporary table
DA2.UpdateCommand = cmdbldr.GetUpdateCommand();
DA2.Fill(DT2);
}
catch (Exception e)
{
Console.WriteLine(e);
}
}
}
}
In the select statement, use the column names instead of *.
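For example, with the table from the question:
String SelectStatement = "select currency_id, rate, format from " + TableName;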
I contacted Sybase support; it turned out I had to update some system stored procedures. There is a folder that ends with "oledb\sp", and I had to run a .bat file from that folder. I got the latest EBF and ran the batch file install_oledb_sprocs.bat, and the problem went away. Worth mentioning that Sybase 15.5 did not have the issue without patching.
P.S. Thank you to 'aF' for your time looking into the issue.
