I’ve had a few posts addressing this before and was so happy to finally see that this issue has a quick solution that does NOT rely on Report Viewer/Report Designer (rdlc) and remains portable!
It’s simple using ClosedXML:
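A minimal sketch of the ClosedXML approach, assuming a DataTable source inside an MVC controller (the controller name, sheet title, and file name are illustrative):

using System.Data;
using System.IO;
using ClosedXML.Excel;
using Microsoft.AspNetCore.Mvc;

public class ReportsController : Controller
{
    // turn any DataTable into an XLSX download, no rdlc required
    public FileResult ExportToExcel(DataTable data)
    {
        using (var workbook = new XLWorkbook())
        {
            // Worksheets.Add(DataTable) writes the rows out with headers
            workbook.Worksheets.Add(data, "Report");

            var stream = new MemoryStream();
            workbook.SaveAs(stream);
            stream.Position = 0;

            return File(stream,
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
                "report.xlsx");
        }
    }
}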
Our iSeries (mainframe/as400) team has figured out a way to upgrade all our forms and can easily deposit these forms into a folder on the as400. They are looking for the fastest way to push these forms to our customers.
They ask: Is there a way I (the .NET Core dev) can retrieve their PDF statements from the mainframe (iSeries/AS400) for viewing online? The files are not exposed in Zend (the as400 webserver) but sit in a specific “folder” (if you could call it that; the as400 file system is structured differently).
Though I actively use their Zend system for my own querying API, etc., their team wasn’t much interested in that beyond knowing how I was calling their programs. They wanted this SPECIFIC folder accessed to retrieve statements.
I tried to propose an HTTP web service via Zend, other HTTP/PHP solutions… but in the end, I explained what I am currently doing: using my own API to accept an authentication key with parameters (in JSON, via PHP), convert those parameters to the ones they need, and then call a program. The program I call does the logic and I get a response. (In this case I will get a stream.)
In the end, we decided on a combo of passing parameters via my API (Zend/PHP) and then receiving a file name which I will retrieve via FTP.
I know that I can access files via FTP if given access. I know I can get to the file(s) they want using FTP and C#. I do not like it, and I insist on many, many security measures, but I assured them I could access it via FTP and proved it using an FTP client.
It’s decided: I will use Zend (and PHP) to pass parameters (using post/json) and the response I get will be a file name. That file, I have to retrieve off of that specific path on the as400.
First, I followed the ANSWER to this post: https://stackoverflow.com/questions/57236179/how-to-download-xlsx-file-using-ftpwebrequest-through-c-sharp-asp-net
I didn’t want to download the file to disk, as the second half of that post suggested. I wanted to immediately display (stream) the PDF. This is how I did it:
[HttpGet]
public async Task<Stream> GetStatementAsync()
{
    string url = "ftp://1.1.1.1" + "/%2F" + "thisismyfilesharefolder" + "/%2F";

    //TODO: REPLACE FILE NAME / MAKE DYNAMIC
    string ftpFileName = "MYTESTFILE.PDF";
    var requestUri = new Uri(url + ftpFileName);
    var fileBytes = FtpDownloadBinary(requestUri);
    return new MemoryStream(fileBytes);
}

//TODO: MOVE TO FTPHELPER SERVICE
public byte[] FtpDownloadBinary(Uri fromUri)
{
    //TODO: REPLACE WITH FTPUSER
    string username = "ASKTHESPAMMER";
    string password = "WHOWANTS10K";

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(fromUri);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.Credentials = new NetworkCredential(username, password);
    request.UseBinary = true;
    request.KeepAlive = true;
    request.UsePassive = true;
    //TODO: ENABLE FTPSSL ON AS400 TO MAKE THIS POSSIBLE
    //request.EnableSsl = true;

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    using (var responseStream = response.GetResponseStream())
    using (var memoryStream = new MemoryStream())
    {
        responseStream.CopyTo(memoryStream);
        return memoryStream.ToArray();
    }
}
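One tweak worth noting: returning a raw Stream leaves the content type up in the air. If you want the browser to render the PDF inline, an IActionResult with the content type set works nicely. A small variation on the action above, a sketch using the same helper and the same hardcoded test file:

[HttpGet]
public IActionResult GetStatement()
{
    //TODO: REPLACE FILE NAME / MAKE DYNAMIC
    var requestUri = new Uri("ftp://1.1.1.1/%2Fthisismyfilesharefolder/%2FMYTESTFILE.PDF");
    var fileBytes = FtpDownloadBinary(requestUri);

    // File() sets the Content-Type header so the browser knows to display a PDF
    return File(fileBytes, "application/pdf");
}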
Very raw and in-house; our team is making that FTPUSER profile, etc. We just wanted to know if it was POSSIBLE.
The first error was an obvious authentication error. The standard profile user I tried did not have the same FTP permissions I do. A simple way to verify: I logged on with both users via an FTP client (sorry, I still love FileZilla); one user could get in, the other could not.
Once we established a user with correct permissions, I received another error.
“(501) Syntax error in parameters or arguments”
I got past authentication but kept getting an error somewhere on the response. In truth, the error simply means there’s a syntax error in the REQUEST. The syntax error was a bit hard to find – it ended up being the forward slash in the URL. It looked normal, but the mainframe was rejecting it. If you research this, the posts are so old that many of the links are dead.
Quick tutorial for those still dealing with Mainframe / AS400 / iSeries:
If you feed it a string (url) with a slash, it does NOT understand the slash. You must ESCAPE the slash.
The escape string is this: “/%2F”
For example: google.com/bugs = google.com/%2Fbugs
For devs: string url = "ftp://1.1.1.1" + "/%2F" + "thisismyfilesharefolder" + "/%2F" + filename;
It took forever to find the right escape sequence as so much of the documentation is lost, outdated, etc.
After ftp:// EVERY forward slash must look like this: /%2F
In truth, this is not an “escape” sequence at all: %2F is just the URL-encoded forward slash, and each segment of an FTP URL is treated as a change of directory, so /%2F is effectively changing directories forward. You can also move backward:
https://stackoverflow.com/questions/330155/how-do-you-change-directories-using-ftpwebrequest-net
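Since that string concatenation is easy to fat-finger, I’d pull the convention into a tiny helper. A sketch (the host and folder below are placeholders):

public static Uri BuildAs400FtpUri(string host, params string[] segments)
{
    var sb = new System.Text.StringBuilder("ftp://" + host);
    foreach (var segment in segments)
    {
        // every separator after ftp://host must be written as /%2F
        sb.Append("/%2F").Append(segment);
    }
    return new Uri(sb.ToString());
}

// usage:
// BuildAs400FtpUri("1.1.1.1", "thisismyfilesharefolder", "MYTESTFILE.PDF")
// -> ftp://1.1.1.1/%2Fthisismyfilesharefolder/%2FMYTESTFILE.PDF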
Unfortunately, so many of the links with information regarding this are dead.
I’m going to tag everything I can on this in hopes that some of you still working with mainframes find it. Just lending my support, as I’m sure I might need yours in the future!
Even though all my projects are in .NET Core now, I rarely get the opportunity to use Identity because of my work with our backend Legacy system. Recently, though, I built a very lightweight SEO Management system for one of our sites (that allows a 3rd party to tweak our page titles, meta tags, etc) and wanted to give them user access and roles.
The entire project can be found on GitHub, but below is just a running list of sites I used to get this done, noting all the troubleshooting and stupid little mistakes I did along the way.
This part was relatively easy and the Microsoft docs provided an easy enough guide. I did have some issues:
CS1902 C# Invalid option for /debug; must be full or pdbonly – this hit during the Data Migrations because I didn’t have Entity Framework installed. Basically, all the errors in this phase were not what they presented as – they were mostly because I was missing the packages needed to migrate.
To ensure I had all the right packages installed, I used: Microsoft.AspNetCore.App -Version 2.2.6
Also, in this area, I decided not to put the connection string in appsettings.json, opting instead to use System Environment Variables both in development and in the future on Azure. I changed my IdentityHostingStartup.cs file to this:
public void Configure(IWebHostBuilder builder)
{
    builder.ConfigureServices((context, services) =>
    {
        services.AddDbContext<ApplicationDbContext>(options =>
            options.UseSqlServer(
                System.Environment.GetEnvironmentVariable("SQLAZURECONNSTR_Production")));
    });
}
I wanted to test everything first, with no email confirmation. So, using Microsoft Docs I added this email class. The only change is again, I prefer System Environment Variables, so my EmailSender class differs:
public class EmailSender : IEmailSender
{
    private string apiKey = System.Environment.GetEnvironmentVariable("SENDGRID_APIKEY");

    public Task SendEmailAsync(string email, string subject, string message)
    {
        return Execute(subject, message, email);
    }

    public Task Execute(string subject, string message, string email)
    {
        var client = new SendGridClient(apiKey);
        var msg = new SendGridMessage()
        {
            From = new EmailAddress("webmaster@mycompany.com", "SEO Management"),
            Subject = subject,
            PlainTextContent = message,
            HtmlContent = message
        };
        msg.AddTo(new EmailAddress(email));

        // Disable click tracking.
        // See https://sendgrid.com/docs/User_Guide/Settings/tracking.html
        msg.SetClickTracking(false, false);

        return client.SendEmailAsync(msg);
    }
}
Once I finished with the Microsoft Docs email setup, I started the project and registered one user (under the email I want to be super admin).
Now, I needed to assign that user the role of super admin and also create other user roles. I used an existing answer to get started, adding both methods to Startup.cs in the Configure method.
app.UseMvc(routes =>
{
    routes.MapRoute(
        name: "default",
        template: "{controller=Home}/{action=Index}/{id?}");
});

// this is just to seed the admin role, run once
//CreateAdminRole(serviceProvider).Wait();
}

// this is just to seed the admin role, run once
private async Task CreateAdminRole(IServiceProvider serviceProvider)
{
    var RoleManager = serviceProvider.GetRequiredService<RoleManager<IdentityRole>>();
    var UserManager = serviceProvider.GetRequiredService<UserManager<ApplicationUser>>();
    IdentityResult roleResult;

    var roleCheck = await RoleManager.RoleExistsAsync("Admin");
    if (!roleCheck)
    {
        roleResult = await RoleManager.CreateAsync(new IdentityRole("Admin"));
    }

    ApplicationUser user = await UserManager.FindByEmailAsync("myemail@mycompany.com");
    await UserManager.AddToRoleAsync(user, "Admin");
}
After following that guide, the one error I kept getting here was: No service for type ‘Microsoft.AspNetCore.Identity.RoleManager’.
and another: Unable to resolve service for type ‘Microsoft.AspNetCore.Identity.IRoleStore`1[Microsoft.AspNetCore.Identity.IdentityRole]’ while attempting to activate ‘Microsoft.AspNetCore.Identity.RoleManager`1[Microsoft.AspNetCore.Identity.IdentityRole]’.
The first was because I simply forgot to add .AddRoles to the DefaultIdentity configuration. The second error means that ORDER MATTERS: AddRoles goes before AddEntityFrameworkStores.
Both resolved with this:
services.AddDefaultIdentity<ApplicationUser>()
    .AddRoles<IdentityRole>()
    .AddEntityFrameworkStores<ApplicationDbContext>();
Finishing with the guide, I discovered another page that needed changing: _ManageNav, which also has a SignInManager @inject (like _LoginPartial) that needed to be changed:
@inject SignInManager<ApplicationUser> SignInManager
And then: InvalidOperationException: Unable to resolve service for type ‘Microsoft.AspNetCore.Identity.UserManager`1[
Which, again, is all pointing to the use of IdentityUser instead of your new ApplicationUser. You’ll need to go through each of your Razor pages and change IdentityUser to ApplicationUser.
Which will lead to you scaffolding the views (if you hadn’t already) …
Now that the setup was working, I wanted a better look at my pages and views. By default, in Core 2.1 the UI comes in a prebuilt package, but you can easily scaffold it to view and change as you like: ASP.NET Core 2.2 – Scaffold Identity UI. I even ran the scaffolder a second time over my first steps and it recognized all the previous code and just scaffolded nicely, no harm done.
At this point, any user can register, but they can not log in unless they confirm their email. Once they DO confirm their email, they can log in but have no access to any of the pages because they have yet to be assigned a role. The administrator must do this before they can continue.
Using the same method that I used to create the seed admin, I added a method to startup to seed the user roles:
// this is just to seed user roles, run once
private async Task CreateUserRoles(IServiceProvider serviceProvider)
{
    var RoleManager = serviceProvider.GetRequiredService<RoleManager<IdentityRole>>();
    IdentityResult roleResult;

    var roleCheck = await RoleManager.RoleExistsAsync("SEOAdmin");
    if (!roleCheck)
    {
        roleResult = await RoleManager.CreateAsync(new IdentityRole("SEOAdmin"));
    }

    roleCheck = await RoleManager.RoleExistsAsync("SEOManager");
    if (!roleCheck)
    {
        roleResult = await RoleManager.CreateAsync(new IdentityRole("SEOManager"));
    }
}
Then, I added:

CreateUserRoles(serviceProvider).Wait();

to the Configure method in Startup.cs, ran it once, and the roles were set.
Then, I wanted a page where I could see existing users and decide their role before they got access.
To do this, I added a ManageUsersController and retrieved Users and Roles so that I could assign roles. I created a User Role View Model and a User View Model to reflect the dropdowns, and added a corresponding view.
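The full controller is on GitHub, but the shape of it is roughly this (a sketch; the view model and action names here are illustrative, not the exact code in the repo):

using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Identity;
using Microsoft.AspNetCore.Mvc;

public class UserViewModel
{
    public string Id { get; set; }
    public string Email { get; set; }
}

public class ManageUsersController : Controller
{
    private readonly UserManager<ApplicationUser> _userManager;
    private readonly RoleManager<IdentityRole> _roleManager;

    public ManageUsersController(UserManager<ApplicationUser> userManager,
                                 RoleManager<IdentityRole> roleManager)
    {
        _userManager = userManager;
        _roleManager = roleManager;
    }

    // list every registered user, with the available roles for the dropdowns
    public IActionResult Index()
    {
        var users = _userManager.Users
            .Select(u => new UserViewModel { Id = u.Id, Email = u.Email })
            .ToList();
        ViewBag.Roles = _roleManager.Roles.Select(r => r.Name).ToList();
        return View(users);
    }

    // assign the selected role to the selected user
    [HttpPost]
    public async Task<IActionResult> AssignRole(string userId, string role)
    {
        var user = await _userManager.FindByIdAsync(userId);
        if (user != null && await _roleManager.RoleExistsAsync(role))
        {
            await _userManager.AddToRoleAsync(user, role);
        }
        return RedirectToAction(nameof(Index));
    }
}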
That was the last step to getting this little SEO backend together.
All packaged together, the entire project can be seen on GitHub.
As noted, I work with Legacy and often have to bring in variables from the API that must be sustained across the session (and I’m sure there might be a better way – comment and advise!). Where I am now: I query the API and bring in the variables, but how do I keep from calling these over and over? The old solution was session variables, and so that’s where I am.
When I started to do this on Core, the most helpful article was this (and it’s in my comments):
https://adamstorr.azurewebsites.net/blog/are-you-registering-ihttpcontextaccessor-correctly
He leads you through the basic setup of an HttpContext helper class (that I still use today) and how to configure the startup. Today, though, I came across a problem: I was able to Set session variables, but the Get was pulling null.
Order. Yes, you’ll see 1000 Stack Overflow responses about order in Configure (and I was careful to do this in that method), but now in ConfigureServices (contrary to the example, as I am now using Core 2.2?), order again comes into play:
public void ConfigureServices(IServiceCollection services)
{
    //this MUST be before AddMvc or session Get returns null
    services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();
    services.AddMvc();
}
How does the error present itself? Debugging looks great, queries to the API are fine, setting session (cookies) is fine, the result is unexpected, but… no errors. Trace your Get. My Session.GetString was pulling null.
Switch the order in ConfigureServices and all was fine.
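For anyone wiring this up fresh, the helper pattern from that article boils down to something like the sketch below. The class name is mine; note that session itself also needs services.AddDistributedMemoryCache(), services.AddSession(), and app.UseSession() in the pipeline:

using Microsoft.AspNetCore.Http;

public class SessionHelper
{
    private readonly IHttpContextAccessor _accessor;

    public SessionHelper(IHttpContextAccessor accessor)
    {
        _accessor = accessor;
    }

    // wraps the Set so callers never touch HttpContext directly
    public void Set(string key, string value) =>
        _accessor.HttpContext.Session.SetString(key, value);

    // returns null if the key was never set (or if registration order bit you)
    public string Get(string key) =>
        _accessor.HttpContext.Session.GetString(key);
}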
I’ve recently been curious about switching to a time API for my timestamps and removing any dependency the app might have on the server for a timestamp. Upon Googling I found some paid services, some free, and of the free ones, I noticed one was hosted on Heroku. I’ve heard of Heroku, but never had a reason to attempt to use it. This was the perfect chance.
First, I created a free account on Heroku, nothing special. After verifying my email, I logged in to my Heroku Dashboard and, up in the right hand corner, selected Create New App. I named it mycompany-api and out popped an app.
I decided on just plain, legacy PHP and a simple DateTime string passed through json_encode, just to get started. No authentication, no timezone, just a simple spit out if a request to the site came, like this:
<?php
header('Content-type:application/json;charset=utf-8');
$DateTime = new DateTime();
$currentTime = $DateTime->format("Y-m-d H:i:s");
echo json_encode($currentTime);
?>
I created a Git repo for this brand new file and pushed it out. Then, I went back to Heroku, Dashboard, My App and Deploy. I selected GitHub as my deploy “resource” and selected the new repo I had just made, along with the corresponding branch.
I hit manual deploy and Heroku runs off to my GitHub repo, grabs the code, compiles and publishes.
It failed.
My first problem was that Heroku could not determine what language I had chosen for my app (you’d think the <?php would give it away …). You need one of two things: a composer.json file or an index.php file (for legacy, like mine). I renamed my file to index.php and all I needed now was a buildpack.
To add a buildpack, I went back to Heroku, Dashboard, My App and Settings. Under Buildpacks, I added one for “php”. Save settings and done.
I went back to Deploy, Manual Deploy and had a successful output. Yay! First Heroku app!
I want to make sure this API is receiving and sending JSON, so there are a few IFs the request must pass before it hits the logic on my PHP page. I also want to (lightly) secure the requests made to this API and monitor our usage of it for metrics (and future investment). Since this itty-bitty API relies on just one index.php file, I figure it can act as a sort of “router” for future APIs. So, this is what I added to the final PHP file:
The final, simple 1 page PHP Heroku API:
<?php
header('Content-type:application/json;charset=utf-8');

//Make sure that it is a POST request.
if (strcasecmp($_SERVER['REQUEST_METHOD'], 'POST') != 0) {
    throw new Exception('Request method must be POST!');
}

//Make sure that the content type of the POST request has been set to application/json
$contentType = isset($_SERVER["CONTENT_TYPE"]) ? trim($_SERVER["CONTENT_TYPE"]) : '';
$contentIsJson = strpos($contentType, "application/json");
if ($contentIsJson === false) {
    throw new Exception('Content type must be: application/json');
}

//Receive the RAW post data.
$content = trim(file_get_contents("php://input"));

//Attempt to decode the incoming RAW post data from JSON.
$decoded = json_decode($content, true);
$app = strtoupper($decoded['API']);
$key = $decoded['APIKEY'];

//verify user key - simple MD5 generator: http://onlinemd5.com/. will build user management for keys if ever needed
if ($key == "BEAF1CB722A3F7758C7A7FA43F6BF2D1") {
    switch ($app) {
        case "TIME":
            $jsonString = getTime();
            $arr = array('datetime' => $jsonString);
            break;
        default:
            $arr = array('error' => "Unknown Request On API");
            break;
    }
    echo json_encode($arr);
}

//return the current time
function getTime()
{
    $DateTime = new DateTime();
    //by default heroku returns time in UTC - can change in dashboard, config vars, only use as needed below
    //$DateTime->modify('-6 hours');
    $currentTime = $DateTime->format("Y-m-d H:i:s");
    $jsonString = $currentTime;
    return $jsonString;
}
?>
I used Postman to send a raw JSON request to my Heroku app (using their default/free URL). I wanted to make sure all my problems were resolved with this new toy first, and then move on. Here’s what the raw request and response look like in Postman:
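The request body is just the key and the API name, and the response comes back as a single datetime field, something like this (the timestamp value is illustrative):

Request:
{ "APIKEY": "BEAF1CB722A3F7758C7A7FA43F6BF2D1", "API": "time" }

Response:
{ "datetime": "2019-01-01 12:00:00" }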
Heroku + my PHP are responding nicely!
So here’s how I did the same request and received the response in C# (I use dotnet Core):
public async Task<string> OurHerokuAPI()
{
    string reqUrl = "https://mycompany-api.herokuapp.com";

    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders
            .Accept
            .Add(new MediaTypeWithQualityHeaderValue("application/json"));
        try
        {
            var query = new { APIKEY = "BEAF1CB722A3F7758C7A7FA43F6BF2D1", API = "time" };
            var asJson = JsonConvert.SerializeObject(query);
            HttpResponseMessage response = await client.PostAsync(reqUrl,
                new StringContent(asJson, Encoding.UTF8, "application/json"));
            if (response.IsSuccessStatusCode)
            {
                var definition = new { datetime = string.Empty };
                var json = JsonConvert.DeserializeAnonymousType(
                    await response.Content.ReadAsStringAsync(), definition);

                // "time" and "success" are class-level fields shared by the
                // fallback time methods described below
                time = json.datetime;

                //monitors success across various "time" api's. in case this particular one
                //fails, there can be various backups until the success flag returns true.
                success = true;
            }
        }
        catch (OperationCanceledException)
        {
            //TODO: CREATE ERROR MESSAGE SEND BACK TO EMAIL/ERROR
        }
    }
    return time;
}
If you read the notes: yes, I decided to have some fun and create maybe two other functions, just like the one above, where I query some “free” time APIs until that “success” flag turns true. After x tries, I reluctantly call my last function, which gets the server timestamp. As said, I wanted my timestamp to be independent of the server. So, if it hits this last method, I also send myself an error email that the time API is failing.
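In outline, the chain looks something like this sketch (SecondTimeApi is a made-up stand-in for another free time API; time and success are the shared class-level fields from above):

public async Task<string> GetTimestampAsync()
{
    success = false;

    // try the independent time APIs first, in order of preference
    time = await OurHerokuAPI();
    if (!success) time = await SecondTimeApi(); // hypothetical backup source

    // last resort: server clock, plus an alert that the time APIs failed
    if (!success)
    {
        time = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss");
        //TODO: SEND ERROR EMAIL - TIME APIS ARE FAILING
    }
    return time;
}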
In the future, I could use environment variables (config vars) more wisely, instead of hardcoding. There’s also so much clean up to do, but this was a very fun intro into Heroku!
The idea behind this was to create a nice, easy UI where users can download media files they request often. We moved it to Azure to keep from killing our on-prem bandwidth, but then I had to deal with the flat file structure, etc. The end result was simple: a fast search of all the blobs (with links) and, underneath that, a tree structure of the blobs that they can browse through.
First, I created a Storage container through Azure Portal, then I used Azure Storage Explorer to create a Blob Container under that storage account. I also set read-only permissions to my blobs by right clicking the container in Azure Storage Explorer, then: Set Container Public Access Level > Public Read Access for Blobs Only.
To make this code work, I needed to setup an environment variable for my connection string. I used the prefix CUSTOMCONNSTR_ on my variable name as it comes in handy when deploying to Azure Web Apps. To get the connection string: Azure Portal > Storage Account you created > Access Keys.
setx CUSTOMCONNSTR_storageConnectionString "<yourconnectionstring>"
Finally, I got a folder I wanted to share and dragged and dropped it into my Container using Azure Storage Explorer.
I used .NET Core to query the container and list the blobs, with segments. I then looped through that list, creating a formatted JSON string that I could feed to zTree (I preferred the Font-Awesome styling shown here). I also simultaneously created a list of the blobs into formatted JSON that I could also feed to Ajax for a quick search.
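The listing loop itself looks roughly like the sketch below, assuming the WindowsAzure.Storage package of that era; the container name is a placeholder, and shaping the results into zTree’s JSON is left out:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public async Task<List<string>> ListBlobUrisAsync()
{
    // same environment variable set with setx above
    var account = CloudStorageAccount.Parse(
        Environment.GetEnvironmentVariable("CUSTOMCONNSTR_storageConnectionString"));
    var container = account.CreateCloudBlobClient()
        .GetContainerReference("media"); // placeholder container name

    var uris = new List<string>();
    BlobContinuationToken token = null;
    do
    {
        // flat listing returns every blob; the "folders" are just name prefixes
        var segment = await container.ListBlobsSegmentedAsync(
            null, true, BlobListingDetails.None, null, token, null, null);
        token = segment.ContinuationToken;
        uris.AddRange(segment.Results.OfType<CloudBlockBlob>()
            .Select(b => b.Uri.ToString()));
    } while (token != null);

    return uris;
}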
Sharing Azure Blobs on Azure Web Services
When deploying to Azure Web App Services: Web App > Application Settings > Connection Strings; name: storageConnectionString, value: the connection string copied from the storage account, type: Custom