AlexHeylin

Members
  • Content Count

    22
  • Joined

  • Last visited

Community Reputation

0 Neutral

My Information

  • Agent Count
    500+

  1. I have PowerShell code that calls RMM+ to send a message to the customer machine. It used to work, but has stopped. Now I get "Session not in specified group, or you do not have permission to perform this operation on it.". Permissions seem OK (and this worked before we upgraded SC and switched to HTTPS). It looks like SC requires a "correct origin header" https://docs.connectwise.com/ConnectWise_Control_Documentation/Developers/External_API_calls_to_ConnectWise_Control but I don't see one set in the RMM+ code. I've tried passing the correct Origin header from PoSh to RMM+, but that doesn't work. Given some of the other posts about SC being breached via an API key without MFA, is there a more secure way to do this anyway? I need to be able to send messages and run commands on the remote machines. Thanks
  2. Want to bet? They've lied to me publicly.
  3. FWIW - this has been formally rejected, even though half the bugs and enhancements remain undone. https://product.connectwise.com/communities/5/topics/12903-open-source-the-linux-macos-agents
  4. The maintenancemode table has been replaced with a view which emulates the data the maintenancemode table used to contain by building it from the new maintenancemodequeue table. However, CW screwed up the SQL: instead of using a stock MySQL function, they tried to calculate some time / datetime values themselves. With long-lasting maintenance modes this can overflow the maximum value of the TIME datatype, which causes the SQL to throw warnings and produce incorrect output. (There's a short SQL sketch of the overflow itself at the end of this list.) Here's a fix, submitted to CW as Ticket#13557433 on 2020-07-02. You'll likely need to reapply it after every upgrade / patch until CW get around to fixing the code in their releases. I wouldn't hold your breath - last time it took two years for them to replace their broken SQL with the fix I gave them. Note - I have not confirmed that the output of this view matches what would have existed in the old maintenancemode table under all conditions; I have focused on just fixing the calculations CW used. To use it, either run / import the direct-import file below, or open the view in your SQL editor, paste the view-content file into it, and click save. maintenancemode-fixed-2020-07-02-direct-import.sql maintenancemode-fixed-2020-07-02-view-content.sql
  5. +1 for Grafana and direct MySQL queries of the DB. You'd better get to know the DB and MySQL quite well before trying to hook Grafana up to it. Prepare a stack of queries which expose the metrics you're after, and get comfortable with SQL - particularly with returning time-bound and ORDERed results - otherwise you'll likely find bolting Grafana on quite painful and frustrating. Grafana expects the data supplied in a certain way, and if you're not comfortable with MySQL (or even if you are!) this can be a bit confusing. One of the biggest challenges for many new Grafana users is getting their head around the idea that most data supplied to Grafana must be a time series, and sometimes it needs to be supplied in a specific order - "ORDER BY `time` ASC". You might want a gauge and expect to supply just a single result with a value in it - but Grafana might expect you to supply the underlying values and let it work it out itself. (A sample time-series query is sketched at the end of this list.) Once you get your head around both, you can have a lot of "fun" visualising the data. One of my favourites shows the temperature of each client's server room, calculated from all the servers in the room, with a sparkline showing the trend since midnight (a limitation of the way the LT DB stores the data). There are a number of Grafana dashboards for CWA / Labtech (search for both) on GitHub etc. I also recommend https://grafana.com/grafana/dashboards/7991 as a good way to keep an eye on how the LT MySQL DB is running. It was actually one of the easiest to install, so I'd start with it before trying to import anything else. Once you've got it up and running and looking sexy (black transparent theme please!), it should spur you on to tackle the more challenging ones. I'd also take a look at how it uses Variables. Heads up - when importing dashboards written on other systems, it's quite common for the datasource names not to get updated to point to your data source (for example, theirs was called "Labtech" and yours is called "MySql-CWA"). If this happens, nothing will work. One way around it is to grab the JSON of the dashboard, open it in your favourite editor, find all the datasource names, and change them to match yours; then paste / upload your updated version into Grafana. I've not found a reliable way to fix a broken dashboard after upload, so delete it, fix the JSON, and upload again.
  6. If anyone is interested in seeing this VBR plugin continue to live, please support my request for it to be open sourced. https://forums.veeam.com/rmm-tools-f35/open-source-labtech-plugin-for-vbr-veb-not-vac-t65163.html The VAC plugin is not a suitable replacement, as it only sees part of our estate, making it useless. We can't change that without significant monthly cost just to try to reproduce what this plugin already does.
  7. If anyone is interested in seeing this VBR plugin continue to live, please support my request for it to be open sourced. https://forums.veeam.com/rmm-tools-f35/open-source-labtech-plugin-for-vbr-veb-not-vac-t65163.html The VAC plugin is not a suitable replacement, as it only sees part of our estate, making it useless. We can't change that without significant monthly cost just to try to reproduce what this plugin already does. For the record, VAC is massively over-sold by Veeam. It doesn't do half the things it is supposed to (like "aggregates data across Veeam products" - only partly true), doesn't do the license reporting they told us it does, and isn't free to use in all cases. Even the invoicing & reporting that is there is full of quirks, as if they wrote it without any reference to the actual products & license types. It might be good in a couple of versions, but right now we can't trust it as it's just wrong a lot of the time - like not invoicing customers for the correct amounts, and reports that say there's no data even though another screen shows the data that's "not available".
  8. You expected something else? Have you dealt with CW before? 🙄
  9. My fix: Reinstall Windows. Brutal, but sometimes the only option.
  10. For a more comprehensive version, and to vote on the enhancement, go here: https://product.connectwise.com/communities/5/topics/10767-linux-agent-to-recover-from-service-crashes
  11. Thanks Darren! I don't understand why LT seem to have made a load of breaking changes, which your script then reverses out to make the files link properly.
  12. I raised this with support for exactly this reason, and they said LT's WMI won't use an alternate namespace. You have to use PowerShell to shim it.
  13. We'd like to copy our production LT DB to a data warehouse DB on a separate server, so we can keep lots of historical data in the warehouse DB but keep the production DB light and fast. We want to be able to connect LT CC to the warehouse DB to run reports etc., so it will need to stay current with things like computers and associated records - but the warehouse DB should also keep eventlogs and other data that's been purged from the prod DB. For example, the prod DB would only have full eventlogs going back 3 days and partial eventlogs going back 10 days (we can do this already) - but we want the warehouse DB to have all eventlogs going back at least 90 days. Has anyone done / seen anything about an install like this? Any gotchas to watch out for if I just do: prod DB -> copy / update (don't delete) row by row in each table -> warehouse DB, using something like Percona Toolkit? (A sketch of the copy semantics is at the end of this list.) Thanks for any advice!
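
Sketch for item 4, showing the overflow rather than the actual fix (the real fix is in the attached files; start_time / end_time are stand-in column names, not necessarily what maintenancemodequeue really uses). MySQL's TIME type maxes out at '838:59:59' (about 35 days), so a hand-rolled duration built as a TIME value silently caps there, while a stock function returning plain seconds has nothing to overflow:

-- Duration as a TIME value: anything past '838:59:59' (~35 days) is
-- clipped and throws a warning - this is how long maintenance modes break.
SELECT TIMEDIFF(end_time, start_time) AS duration
FROM maintenancemodequeue;

-- Stock function returning an integer number of seconds: no TIME
-- intermediate, so nothing to overflow.
SELECT TIMESTAMPDIFF(SECOND, start_time, end_time) AS duration_seconds
FROM maintenancemodequeue;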
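
Sketch for item 5 of the shape Grafana's MySQL datasource wants: epoch time aliased as `time`, an optional metric name, a value, the built-in $__timeFilter() macro for the dashboard's time range, and ascending order. The table and column names are invented for illustration - substitute whatever you actually collect:

SELECT
  UNIX_TIMESTAMP(sample_time) AS `time`,  -- epoch seconds; Grafana's time axis
  location_name AS metric,                -- one series per distinct name
  temperature AS value
FROM server_room_temps
WHERE $__timeFilter(sample_time)          -- expands to the selected time range
ORDER BY `time` ASC;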
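
Sketch for item 13 of the "copy / update, don't delete" semantics in plain MySQL, assuming for illustration that both schemas are visible from one connection and a made-up eventlogs layout with a primary key and a logged_at column. Percona Toolkit does the same job more robustly; this just shows why prod-side purges never reach the warehouse:

-- Copy the last day of event logs into the warehouse. INSERT IGNORE skips
-- rows the warehouse already holds (matched on the primary key), so the
-- job can re-run safely.
INSERT IGNORE INTO warehouse.eventlogs
SELECT *
FROM prod.eventlogs
WHERE logged_at >= NOW() - INTERVAL 1 DAY;

-- No DELETE is ever issued against warehouse.eventlogs, so rows purged
-- from prod.eventlogs stay in the warehouse for the full 90+ days.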