NAKIVO Community Forum

Leaderboard

Popular Content

Showing content with the highest reputation since 08/19/19 in all areas

  1. Hello, community! Let's gather general and more specific tips on how to speed up backups here. Any suggestions and comments will be helpful. Thank you
    3 points
  2. Hello, @Loki Rodriguez, great idea! From my side, I can recommend reading these worthwhile blog posts and references: How to Increase VM Backup Speed; Increasing VM Backup and Replication Speed with Network Acceleration; Network Acceleration; Best Practices for Efficient Hyper-V VM Backup; Advanced Bandwidth Throttling; VMware Backup: Top 10 Best Practices; Hyper-V Best Practices for Administration; An Overview of Microsoft Office 365 Backup Best Practices.
    3 points
  3. When I created this thread, I thought we would mostly be speaking about network load and its critical impact on backups. (Update: yes, I have found this information in one of the references by @Official Moderator.) Reducing the network load is the first thing I would do when trying to increase backup server performance.
    2 points
  4. Hello, @Hakim, I think you should find out more about NAKIVO Backup for ransomware protection: https://www.nakivo.com/ransomware-protection/ Anti-ransomware software alone is not sufficient to protect your organization's data. I also recommend reading the best practices for ransomware protection and recovery: https://www.nakivo.com/ransomware-protection/white-paper/best-practices-for-ransomware-protection-and-recovery/ Sorry for going off-topic, I just wanted to support your remark.
    2 points
  5. Hello, guys, two remarks from me: on the one hand, anti-virus solutions can slow down saving and reading data from a disk; on the other hand, ransomware infections can cause performance degradation. So you should always strike a balance.
    2 points
  6. Hello, @tommy.cash, thank you for your post. Here is a list of questions you should ask yourself before creating the backup policy for your university: 1) Frequency of system/application backups. 2) Frequency of network file backups. 3) Frequency of email backups. 4) Frequency of desktop backups. 5) Storage. 6) Recovery testing. As for Microsoft 365 backup policies, there are built-in configuration options in your account which can help prevent data loss. However, due to Microsoft's Shared Responsibility Model, it is also necessary to have a backup of your Microsoft 365 data using a third-party solution such as NAKIVO Backup & Replication. Please check this useful how-to guide on setting up Microsoft 365 backup policies: https://www.nakivo.com/blog/setting-up-microsoft-office-365-backup-policies/
    2 points
  7. Hi. When is VMware 7 Update 3 support planned?
    2 points
  8. Hi NAKIVO, is it possible to add an MFA option for the web interface? I have some customers where MFA is required for backups. Thanks, Mario
    2 points
  9. @Bedders, please try replacing "q" with "-q": zip -q -d log4j-core*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class Let me know if it works for you!
    2 points
  10. Hi, @JurajZ and @Bedders! NAKIVO Backup & Replication uses the Apache Log4j library, which is part of Apache Logging Services. You can manually fix the CVE-2021-44228 vulnerability by removing JndiLookup.class from libs\log4j-core-2.2.jar. Note: If the libs folder contains log4j-core-fixed-2.2.jar instead of log4j-core-2.2.jar, the issue has already been fixed for your version of NAKIVO Backup & Replication.
     For Linux: Go to the libs folder located inside the NAKIVO Backup & Replication installation folder. To remove JndiLookup.class from the jar file, run the following command: zip -q -d log4j-core*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class (a condensed command sketch for the Linux case follows this post).
     For Windows: Ensure you have the 7z tool installed. Go to the libs folder located inside the NAKIVO Backup & Replication installation folder. Use 7z to open log4j-core-2.2.jar and remove JndiLookup.class from the jar file. Restart NAKIVO Backup & Replication.
     For NAS devices: If you are using a NAS, open an SSH connection to your device and locate the NAKIVO Backup & Replication installation folder here:
     - ASUSTOR NAS: /usr/local/AppCentral/NBR
     - FreeNAS/TrueNAS (inside the jail): /usr/local/nakivo/director
     - NETGEAR NAS: /apps/nbr
     - QNAP NAS: /share/CACHEDEV1_DATA/.qpkg/NBR
     - Raspberry Pi: /opt/nakivo/director
     - Synology NAS: /volume1/@appstore/NBR
     - Western Digital NAS: /mnt/HD/HD_a2/Nas_Prog/NBR
     Note: Refer to the NAS vendor documentation to learn how to open an SSH connection to your NAS device.
     IMPORTANT: CVE-2021-44228 is a severe vulnerability. We strongly advise you to apply the manual fix as soon as you can; this is the best way to avoid the risk of a security breach. Please contact customer support if you require a custom build of NAKIVO Backup & Replication that includes the fix.
    2 points
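     A condensed sketch of the Linux steps above, assuming a default installation path of /opt/nakivo/director (this path is an assumption; adjust it to your actual installation folder):
       # Assumed installation path; use the path that matches your deployment.
       cd /opt/nakivo/director/libs
       # Delete the vulnerable JndiLookup class from the Log4j jar
       # (-q = quiet, -d = delete the named entry from the archive).
       zip -q -d log4j-core*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class
       # Restart the NAKIVO Backup & Replication service afterwards so the change takes effect.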
  11. Hi, @SALEEL! I am very sorry that we put you in such a position. The latest changes in DSM7 were unexpected for everyone, and it makes us very sad that we can't provide you with our best service as usual. Our team has been doing everything to accelerate the process; however, at this point, there is nothing more we can do. I understand your disappointment, and I apologize for all the inconvenience it has caused you. One of our main values is customer satisfaction, which is why this situation is frustrating. If only it depended on us! At the moment I can only offer you the following options:
     1. You can temporarily install NAKIVO Backup & Replication on some other host and use the NAS-based repository as a CIFS/NFS share. When DSM7 is officially supported, it will be possible to install NAKIVO on the NAS again.
     2. Workaround for DSM 7:
     - Deploy a new NAKIVO installation somewhere else as a Virtual Appliance (VA), or install it on a supported VM, physical machine, or NAS;
     - Share the repository folder on the Synology NAS with DSM7 as an NFS/CIFS (SMB) share;
     - Add this repository as an existing repository to the new NAKIVO installation;
     - Restore the configuration of the old NAKIVO installation from the self-backup in the original repository.
     For now, we are working closely with Synology to prepare the release, and we are almost at the final stage. However, we expect that we will need a few more weeks until the final release. Again, I am very sorry for all the inconvenience!
    2 points
  12. Great to get 2FA for the login. I just found the option "Store backups in separate files: Select this option to enable this backup repository to store data of every machine in separate backup files. Enabling this option is highly recommended to ensure higher reliability and performance." Why is this more reliable? Does this have an effect on the global dedup? And can I convert the current backups to this option? From my point of view, the repos are the current bottleneck of NAKIVO; if there is an improvement with this, it is very welcome!
    2 points
  13. Thank you so much for your help. It worked for me using WinSCP.
    2 points
  14. Here is how to fix it for QNAP:
     Log in to the QNAP via SSH as admin, then run:
       cd /share/CACHEDEV1_DATA/.qpkg/NBR/transporter
       ./nkv-bhsvc stop        (Stopping NAKIVO Backup & Replication Transporter service)
       ./bhsvc -b "<UsePassword>"
       ./nkv-bhsvc start       (Starting NAKIVO Backup & Replication Transporter service)
     Use the Chrome browser to open https://<qnapip>:4443 and go to Settings / Transporters / Onboard transporter / Edit.
     Press F12, go to the Console, and type the following JavaScript commands (paste them one by one and press Enter):
       var a=Ext.ComponentQuery.query("transporterEditView")[0]
       a.masterPasswordContainer.show()
       a.connectBtn.show()
     You get new fields in the browser: a "Master password" field and a "Connect" button. In "Master password", enter <ThePasswordFromTheSSHCommand> (the same one used in bhsvc -b "<UsePassword>") and press the Connect button. Then refresh the Transporter and the Repository. by Boris
    2 points
  15. Anyone else have problems with the Grandfather-Father-Son (GFS) retention scheme not working as expected? Daily backups work correctly and the retention is correct, but instead of getting weekly and monthly backups, all my full backups are set to yearly, week after week, at both of my sites where I have tape libraries (they expire 11 years from now). I opened a ticket, but they were not able to tell me anything and claimed that everything was working fine. At the time I was not doing daily backups, so I turned that on and they work, but they didn't affect my problem with yearly backups, so for now I'm turning it off to see what happens with just weekly and monthly backups. These are my settings:
     - Retain one recovery point per day for 7 days
     - Retain one recovery point per week for 4 weeks
     - Retain one recovery point per month for 12 months
     - Retain one recovery point per year for 11 years
     Tape job options: Create full backup: every Saturday. Tape appending: start full backup with an empty tape**
     **I found that now that daily backups are turned on, NAKIVO will start writing daily backups to my full backup tape before I get a chance to eject it. This is not desirable, but there are no options to segregate GFS tapes. Setting this to "Always start with an empty tape" would burn a tape each day, also not desirable, but I may have no choice. I would like to append all daily backups to the same tape, keep it in the library, and only offsite my weekend full backups.
    2 points
  16. In a multi-tenant environment, if the VPN drops for any reason (ISP problem, customer's AC problem, etc.), will the scheduled jobs start on time, or will they wait for the Director's command to start? Thanks and best regards
    2 points
  17. The shortest way to integrate this without going vendor-specific is to add SAML 2.0. This would allow logins based on Microsoft 365 or other MFA vendors.
    2 points
  18. 2 points
  19. Could you please do a video for NAKIVO B&R to Microsoft Azure?
    2 points
  20. It is something that I may be interested in, but I would like to see a "how to" video on the process on your YouTube channel. I saw that you have a "how to" video on Amazon EC2 backups (which is old now, by the way).
    2 points
  21. Thank you. Someone from NAKIVO Support helped me already. This is what I did to increase the Java memory for the NAKIVO application (a non-interactive sketch of the same edit follows this post):
     1. Stop the NAKIVO service from the AppCenter.
     2. Connect to the Synology via SSH using "root".
     3. Edit the nkv-dirsvc file (usually located at /volume1/@appstore/NBR/nkv-dirsvc) and replace:
     SVC_OPTIONS="-Dfile.encoding=UTF-8 -server -XX:+AggressiveOpts -XX:-OmitStackTraceInFastThrow -Xnoagent -Xss320k -Xms128m -Xmx320m"
     with:
     SVC_OPTIONS="-Dfile.encoding=UTF-8 -server -XX:+AggressiveOpts -XX:-OmitStackTraceInFastThrow -Xnoagent -Xss320k -Xms128m -Xmx640m"
     4. Start the NAKIVO service.
     Thank you. Juraj.
    2 points
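     A minimal sketch of the same edit done non-interactively over SSH, assuming the default Synology path /volume1/@appstore/NBR/nkv-dirsvc mentioned above (the path and the current -Xmx320m value are assumptions, so verify them on your system):
       # Stop the NAKIVO service from the AppCenter first, then over SSH as root:
       cp /volume1/@appstore/NBR/nkv-dirsvc /volume1/@appstore/NBR/nkv-dirsvc.bak   # keep a backup copy before editing
       # Raise the Java heap limit in SVC_OPTIONS from 320 MB to 640 MB.
       sed -i 's/-Xmx320m/-Xmx640m/' /volume1/@appstore/NBR/nkv-dirsvc
       # Start the NAKIVO service again from the AppCenter when done.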
  22. Hello, thanks for the support, it works. Regards, Francesco
    2 points
  23. Our team is excited to announce the official release of NAKIVO Backup & Replication v9.1! The most anticipated features are finally available: Backup to Tape, Linux Server Backup, Instant Verification and Windows Workstation Backup. The fully-functional Free Trial version is here https://www.nakivo.com/resources/releases/v9.1/
    2 points
  24. NAKIVO Backup & Replication v9.0 allows you to perform an incremental, application-aware backup of Windows Servers. With our solution, you can instantly recover files as well as application objects! Download our full-featured free trial to try out NAKIVO Backup & Replication for yourself: https://www.nakivo.com/resources/releases/v9.0/
    2 points
  25. @Loki Rodriguez Yes, thank you for the update. You can find the information in my references list. Also, I recommend paying particular attention to the Network Acceleration feature, which is extremely useful in speeding up VM backup and replication jobs.
    1 point
  26. Did anyone try the combination yet? DSM7 and Nakivo 10.5 beta?
    1 point
  27. I thought this was all good to go now. I just updated NAKIVO to 10.5 on a Synology NAS, then updated the NAS to DSM 7.1 Update 1. Now it says "NAKIVO Backup and Replication Stopped. The package version is incompatible with your DSM." The NAKIVO website says you can download the package for DSM 7 only from the Synology Package Center, but there's no option to update NAKIVO from the Package Center.
    1 point
  28. Please add Nakivo Transporter 10.6 to the Synology App Store
    1 point
  29. Hello, Thanks for the clarification. So when do you anticipate Synology will have the signed package deployed? Is there any hotfix or workaround for this?
    1 point
  30. Hi all, I'm hoping someone can help me. I'm looking for an older build of NAKIVO Backup & Replication, specifically 10.2.0 (build 51253), for Synology NAS (i.e., the SPK file). Basically, we're having an issue with a VM backup, and support is advising we upgrade to 10.5.1. Every time we've updated historically there have been issues, so I would like to have our current version on hand in case I need to revert to a known "almost working" configuration. I know this is a long shot! Thanks in advance.
    1 point
  31. That's great. It's still marked as unsupported in Synology's DSM7 third-party packages support matrix, but I found the same topic on their community site confirming that NAKIVO 10.5 is supported on DSM7. I have already moved to DSM7 with NBR 10.5. Thanks
    1 point
  32. Today we tested the 10.5 beta version and are still having the "The VM cannot be processed. cannot find field: vmOpNotificationToAppEnabled (NoSuchFieldException)" issue - no backup at all.
    1 point
  33. Cool idea! Why temporary? I like Docker and being independent of OS restrictions. Thank you. I'll give it a try.
    1 point
  34. Correct, you didn't. In 7.0 U2 they [VMware] uprated/hardened the security requirements for SSH. By making this change, you've reverted that change by VMware.
    1 point
  35. We run our space reclaim each week, but it never gets past about 57-58% before stopping for weekend backups and offsite copies, then starting again the next week. Could we administratively allocate more processing resources to the space reclaim so it gets further at least once, then return them to the default? This is running on a Synology NAS.
    1 point
  36. @Official Moderator just messaged; I had not seen your response here requesting that information
    1 point
  37. Hello, I'd like to ask what the best way is to set up backup to the cloud - Amazon S3. I don't have much Amazon S3 experience but would like to keep the backups fast and the cost low, and I'm not sure how to proceed.
     * If I set up backup with incrementals and weekly full backups, my S3 costs will surely be low because I don't download anything. However, my weekend backups will be fulls and take considerable time. Correct?
     * If I set up backup with incrementals and synthetic fulls, does that work well with S3? Will my backups be faster, or will S3 use my local transporters to merge the backups and create high costs due to downloading from S3?
     * In a local test (not to S3) I saw that backup copy jobs ignore the synthetic setting and revert to full backups - in normal backup jobs it works. Is this true for S3 also?
     I'd love to read some advice from people running backups to S3 at this moment so that I make the right choice. Thanks!
    1 point
  38. The problem is fixed via a remote connection from the great support team... thanks for the fast fix.
    1 point
  39. Please update your page https://www.nakivo.com/vmware-backup/ regarding this - 7 U1.
    1 point
  40. Hi, thank you very much for the information. It is unfortunate that it is not possible in our version, but thanks anyway for the quick reply. Kind regards, Jens
    1 point
  41. Model is an RS2418+ and I currently have the full 10.2 installed.
    1 point
  42. This is absolutely crucial these days. Also, filtering web access based on IP subnet, if possible (I know, we can also do that in the firewall...)
    1 point
  43. Hi Mod, thanks for the info. Just to clarify, I did also follow your linked steps previously. Using my transporter with automatic settings, I deployed a new test Nano instance with a small (8 GB) disk into the same VPC as the real target. That test target *does* back up (using your EC2 job steps). However, another test Nano with a large (100 GB) disk fails with an error. I submitted a support request but haven't seen a reply yet. The real target instance has a 500 GB data volume mounted to it, so I'm very interested in knowing whether size is a limiting factor. My next trials will be to add a volume to my test instance and see if there's some point at which I can't fetch it.
    1 point
  44. There are 3 options for overwrite behavior:
     1. Rename if the item exists
     2. Skip if the item exists
     3. Overwrite the original if the item exists
     Can we request a few more options? For example: skip files older than a specific date. Also, please fix the "skip if item exists" option, as it only compares the folder name; it does not go into the files to check whether they exist. There should be an option to choose between skipping a folder if it exists and skipping individual file items if they exist. That would be helpful for us when doing recovery. Thanks
    1 point
  45. The release notes for NAKIVO 9.4 have the following item: "Removed support for Microsoft Exchange 2010." Can you provide additional information about the change? Can we still back up Exchange 2010 and restore items?
    1 point
  46. 1 point
  47. We will forward your Feature Request to our Product Development team for possible further implementation. Thank you for helping us to make our product better!
    1 point
  48. Thanks a lot for your reply! 1) Full in both. 2) Yes. 3) Perhaps I did the config wrong on site A then. Assigned Transporter: the one from site B, correct? For Location I have the following options: (1) local folder on remote transporter, (2) remote CIFS folder, (3) remote NFS folder. Which option is the correct one if the transporter on site A doesn't have direct access to the NAS on site B? 4) That works fine; that's how backup is working on site B. The reason why I didn't open a support request with NAKIVO is that I am using the free version. Should I do it? Thank you for your time!
    1 point
  49. Please check this link for more details: https://helpcenter.nakivo.com/display/KB/Integrating+with+Azure+Cloud
    1 point