NAKIVO Community Forum

Leaderboard

Popular Content

Showing content with the highest reputation since 08/19/19 in all areas

  1. I've just uploaded almost 20GB of very important data to OneDrive. Documents, xlsx, pdfs mostly... Now I wonder about OneDrive's security and reliability. What is your point of view? Should I keep one more copy in another place?
    3 points
  2. Hello, community, let's gather here general and more specific tips on how to speed up backups. Any suggestions and comments will be helpful. Thank you
    3 points
  3. Hello, @Loki Rodriguez, great idea! From my side I can recommend reading these worthwhile blog posts and references: "How to Increase VM Backup Speed", "Increasing VM Backup and Replication Speed with Network Acceleration", "Network Acceleration Best Practices for Efficient Hyper-V VM Backup", "Advanced Bandwidth Throttling", "VMware Backup: Top 10 Best Practices", "Hyper-V Best Practices for Administration", "An Overview of Microsoft Office 365 Backup Best Practices".
    3 points
  4. Hi Nakivo, is it possible to add an MFA option for the web interface? I have some customers where MFA is required for the backup. Thanks, Mario
    2 points
  5. It is not available in the PRO license!
    2 points
  6. @Official Moderator Nice tips. One more from me - use two-factor verification. It is a must
    2 points
  7. Hello, @Tayler Odling, of course, if your data is important to you, keep it in multiple places. I recommend having at least three copies of it, saving them to two different types of media, and keeping at least one copy offsite. As for Microsoft OneDrive, in my view it is as safe as any other storage and it does provide encryption, but you should not rely on only one spot. Please find great tips on OneDrive data safety and a list of the common mistakes that can affect your cloud security here: https://www.nakivo.com/blog/microsoft-onedrive-security/ Let me know if you need any additional assistance from me. I look forward to hearing from you, Tayler.
    2 points
  8. When I created this thread, I thought we would mostly be speaking about network load and its critical impact on backups. (Update: Yes, I have found this information in one of the references by @Official Moderator.) Reducing the network load is the first thing I would do when you want to increase backup server performance.
    2 points
  9. Hello, @Hakim, I think you should find out more about NAKIVO Backup for Ransomware Protection: https://www.nakivo.com/ransomware-protection/ Anti-ransomware software alone is not sufficient to protect your organization's data. I also recommend reading the best practices for ransomware protection and recovery: https://www.nakivo.com/ransomware-protection/white-paper/best-practices-for-ransomware-protection-and-recovery/ Sorry for going off-topic, I just wanted to support your remark.
    2 points
  10. Hello, guys, two remarks from me: on the one hand, anti-virus solutions can slow down saving data to and reading data from a disk, but on the other hand, ransomware issues can cause performance degradation. So you should always keep the balance.
    2 points
  11. Hello, @tommy.cash, thank you for your post. Here is a list of the questions you should ask yourself before creating the backup policy for your university: 1) Frequency of system/application backups. 2) Frequency of network file backups. 3) Frequency of email backups. 4) Frequency of desktop backups. 5) Storage. 6) Recovery testing. As for Microsoft 365 backup policies, there are built-in configuration options in your account that can help prevent data loss. Under Microsoft's Shared Responsibility Model, it is also necessary to have a backup of your Microsoft 365 data using a third-party solution such as NAKIVO Backup & Replication. Please check this useful how-to guide on setting up Microsoft 365 backup policies: https://www.nakivo.com/blog/setting-up-microsoft-office-365-backup-policies/
    2 points
  12. @Bedders, Please try replacing "q" with "-q": zip -q -d log4j-core*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class Let me know if it works for you!
    2 points
  13. Hi, @JurajZ and @Bedders! NAKIVO Backup & Replication uses the Apache Log4j library, which is part of Apache Logging Services. You can manually fix the CVE-2021-44228 vulnerability by removing JndiLookup.class located in libs\log4j-core-2.2.jar.
      Note: If the libs folder contains log4j-core-fixed-2.2.jar instead of log4j-core-2.2.jar, the issue has already been fixed for your version of NAKIVO Backup & Replication.
      For Linux: Go to the libs folder inside the NAKIVO Backup & Replication installation folder. To remove JndiLookup.class from the jar file, run the following command: zip -q -d log4j-core*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class
      For Windows: Ensure you have the 7z tool installed. Go to the libs folder inside the NAKIVO Backup & Replication installation folder. Use 7z to open log4j-core-2.2.jar and remove JndiLookup.class from the jar file. Restart NAKIVO Backup & Replication.
      For NAS devices: If you are using a NAS, open an SSH connection to your device and locate the NAKIVO Backup & Replication installation folder here:
      For ASUSTOR NAS: /usr/local/AppCentral/NBR
      For FreeNAS/TrueNAS (inside the jail): /usr/local/nakivo/director
      For NETGEAR NAS: /apps/nbr
      For QNAP NAS: /share/CACHEDEV1_DATA/.qpkg/NBR
      For Raspberry Pi: /opt/nakivo/director
      For Synology NAS: /volume1/@appstore/NBR
      For Western Digital NAS: /mnt/HD/HD_a2/Nas_Prog/NBR
      Note: Refer to the NAS vendor documentation to learn how to open an SSH connection to your NAS device.
      IMPORTANT: CVE-2021-44228 is a severe vulnerability. We strongly advise you to apply the manual fix as soon as you can; this is the best way to avoid the risk of a security breach (a quick verification sketch follows this post). Please contact customer support if you require a custom build of NAKIVO Backup & Replication that includes the fix.
    2 points
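    A quick way to double-check the Linux/NAS fix above (a hedged sketch, not an official NAKIVO procedure; it assumes the standard unzip and grep utilities are available on your system, and the path below is a placeholder you need to replace with your own libs folder):
      cd /path/to/NBR/libs    # placeholder: use the libs folder of your own installation
      # List the jar contents and look for the vulnerable class;
      # empty output means JndiLookup.class was removed successfully.
      unzip -l log4j-core*.jar | grep -i JndiLookup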
  14. Hi, @SALEEL! I am very sorry that we put you in such a position. The latest improvements in DSM 7 were unexpected for everyone, and it makes us very sad that we can't provide you with our best service as usual. Our team has been doing everything to accelerate the process; however, at this point, there is nothing more we can do. I understand your disappointment, and I apologize for all the inconvenience it has caused you. One of our main values is customer satisfaction, which is why this situation is so frustrating. If only it depended on us! At the moment I can only offer you the following options:
      1. You can temporarily install NAKIVO Backup & Replication on some other host and use the NAS-based repository as a CIFS/NFS share. When DSM 7 is officially supported, it will be possible to install NAKIVO on the NAS again.
      2. Workaround for DSM 7:
      - Deploy a new NAKIVO installation somewhere else as a Virtual Appliance (VA) or install it on a supported VM, physical machine or NAS;
      - Share the repository folder on the Synology NAS with DSM 7 as an NFS / CIFS (SMB) share;
      - Add this repository as an existing repository to the new NAKIVO installation;
      - Restore the configuration of the old NAKIVO installation from the Self-Backup in the original repository.
      For now, we are working closely with Synology to prepare the release, and we are almost at the final stage. However, we expect that we will need a few more weeks until the final release. Again, I am very sorry for all the inconvenience!
    2 points
  15. Hi. When is VMware 7 Update 3 support planned?
    2 points
  16. Great to get 2FA for the login. I just found the option "Store backups in separate files: Select this option to enable this backup repository to store data of every machine in separate backup files. Enabling this option is highly recommended to ensure higher reliability and performance." Why is this more reliable? Does this have an effect on the global dedup? And can I convert the current backups to this option? From my point of view, the repos are the current bottleneck of Nakivo. If there is an improvement with this, it is very welcome!
    2 points
  17. Thank you so much for your help. It worked for me using WinSCP.
    2 points
  18. Here is how to fix it... for QNAP. Log in to the QNAP via SSH as admin and run:
      cd /share/CACHEDEV1_DATA/.qpkg/NBR/transporter
      # ./nkv-bhsvc stop
      Stopping NAKIVO Backup & Replication Transporter service:
      [/share/CACHEDEV1_DATA/.qpkg/NBR/transporter] # ./bhsvc -b "<UsePassword>"
      [/share/CACHEDEV1_DATA/.qpkg/NBR/transporter] # ./nkv-bhsvc start
      Starting NAKIVO Backup & Replication Transporter service:
      Use the Chrome browser and open https://<qnapip>:4443
      Go to Settings / Transporters / Onboard transporter / Edit.
      Press F12, go to the Console and enter the following JavaScript commands (paste them one by one and press Enter):
      var a=Ext.ComponentQuery.query("transporterEditView")[0]
      a.masterPasswordContainer.show()
      a.connectBtn.show()
      You will get new fields in the browser: a Master password field and a Connect button.
      In Master password, enter <ThePasswordFromTheSSHCommand> (the same one used in the bhsvc -b "<UsePassword>" command above), then press the Connect button.
      Then refresh the Transporter and the Repository.
      by Boris
    2 points
  19. Anyone else have problems with the Grandfather-Father-Son (GFS) retention scheme not working as expected? Daily backups work correctly and the retention is correct, but instead of getting weekly and monthly backups, all my full backups are set to yearly, week after week, at both of my sites where I have tape libraries. (They expire 11 years from now.) I opened a ticket, but they were not able to tell me anything and claimed that everything was working fine. At the time I was not doing daily backups, so I turned that on and they work, but they didn't affect my problem with yearly backups, so for now I'm turning it off to see what happens with just weekly and monthly backups. These are my settings:
      Retain one recovery point per day for 7 days
      Retain one recovery point per week for 4 weeks
      Retain one recovery point per month for 12 months
      Retain one recovery point per year for 11 years
      Tape Job Options: Create Full Backup: Every Saturday; Tape appending: Start full backup with an empty tape**
      **I found that now that daily backups are turned on, Nakivo will start writing daily backups to my full backup tape before I get a chance to eject it. This is not desirable, but there are no options to segregate GFS tapes. Setting this to "Always start with an empty tape" would burn a tape each day, which is also not desirable, but I may have no choice. I would like to append all daily backups to the same tape, keep it in the library, and only offsite my weekend full backups.
    2 points
  20. In a multi-tenant environment, if the VPN drops for any reason (ISP problem, customer's AC problem, etc.), will the scheduled jobs start on time, or will they wait for the Director's command to start? Thanks and best regards
    2 points
  21. The shortest way to integrate this without going vendor-specific is adding SAML 2.0. This would allow logins based on Microsoft 365 or other MFA vendors.
    2 points
  22. 2 points
  23. Could you please do a video for Nakivo B&R to Microsoft Azure?
    2 points
  24. It is something that I may be interested in, but I would like to see a "how to" video on the process on your YouTube channel. I saw that you have a "how to" video on Amazon EC2 backups (which is old now, by the way).
    2 points
  25. Thank you. Someone from Nakivo Support helped me already. This is what I did to increase the Java memory for the NAKIVO application:
      1. Stop the NAKIVO service from the AppCenter.
      2. Connect to the Synology via SSH as "root".
      3. Edit the nkv-dirsvc file (usually located in /volume1/@appstore/NBR/nkv-dirsvc) and replace:
      SVC_OPTIONS="-Dfile.encoding=UTF-8 -server -XX:+AggressiveOpts -XX:-OmitStackTraceInFastThrow -Xnoagent -Xss320k -Xms128m -Xmx320m"
      with:
      SVC_OPTIONS="-Dfile.encoding=UTF-8 -server -XX:+AggressiveOpts -XX:-OmitStackTraceInFastThrow -Xnoagent -Xss320k -Xms128m -Xmx640m"
      4. Start the NAKIVO service. (A one-line alternative for step 3 is sketched after this post.)
      Thank you. Juraj.
    2 points
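    If you prefer not to edit nkv-dirsvc by hand, step 3 above could also be done with a one-liner (a rough sketch only, assuming the file path from the post and that the current heap setting is -Xmx320m; keep a backup copy of the file and stop the NAKIVO service first):
      cp /volume1/@appstore/NBR/nkv-dirsvc /volume1/@appstore/NBR/nkv-dirsvc.bak   # keep a backup copy
      sed -i 's/-Xmx320m/-Xmx640m/' /volume1/@appstore/NBR/nkv-dirsvc              # raise the Java heap limit
      # then start the NAKIVO service again from the AppCenter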
  26. Hello, thanks for the support, it works. Regards, Francesco
    2 points
  27. Our team is excited to announce the official release of NAKIVO Backup & Replication v9.1! The most anticipated features are finally available: Backup to Tape, Linux Server Backup, Instant Verification and Windows Workstation Backup. The fully-functional Free Trial version is here https://www.nakivo.com/resources/releases/v9.1/
    2 points
  28. NAKIVO Backup & Replication v9.0 allows you to perform an incremental, application-aware backup of Windows Servers. With our solution, you can instantly recover files as well as application objects! Download our full-featured free trial to try out NAKIVO Backup & Replication for yourself: https://www.nakivo.com/resources/releases/v9.0/
    2 points
  29. Is Nakivo 10.6.1.67016-3004 fully supported with DSM 7.1-42661 Update 4? It shows in the Package Center as having an update available after I updated my Synology NAS.
    1 point
  30. Sorry for the overly general topic, but can you give some advice on how to back up VMware virtual machines? I am a beginner and I need more helpful tips. Thanks
    1 point
  31. 1 point
  32. Great and helpful blog post. Thank you
    1 point
  33. @Official Moderator, will there be an alternative to the drop-down selection for S3-compatible storage, so we can key in the S3-compatible object storage address and other information instead of having Wasabi and AWS as the only options?
    1 point
  34. @Loki Rodriguez Yes, thank you for the update. You can find the information in my list of references. Also, I recommend paying particular attention to the Network Acceleration feature, which is extremely useful for speeding up VM backup and replication jobs.
    1 point
  35. 1 point
  36. Hi, detaching/attaching the backup repository seems a good solution to protect against data corruption. Why don't you add this feature directly to the backup job or backup copy job?
    1 point
  37. Hi. Before I update to DSM 7.0.1-42218 Update 2, I was wondering if it will cause issues with Nakivo 10.4.0 (build 56979 from 30 Jul 2021). Do I need to update Nakivo first?
    1 point
  38. @Yosmar, Hi! To fix the "VM consolidation needed" status, do the following: Go to the VMware vSphere Client. Click the VM name > Snapshots > Consolidate. You will see a confirmation message: "This operation consolidates all redundant redo logs on your virtual machine. Are you sure you want to continue?" Click Yes to consolidate the VM disk files. Note: If you are worried that you might lose data during disk consolidation, back up your VMware VM first. Wait until the task is completed; you can see the progress at the bottom of your VMware vSphere Client. If the VM is running, VM performance can degrade during this operation. The time needed depends on the virtual machine size, the number of snapshots and your VM load. After the VMware disk consolidation is finished, the error message will disappear. However, you might get another error message; check the guide on how to solve the potential errors in this blog post: https://www.nakivo.com/blog/fix-vmware-error-virtual-machine-disks-consolidation-needed/ Let me know if you need any further help.
    1 point
  39. Solved! I've found the problem: I think there is a config issue in the factory-default TCP settings on the Intel QNAP:
      Default receive TCP window on the Intel QNAP: net.ipv4.tcp_rmem = "4096 1638400 1638400"
      Default receive TCP window on the ARM QNAP: net.ipv4.tcp_rmem = "32768 87380 16777216"
      I think the Intel config had too low a minimum window. As soon as I changed the values on the Intel box to the values from the ARM box, SSH data transfer went to double and beyond. So I tweaked the TCP windows (send and receive) and got my beloved 1 Gbps per job. Commands (SSH as root on the QNAP):
      sysctl -w net.ipv4.tcp_rmem="131072 524288 16777216"
      sysctl -w net.ipv4.tcp_wmem="131072 524288 16777216"
      Also, I think it's active by default, but it's not a bad idea to force autotuning:
      sysctl -w net.ipv4.tcp_window_scaling=1
      I hope it's useful if anyone has the same issue. Just be sure to include the tweaks in autorun.sh so they survive reboots (see the sketch after this post). Thanks
    1 point
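    To make the sysctl tweaks from the post above survive reboots, a minimal autorun.sh could look something like this (a sketch under the assumption that autorun.sh is enabled on your QNAP model; the values are the ones from the post):
      #!/bin/sh
      # Re-apply the TCP window tuning after every reboot
      sysctl -w net.ipv4.tcp_rmem="131072 524288 16777216"
      sysctl -w net.ipv4.tcp_wmem="131072 524288 16777216"
      sysctl -w net.ipv4.tcp_window_scaling=1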
  40. There were mistakes on my end - I didn't realise I was reading out-of-date documentation, as the documentation only seems to have the version number on the front page, and I arrived at the info via a direct link from a Google search. Even with the most current documentation, I think many long-term users will struggle to fully understand what is going on with these changes. Thanks for following up.
    1 point
  41. Hi Gavino, I can fully understand your annoyance with the documentation and the labelling in the GUI. I have already asked some questions about the new storage system; see the following link:
    1 point
  42. I have a question (suggestion) regarding licensing. Why don't you make an independent license server that the Directors could rely on regardless of the installation? For example: because I will have more than 800 transporters in the near future, I installed a multi-tenant Director to separate eleven "tenants". With an independent license server, I could have installed several single-tenant Directors, one for each organization. With this architecture, I would secure my Directors (today, if I lose the multi-tenant Director (crash, cyber attack, ...), all my backups and restores are blocked). Likewise, it would make updates easier (not a big bang). Sorry for my approximate English.
    1 point
  43. 1 point
  44. This is absolutely crucial these days. Also, filtering web access based on IP subnet, if possible (I know, we can also do that in the firewall...)
    1 point
  45. If I have Nakivo installed and running on a VM, how do I back it up? I noticed that if I try to include the Nakivo VM in any existing backup jobs, I run into issues.
    1 point
  46. The RDX is a network device; you can see the web interface here: https://youtu.be/0Bvzh_RsBPM?t=1236. It looks like each drive or volume can be an iSCSI target that could be connected to the Nakivo server, and then a backup repository could be created on those iSCSI drive targets, right?
    1 point
  47. I still had this issue... Support told me to exclude the backup folder, but they still did not answer why the files were flagged as false-positive virus files...
    1 point
  48. Also, I can't seem to deploy the appliance via the vCenter 7 UI; I have to deploy it via the host UI. I've tried both to a cluster (running DRS) and to an individual host. It just says 'fatal error' when validating the compute resource.
    1 point
  49. Yeah... I found it... I'll update my sites now to get this feature. Thanks.
    1 point