NAKIVO Community Forum

TonioRoffo · Members · 8 posts · 1 day won

Everything posted by TonioRoffo

  1. Hello, A competitor supports "immutable" backups to a hardened Linux backup repository by leveraging the immutable flag on the XFS filesystem. Is this being considered for future NAKIVO versions? We would like our backups to stay safe even if the NAKIVO console were compromised by hackers. Thanks!
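For reference, the hardened-repository approach described above relies on the Linux immutable file attribute (set with `chattr`), which XFS and ext4 both support. A minimal sketch, assuming a scratch file under `/tmp` (the path is illustrative, and setting the flag requires root with `CAP_LINUX_IMMUTABLE`):

```shell
# Sketch: protect a backup file with the filesystem immutable flag.
# Requires CAP_LINUX_IMMUTABLE (normally root); the path is illustrative.
f=$(mktemp /tmp/backup.XXXXXX)
echo "backup data" > "$f"

if chattr +i "$f" 2>/dev/null; then
    # With the 'i' attribute set, even root cannot modify or delete the
    # file until the flag is cleared again with 'chattr -i'.
    if rm -f "$f" 2>/dev/null; then
        result="unexpected-delete"
    else
        result="delete-blocked"
    fi
    chattr -i "$f"          # clear the flag so cleanup can proceed
else
    # Environment lacks the capability (e.g. an unprivileged container).
    result="unsupported"
fi
rm -f "$f"
echo "$result"
```

The protection holds as long as an attacker who compromises the backup console cannot also obtain root on the repository host, which is why such setups pair the flag with a hardened, SSH-restricted Linux box.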
  2. Are there any changes on the horizon for this? I'm looking for a way to do "forever incremental" backups to cloud repositories; I want to avoid periodic "full" backups. Thanks!
  3. The shortest way to integrate this without going vendor-specific is to add SAML 2.0. That would allow logins based on Microsoft 365 or other MFA providers.
  4. Thank you, I understand that an active full will be needed for cloud backup. This impacts my use case for NAKIVO with clients on a smaller budget, who do not have the upload capacity for active full backups. I'm hoping the "forever incremental" method will someday be extended to work with S3-compatible storage (if needed, with a transporter on EC2 doing the work). It is so much more efficient to store data in the cloud this way. Backup software with similar storage formats (for non-VM workloads, though) does run well in the cloud; Duplicacy, for example. Thank you.
  5. A way-overdue response maybe, but for future readers looking this up: "Second HDD is independent permanently (sdb1)". Independent disks do not get snapshotted by VMware, hence they cannot be backed up. This is not just a NAKIVO issue; competing products will also simply ignore these disks during backup. Be warned about this! I do believe backup software should raise an error or warning when such a disk is present and is not included in the backup.
  6. Restore bare-metal backups back to bare metal (via boot disk). Backing up bare-metal machines is a bit useless if we can't restore to the same hardware. Usually these machines are on bare metal for a reason, or we would have P2V'd them already.
  7. This is absolutely crucial these days. Also, filtering web access based on IP subnet, if possible (I know, we can also do that in the firewall...)
  8. Hello, I'd like to ask what the best way is to set up backup to the cloud (Amazon S3). I don't have much Amazon S3 experience, but I would like to keep the backups fast and the costs low, and I'm not sure how to proceed.
     • If I set up backups with incrementals and weekly full backups, my S3 costs will surely be low because I don't download anything. However, my weekend backups will be fulls and take considerable time. Correct?
     • If I set up backups with incrementals and synthetic fulls, does that work well with S3? Will my backups be faster, or will S3 use my local transporters to merge the backups and create high costs due to downloading from S3?
     • In a local test (not to S3), I saw that backup copy jobs ignore the synthetic setting and revert to full backups; in normal backup jobs it works. Is this also true for S3? I'd love to read some advice from people currently running backups to S3 so that I make the right choice. Thanks!
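To make the trade-off in the first two bullets concrete, here is a back-of-the-envelope sketch of monthly upload volume for weekly active fulls versus a forever-incremental scheme. All sizes and the schedule are hypothetical assumptions, not NAKIVO measurements, and it deliberately leaves open whether synthetic fulls incur S3 download costs for the merge:

```python
# Hypothetical back-of-the-envelope comparison of monthly upload volume.
# All numbers are illustrative assumptions, not NAKIVO measurements.

def monthly_upload_gb(full_gb: float, incr_gb: float,
                      weekly_fulls: bool, weeks: int = 4) -> float:
    """Total GB uploaded over `weeks` weeks (6 incrementals per week).

    weekly_fulls=True  -> one active full plus 6 incrementals each week.
    weekly_fulls=False -> forever incremental: one initial seed full,
                          then only incrementals (assumes any synthetic
                          full is built repository-side with no extra
                          upload, which is the open question for S3).
    """
    incrementals = weeks * 6 * incr_gb
    if weekly_fulls:
        return weeks * full_gb + incrementals
    return full_gb + incrementals  # initial seed full only

# Assumed workload: a 500 GB full image with 10 GB of daily changes.
with_fulls = monthly_upload_gb(500, 10, weekly_fulls=True)            # 2240.0
forever_incremental = monthly_upload_gb(500, 10, weekly_fulls=False)  # 740.0

print(f"weekly active fulls: {with_fulls:.0f} GB/month")
print(f"forever incremental: {forever_incremental:.0f} GB/month")
```

Under these assumptions the forever-incremental scheme uploads roughly a third of the data per month; whether the synthetic-full merge then triggers S3 egress charges is exactly what the second bullet asks.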