Is "double hashing" a password less secure than just hashing it once?

Is hashing a password twice before storage any more or less secure than just hashing it once?

What I'm talking about is doing this:

$hashed_password = hash(hash($plaintext_password)); 

...instead of just this:

 $hashed_password = hash($plaintext_password); 

If it is less secure, can you provide a good explanation (or a link to one)?

Also, does the hash function used make a difference? Does it make any difference if you mix md5 and sha1, for example, instead of repeating the same hash function?

Note 1: When I say "double hashing" I'm talking about hashing a password twice in an attempt to make it more obscured. I'm not talking about the technique for resolving collisions.

Note 2: I know I need to add a random salt to really make it secure. The question is whether hashing twice with the same algorithm helps or hurts the hash.

Hashing a password once is insecure

No, multiple hashes are not less secure; they are an essential part of secure password use.

Iterated hashing increases the time an attacker needs to try each password in their candidate list. You can easily increase the time it takes to attack a password from hours to years.

Simple iteration is not enough

Merely chaining hash output back to input is not sufficient for security. The iteration should take place in the context of an algorithm that preserves the password's entropy. Luckily, several published algorithms have had enough scrutiny to give confidence in their design.

A key derivation algorithm like PBKDF2 feeds the password into each round of hashing, which mitigates concerns about collisions in the hash output. PBKDF2 can be used for password authentication as-is. Bcrypt follows its key derivation with an encryption step; that way, if a fast way to reverse the key derivation is ever discovered, an attacker still has to complete a known-plaintext attack.

How a password is broken

Stored passwords need to be protected from an offline attack. If passwords aren't salted, they can be broken with a pre-computed dictionary attack (for example, using a rainbow table). Otherwise, the attacker must spend time computing a hash for each password and checking whether it matches the stored hash.

Not all passwords are equally likely. Attackers might exhaustively search all short passwords, but they know that their chance of brute-force success drops sharply with each additional character. Instead, they use an ordered list of the most likely passwords. They start with "password123" and progress to less frequently used passwords.

Let's say an attacker's list is long: 10 billion candidates. Suppose also that a desktop system can compute a million hashes per second. If only one iteration is used, the attacker can test her entire list in less than three hours. But if just 2,000 iterations are used, that time stretches to almost 8 months. To defeat a more sophisticated attacker, one capable of downloading a program that can tap the power of their GPU, you need more iterations.
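
To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in PHP; the candidate count and hash rate are the illustrative figures from above, not measurements:

    $candidates   = 1e10; // 10 billion candidate passwords
    $hashesPerSec = 1e6;  // 1 million hashes per second

    foreach (array(1, 2000) as $iterations) {
        $seconds = $candidates * $iterations / $hashesPerSec;
        printf("%4d iteration(s): %.1f days\n", $iterations, $seconds / 86400);
    }
    //    1 iteration(s): 0.1 days   (under three hours)
    // 2000 iteration(s): 231.5 days (almost eight months)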

How much is enough?

The number of iterations to use is a trade-off between security and user experience. Specialized hardware that attackers can use is cheap, yet it can still perform hundreds of millions of iterations per second. The performance of the attacker's system determines how long it takes to break a password given a number of iterations. But your application is unlikely to use that specialized hardware. How many iterations you can perform without aggravating users depends on your system.

You can probably afford to let users wait about ¾ of a second or so during authentication. Profile your target platform and use as many iterations as you can afford. Platforms I've tested (one user on a mobile device, or many users on a server platform) can comfortably support PBKDF2 with 60,000 to 120,000 iterations, or bcrypt with a cost factor of 12 or 13.
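
For reference, here is a minimal sketch of both options using PHP's built-ins (PHP 7+ for random_bytes()); the iteration count and cost factor are just the example values above, so profile before adopting them:

    $password = 'correct horse battery staple';

    // PBKDF2 with 100,000 iterations; store the salt and count with the hash.
    $salt = random_bytes(16);
    $hash = hash_pbkdf2('sha256', $password, $salt, 100000, 32, true);

    // bcrypt with a cost factor of 12; the salt is generated and embedded for you.
    $stored = password_hash($password, PASSWORD_BCRYPT, array('cost' => 12));
    var_dump(password_verify($password, $stored)); // bool(true)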

More background

Read PKCS #5 for authoritative information on the role of salt and iterations in hashing. Even though PBKDF2 was meant for generating encryption keys from passwords, it works well as a one-way hash for password validation. Each iteration of bcrypt is more expensive than a SHA-2 hash, so you can use fewer iterations, but the idea is the same. Bcrypt also goes a step beyond most PBKDF2-based solutions by using the derived key to encrypt a well-known plaintext. The resulting ciphertext is stored as the "hash," along with some metadata. However, nothing stops you from doing the same thing with PBKDF2.

Here are some of my other answers on this topic:

  • Hashing passwords
  • Hashing passwords
  • Hiding salts
  • PBKDF2 vs bcrypt
  • Bcrypt

For those saying it's secure, they are correct in general. "Double" hashing (or the logical expansion of it, iterating a hash function) is absolutely secure, if done right, for a specific concern.

For those saying it's insecure, they are correct in this case. The code posted in the question is insecure. Let's talk about why:

    $hashed_password1 = md5( md5( plaintext_password ) );
    $hashed_password2 = md5( plaintext_password );

There are three fundamental properties of a hash function that we care about here:

  1. Pre-image resistance – given a hash $h, it should be difficult to find a message $m such that $h === hash($m)

  2. Second pre-image resistance – given a message $m1, it should be difficult to find a different message $m2 such that hash($m1) === hash($m2)

  3. Collision resistance – it should be difficult to find a pair of messages ($m1, $m2) such that hash($m1) === hash($m2) (note that this is similar to second pre-image resistance, but differs in that here the attacker controls both messages)…

For the storage of passwords, all we really care about is pre-image resistance. The other two are moot, because $m1 is the user's password we're trying to keep safe. So if the attacker already has it, the hash has nothing left to protect…

Disclaimer

Everything that follows is based on the premise that all we care about is pre-image resistance. The other two fundamental properties of hash functions may not (and typically don't) hold up in the same way. So the conclusions in this post are only applicable when using hash functions for the storage of passwords. They are not applicable in general…

Let's get started

For the sake of this discussion, let's create our own hash function:

    function ourHash($input) {
        $result = 0;
        for ($i = 0; $i < strlen($input); $i++) {
            $result += ord($input[$i]);
        }
        return (string) ($result % 256);
    }

Now it should be pretty obvious what this hash function does: it sums together the ASCII values of each character of the input, and then takes that result modulo 256.

So let's test it out:

    var_dump(
        ourHash('abc'), // string(2) "38"
        ourHash('def'), // string(2) "47"
        ourHash('hij'), // string(2) "59"
        ourHash('klm')  // string(2) "68"
    );

Now, let's see what happens if we run it through a loop a few times:

    $tests = array(
        "abc",
        "def",
        "hij",
        "klm",
    );

    foreach ($tests as $test) {
        $hash = $test;
        for ($i = 0; $i < 100; $i++) {
            $hash = ourHash($hash);
        }
        echo "Hashing $test => $hash\n";
    }

The output:

    Hashing abc => 152
    Hashing def => 152
    Hashing hij => 155
    Hashing klm => 155

Whoa, whoa. We've generated collisions! Let's try to look at why:

Here's the output of hashing every possible hash output string:

    Hashing 0 => 48   Hashing 1 => 49   Hashing 2 => 50   Hashing 3 => 51
    Hashing 4 => 52   Hashing 5 => 53   Hashing 6 => 54   Hashing 7 => 55
    Hashing 8 => 56   Hashing 9 => 57   Hashing 10 => 97   Hashing 11 => 98
    Hashing 12 => 99   Hashing 13 => 100   Hashing 14 => 101   Hashing 15 => 102
    Hashing 16 => 103   Hashing 17 => 104   Hashing 18 => 105   Hashing 19 => 106
    Hashing 20 => 98   Hashing 21 => 99   Hashing 22 => 100   Hashing 23 => 101
    Hashing 24 => 102   Hashing 25 => 103   Hashing 26 => 104   Hashing 27 => 105
    Hashing 28 => 106   Hashing 29 => 107   Hashing 30 => 99   Hashing 31 => 100
    Hashing 32 => 101   Hashing 33 => 102   Hashing 34 => 103   Hashing 35 => 104
    Hashing 36 => 105   Hashing 37 => 106   Hashing 38 => 107   Hashing 39 => 108
    Hashing 40 => 100   Hashing 41 => 101   Hashing 42 => 102   Hashing 43 => 103
    Hashing 44 => 104   Hashing 45 => 105   Hashing 46 => 106   Hashing 47 => 107
    Hashing 48 => 108   Hashing 49 => 109   Hashing 50 => 101   Hashing 51 => 102
    Hashing 52 => 103   Hashing 53 => 104   Hashing 54 => 105   Hashing 55 => 106
    Hashing 56 => 107   Hashing 57 => 108   Hashing 58 => 109   Hashing 59 => 110
    Hashing 60 => 102   Hashing 61 => 103   Hashing 62 => 104   Hashing 63 => 105
    Hashing 64 => 106   Hashing 65 => 107   Hashing 66 => 108   Hashing 67 => 109
    Hashing 68 => 110   Hashing 69 => 111   Hashing 70 => 103   Hashing 71 => 104
    Hashing 72 => 105   Hashing 73 => 106   Hashing 74 => 107   Hashing 75 => 108
    Hashing 76 => 109   Hashing 77 => 110   Hashing 78 => 111   Hashing 79 => 112
    Hashing 80 => 104   Hashing 81 => 105   Hashing 82 => 106   Hashing 83 => 107
    Hashing 84 => 108   Hashing 85 => 109   Hashing 86 => 110   Hashing 87 => 111
    Hashing 88 => 112   Hashing 89 => 113   Hashing 90 => 105   Hashing 91 => 106
    Hashing 92 => 107   Hashing 93 => 108   Hashing 94 => 109   Hashing 95 => 110
    Hashing 96 => 111   Hashing 97 => 112   Hashing 98 => 113   Hashing 99 => 114
    Hashing 100 => 145   Hashing 101 => 146   Hashing 102 => 147   Hashing 103 => 148
    Hashing 104 => 149   Hashing 105 => 150   Hashing 106 => 151   Hashing 107 => 152
    Hashing 108 => 153   Hashing 109 => 154   Hashing 110 => 146   Hashing 111 => 147
    Hashing 112 => 148   Hashing 113 => 149   Hashing 114 => 150   Hashing 115 => 151
    Hashing 116 => 152   Hashing 117 => 153   Hashing 118 => 154   Hashing 119 => 155
    Hashing 120 => 147   Hashing 121 => 148   Hashing 122 => 149   Hashing 123 => 150
    Hashing 124 => 151   Hashing 125 => 152   Hashing 126 => 153   Hashing 127 => 154
    Hashing 128 => 155   Hashing 129 => 156   Hashing 130 => 148   Hashing 131 => 149
    Hashing 132 => 150   Hashing 133 => 151   Hashing 134 => 152   Hashing 135 => 153
    Hashing 136 => 154   Hashing 137 => 155   Hashing 138 => 156   Hashing 139 => 157
    Hashing 140 => 149   Hashing 141 => 150   Hashing 142 => 151   Hashing 143 => 152
    Hashing 144 => 153   Hashing 145 => 154   Hashing 146 => 155   Hashing 147 => 156
    Hashing 148 => 157   Hashing 149 => 158   Hashing 150 => 150   Hashing 151 => 151
    Hashing 152 => 152   Hashing 153 => 153   Hashing 154 => 154   Hashing 155 => 155
    Hashing 156 => 156   Hashing 157 => 157   Hashing 158 => 158   Hashing 159 => 159
    Hashing 160 => 151   Hashing 161 => 152   Hashing 162 => 153   Hashing 163 => 154
    Hashing 164 => 155   Hashing 165 => 156   Hashing 166 => 157   Hashing 167 => 158
    Hashing 168 => 159   Hashing 169 => 160   Hashing 170 => 152   Hashing 171 => 153
    Hashing 172 => 154   Hashing 173 => 155   Hashing 174 => 156   Hashing 175 => 157
    Hashing 176 => 158   Hashing 177 => 159   Hashing 178 => 160   Hashing 179 => 161
    Hashing 180 => 153   Hashing 181 => 154   Hashing 182 => 155   Hashing 183 => 156
    Hashing 184 => 157   Hashing 185 => 158   Hashing 186 => 159   Hashing 187 => 160
    Hashing 188 => 161   Hashing 189 => 162   Hashing 190 => 154   Hashing 191 => 155
    Hashing 192 => 156   Hashing 193 => 157   Hashing 194 => 158   Hashing 195 => 159
    Hashing 196 => 160   Hashing 197 => 161   Hashing 198 => 162   Hashing 199 => 163
    Hashing 200 => 146   Hashing 201 => 147   Hashing 202 => 148   Hashing 203 => 149
    Hashing 204 => 150   Hashing 205 => 151   Hashing 206 => 152   Hashing 207 => 153
    Hashing 208 => 154   Hashing 209 => 155   Hashing 210 => 147   Hashing 211 => 148
    Hashing 212 => 149   Hashing 213 => 150   Hashing 214 => 151   Hashing 215 => 152
    Hashing 216 => 153   Hashing 217 => 154   Hashing 218 => 155   Hashing 219 => 156
    Hashing 220 => 148   Hashing 221 => 149   Hashing 222 => 150   Hashing 223 => 151
    Hashing 224 => 152   Hashing 225 => 153   Hashing 226 => 154   Hashing 227 => 155
    Hashing 228 => 156   Hashing 229 => 157   Hashing 230 => 149   Hashing 231 => 150
    Hashing 232 => 151   Hashing 233 => 152   Hashing 234 => 153   Hashing 235 => 154
    Hashing 236 => 155   Hashing 237 => 156   Hashing 238 => 157   Hashing 239 => 158
    Hashing 240 => 150   Hashing 241 => 151   Hashing 242 => 152   Hashing 243 => 153
    Hashing 244 => 154   Hashing 245 => 155   Hashing 246 => 156   Hashing 247 => 157
    Hashing 248 => 158   Hashing 249 => 159   Hashing 250 => 151   Hashing 251 => 152
    Hashing 252 => 153   Hashing 253 => 154   Hashing 254 => 155   Hashing 255 => 156

Notice the tendency toward higher numbers. That turns out to be our downfall. Running the hash 4 times ($hash = ourHash($hash), for each element) gives us:

    Hashing 0 => 153   Hashing 1 => 154   Hashing 2 => 155   Hashing 3 => 156
    Hashing 4 => 157   Hashing 5 => 158   Hashing 6 => 150   Hashing 7 => 151
    Hashing 8 => 152   Hashing 9 => 153   Hashing 10 => 157   Hashing 11 => 158
    Hashing 12 => 150   Hashing 13 => 154   Hashing 14 => 155   Hashing 15 => 156
    Hashing 16 => 157   Hashing 17 => 158   Hashing 18 => 150   Hashing 19 => 151
    Hashing 20 => 158   Hashing 21 => 150   Hashing 22 => 154   Hashing 23 => 155
    Hashing 24 => 156   Hashing 25 => 157   Hashing 26 => 158   Hashing 27 => 150
    Hashing 28 => 151   Hashing 29 => 152   Hashing 30 => 150   Hashing 31 => 154
    Hashing 32 => 155   Hashing 33 => 156   Hashing 34 => 157   Hashing 35 => 158
    Hashing 36 => 150   Hashing 37 => 151   Hashing 38 => 152   Hashing 39 => 153
    Hashing 40 => 154   Hashing 41 => 155   Hashing 42 => 156   Hashing 43 => 157
    Hashing 44 => 158   Hashing 45 => 150   Hashing 46 => 151   Hashing 47 => 152
    Hashing 48 => 153   Hashing 49 => 154   Hashing 50 => 155   Hashing 51 => 156
    Hashing 52 => 157   Hashing 53 => 158   Hashing 54 => 150   Hashing 55 => 151
    Hashing 56 => 152   Hashing 57 => 153   Hashing 58 => 154   Hashing 59 => 155
    Hashing 60 => 156   Hashing 61 => 157   Hashing 62 => 158   Hashing 63 => 150
    Hashing 64 => 151   Hashing 65 => 152   Hashing 66 => 153   Hashing 67 => 154
    Hashing 68 => 155   Hashing 69 => 156   Hashing 70 => 157   Hashing 71 => 158
    Hashing 72 => 150   Hashing 73 => 151   Hashing 74 => 152   Hashing 75 => 153
    Hashing 76 => 154   Hashing 77 => 155   Hashing 78 => 156   Hashing 79 => 157
    Hashing 80 => 158   Hashing 81 => 150   Hashing 82 => 151   Hashing 83 => 152
    Hashing 84 => 153   Hashing 85 => 154   Hashing 86 => 155   Hashing 87 => 156
    Hashing 88 => 157   Hashing 89 => 158   Hashing 90 => 150   Hashing 91 => 151
    Hashing 92 => 152   Hashing 93 => 153   Hashing 94 => 154   Hashing 95 => 155
    Hashing 96 => 156   Hashing 97 => 157   Hashing 98 => 158   Hashing 99 => 150
    Hashing 100 => 154   Hashing 101 => 155   Hashing 102 => 156   Hashing 103 => 157
    Hashing 104 => 158   Hashing 105 => 150   Hashing 106 => 151   Hashing 107 => 152
    Hashing 108 => 153   Hashing 109 => 154   Hashing 110 => 155   Hashing 111 => 156
    Hashing 112 => 157   Hashing 113 => 158   Hashing 114 => 150   Hashing 115 => 151
    Hashing 116 => 152   Hashing 117 => 153   Hashing 118 => 154   Hashing 119 => 155
    Hashing 120 => 156   Hashing 121 => 157   Hashing 122 => 158   Hashing 123 => 150
    Hashing 124 => 151   Hashing 125 => 152   Hashing 126 => 153   Hashing 127 => 154
    Hashing 128 => 155   Hashing 129 => 156   Hashing 130 => 157   Hashing 131 => 158
    Hashing 132 => 150   Hashing 133 => 151   Hashing 134 => 152   Hashing 135 => 153
    Hashing 136 => 154   Hashing 137 => 155   Hashing 138 => 156   Hashing 139 => 157
    Hashing 140 => 158   Hashing 141 => 150   Hashing 142 => 151   Hashing 143 => 152
    Hashing 144 => 153   Hashing 145 => 154   Hashing 146 => 155   Hashing 147 => 156
    Hashing 148 => 157   Hashing 149 => 158   Hashing 150 => 150   Hashing 151 => 151
    Hashing 152 => 152   Hashing 153 => 153   Hashing 154 => 154   Hashing 155 => 155
    Hashing 156 => 156   Hashing 157 => 157   Hashing 158 => 158   Hashing 159 => 159
    Hashing 160 => 151   Hashing 161 => 152   Hashing 162 => 153   Hashing 163 => 154
    Hashing 164 => 155   Hashing 165 => 156   Hashing 166 => 157   Hashing 167 => 158
    Hashing 168 => 159   Hashing 169 => 151   Hashing 170 => 152   Hashing 171 => 153
    Hashing 172 => 154   Hashing 173 => 155   Hashing 174 => 156   Hashing 175 => 157
    Hashing 176 => 158   Hashing 177 => 159   Hashing 178 => 151   Hashing 179 => 152
    Hashing 180 => 153   Hashing 181 => 154   Hashing 182 => 155   Hashing 183 => 156
    Hashing 184 => 157   Hashing 185 => 158   Hashing 186 => 159   Hashing 187 => 151
    Hashing 188 => 152   Hashing 189 => 153   Hashing 190 => 154   Hashing 191 => 155
    Hashing 192 => 156   Hashing 193 => 157   Hashing 194 => 158   Hashing 195 => 159
    Hashing 196 => 151   Hashing 197 => 152   Hashing 198 => 153   Hashing 199 => 154
    Hashing 200 => 155   Hashing 201 => 156   Hashing 202 => 157   Hashing 203 => 158
    Hashing 204 => 150   Hashing 205 => 151   Hashing 206 => 152   Hashing 207 => 153
    Hashing 208 => 154   Hashing 209 => 155   Hashing 210 => 156   Hashing 211 => 157
    Hashing 212 => 158   Hashing 213 => 150   Hashing 214 => 151   Hashing 215 => 152
    Hashing 216 => 153   Hashing 217 => 154   Hashing 218 => 155   Hashing 219 => 156
    Hashing 220 => 157   Hashing 221 => 158   Hashing 222 => 150   Hashing 223 => 151
    Hashing 224 => 152   Hashing 225 => 153   Hashing 226 => 154   Hashing 227 => 155
    Hashing 228 => 156   Hashing 229 => 157   Hashing 230 => 158   Hashing 231 => 150
    Hashing 232 => 151   Hashing 233 => 152   Hashing 234 => 153   Hashing 235 => 154
    Hashing 236 => 155   Hashing 237 => 156   Hashing 238 => 157   Hashing 239 => 158
    Hashing 240 => 150   Hashing 241 => 151   Hashing 242 => 152   Hashing 243 => 153
    Hashing 244 => 154   Hashing 245 => 155   Hashing 246 => 156   Hashing 247 => 157
    Hashing 248 => 158   Hashing 249 => 159   Hashing 250 => 151   Hashing 251 => 152
    Hashing 252 => 153   Hashing 253 => 154   Hashing 254 => 155   Hashing 255 => 156

We've narrowed ourselves down to 8 values… That's bad… Our original function mapped S(∞) onto S(256). That is, we created a surjective function mapping $input to $output.
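
You can watch the shrinkage happen with a quick sketch; it assumes the ourHash() function defined above, and the exact counts depend on that function's weaknesses, but the downward trend is the point:

    // Feed every possible output string straight back into ourHash() and
    // count how many distinct values survive each round.
    $alive = array_map('strval', range(0, 255));
    for ($round = 1; $round <= 4; $round++) {
        $alive = array_unique(array_map('ourHash', $alive));
        printf("After round %d: %d distinct values\n", $round, count($alive));
    }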

Since we have a surjective function, we have no guarantee that the mapping for any subset of the input will be free of collisions (and in practice, in fact, there will be collisions).

And that's what happened here! Our function was bad, but that's not why this happened (it's only why it happened so quickly and so completely).

The same thing happens with MD5. It maps S(∞) onto S(2^128). Since there's no guarantee that running MD5(S(output)) will be injective (which would mean no collisions), collisions are to be expected.

TL/DR section

Therefore, since feeding the output directly back into md5 can generate collisions, every iteration increases the chance of collisions. However, this is a linear growth, which means that while the result set of 2^128 shrinks, it doesn't shrink fast enough to be a critical flaw.

So,

    $output = md5($input);  // 2^128 possibilities
    $output = md5($output); // < 2^128 possibilities
    $output = md5($output); // < 2^128 possibilities
    $output = md5($output); // < 2^128 possibilities
    $output = md5($output); // < 2^128 possibilities

The more times we iterate, the further the reduction goes.

The fix

Fortunately for us, there's a trivial way to fix this: feed something back into the further iterations:

    $output = md5($input);           // 2^128 possibilities
    $output = md5($input . $output); // 2^128 possibilities
    $output = md5($input . $output); // 2^128 possibilities
    $output = md5($input . $output); // 2^128 possibilities
    $output = md5($input . $output); // 2^128 possibilities

Note that the further iterations aren't 2^128 for each individual value of $input. That means it's possible to generate $input values that still collide down the line (and hence will settle or resonate at far fewer than 2^128 possible outputs). But the general case for $input is still as strong as it was for a single round.

Wait, is it? Let's test this out with our ourHash() function, switching to $hash = ourHash($input . $hash); for 100 iterations:

    Hashing 0 => 201   Hashing 1 => 212   Hashing 2 => 199   Hashing 3 => 201
    Hashing 4 => 203   Hashing 5 => 205   Hashing 6 => 207   Hashing 7 => 209
    Hashing 8 => 211   Hashing 9 => 204   Hashing 10 => 251   Hashing 11 => 147
    Hashing 12 => 251   Hashing 13 => 148   Hashing 14 => 253   Hashing 15 => 0
    Hashing 16 => 1   Hashing 17 => 2   Hashing 18 => 161   Hashing 19 => 163
    Hashing 20 => 147   Hashing 21 => 251   Hashing 22 => 148   Hashing 23 => 253
    Hashing 24 => 0   Hashing 25 => 1   Hashing 26 => 2   Hashing 27 => 161
    Hashing 28 => 163   Hashing 29 => 8   Hashing 30 => 251   Hashing 31 => 148
    Hashing 32 => 253   Hashing 33 => 0   Hashing 34 => 1   Hashing 35 => 2
    Hashing 36 => 161   Hashing 37 => 163   Hashing 38 => 8   Hashing 39 => 4
    Hashing 40 => 148   Hashing 41 => 253   Hashing 42 => 0   Hashing 43 => 1
    Hashing 44 => 2   Hashing 45 => 161   Hashing 46 => 163   Hashing 47 => 8
    Hashing 48 => 4   Hashing 49 => 9   Hashing 50 => 253   Hashing 51 => 0
    Hashing 52 => 1   Hashing 53 => 2   Hashing 54 => 161   Hashing 55 => 163
    Hashing 56 => 8   Hashing 57 => 4   Hashing 58 => 9   Hashing 59 => 11
    Hashing 60 => 0   Hashing 61 => 1   Hashing 62 => 2   Hashing 63 => 161
    Hashing 64 => 163   Hashing 65 => 8   Hashing 66 => 4   Hashing 67 => 9
    Hashing 68 => 11   Hashing 69 => 4   Hashing 70 => 1   Hashing 71 => 2
    Hashing 72 => 161   Hashing 73 => 163   Hashing 74 => 8   Hashing 75 => 4
    Hashing 76 => 9   Hashing 77 => 11   Hashing 78 => 4   Hashing 79 => 3
    Hashing 80 => 2   Hashing 81 => 161   Hashing 82 => 163   Hashing 83 => 8
    Hashing 84 => 4   Hashing 85 => 9   Hashing 86 => 11   Hashing 87 => 4
    Hashing 88 => 3   Hashing 89 => 17   Hashing 90 => 161   Hashing 91 => 163
    Hashing 92 => 8   Hashing 93 => 4   Hashing 94 => 9   Hashing 95 => 11
    Hashing 96 => 4   Hashing 97 => 3   Hashing 98 => 17   Hashing 99 => 13
    Hashing 100 => 246   Hashing 101 => 248   Hashing 102 => 49   Hashing 103 => 44
    Hashing 104 => 255   Hashing 105 => 198   Hashing 106 => 43   Hashing 107 => 51
    Hashing 108 => 202   Hashing 109 => 2   Hashing 110 => 248   Hashing 111 => 49
    Hashing 112 => 44   Hashing 113 => 255   Hashing 114 => 198   Hashing 115 => 43
    Hashing 116 => 51   Hashing 117 => 202   Hashing 118 => 2   Hashing 119 => 51
    Hashing 120 => 49   Hashing 121 => 44   Hashing 122 => 255   Hashing 123 => 198
    Hashing 124 => 43   Hashing 125 => 51   Hashing 126 => 202   Hashing 127 => 2
    Hashing 128 => 51   Hashing 129 => 53   Hashing 130 => 44   Hashing 131 => 255
    Hashing 132 => 198   Hashing 133 => 43   Hashing 134 => 51   Hashing 135 => 202
    Hashing 136 => 2   Hashing 137 => 51   Hashing 138 => 53   Hashing 139 => 55
    Hashing 140 => 255   Hashing 141 => 198   Hashing 142 => 43   Hashing 143 => 51
    Hashing 144 => 202   Hashing 145 => 2   Hashing 146 => 51   Hashing 147 => 53
    Hashing 148 => 55   Hashing 149 => 58   Hashing 150 => 198   Hashing 151 => 43
    Hashing 152 => 51   Hashing 153 => 202   Hashing 154 => 2   Hashing 155 => 51
    Hashing 156 => 53   Hashing 157 => 55   Hashing 158 => 58   Hashing 159 => 0
    Hashing 160 => 43   Hashing 161 => 51   Hashing 162 => 202   Hashing 163 => 2
    Hashing 164 => 51   Hashing 165 => 53   Hashing 166 => 55   Hashing 167 => 58
    Hashing 168 => 0   Hashing 169 => 209   Hashing 170 => 51   Hashing 171 => 202
    Hashing 172 => 2   Hashing 173 => 51   Hashing 174 => 53   Hashing 175 => 55
    Hashing 176 => 58   Hashing 177 => 0   Hashing 178 => 209   Hashing 179 => 216
    Hashing 180 => 202   Hashing 181 => 2   Hashing 182 => 51   Hashing 183 => 53
    Hashing 184 => 55   Hashing 185 => 58   Hashing 186 => 0   Hashing 187 => 209
    Hashing 188 => 216   Hashing 189 => 219   Hashing 190 => 2   Hashing 191 => 51
    Hashing 192 => 53   Hashing 193 => 55   Hashing 194 => 58   Hashing 195 => 0
    Hashing 196 => 209   Hashing 197 => 216   Hashing 198 => 219   Hashing 199 => 220
    Hashing 200 => 248   Hashing 201 => 49   Hashing 202 => 44   Hashing 203 => 255
    Hashing 204 => 198   Hashing 205 => 43   Hashing 206 => 51   Hashing 207 => 202
    Hashing 208 => 2   Hashing 209 => 51   Hashing 210 => 49   Hashing 211 => 44
    Hashing 212 => 255   Hashing 213 => 198   Hashing 214 => 43   Hashing 215 => 51
    Hashing 216 => 202   Hashing 217 => 2   Hashing 218 => 51   Hashing 219 => 53
    Hashing 220 => 44   Hashing 221 => 255   Hashing 222 => 198   Hashing 223 => 43
    Hashing 224 => 51   Hashing 225 => 202   Hashing 226 => 2   Hashing 227 => 51
    Hashing 228 => 53   Hashing 229 => 55   Hashing 230 => 255   Hashing 231 => 198
    Hashing 232 => 43   Hashing 233 => 51   Hashing 234 => 202   Hashing 235 => 2
    Hashing 236 => 51   Hashing 237 => 53   Hashing 238 => 55   Hashing 239 => 58
    Hashing 240 => 198   Hashing 241 => 43   Hashing 242 => 51   Hashing 243 => 202
    Hashing 244 => 2   Hashing 245 => 51   Hashing 246 => 53   Hashing 247 => 55
    Hashing 248 => 58   Hashing 249 => 0   Hashing 250 => 43   Hashing 251 => 51
    Hashing 252 => 202   Hashing 253 => 2   Hashing 254 => 51   Hashing 255 => 53

There's still a rough pattern there, but note that it's no more of a pattern than our underlying function (which was already quite weak).

Notice however that 0 and 3 became collisions, even though they weren't in the single run. That's an application of what I said before (the collision resistance stays the same for the set of all inputs, but specific collision routes may open up due to flaws in the underlying algorithm).

TL/DR section

By feeding the input back into each iteration, we effectively break any collisions that may have occurred in the prior iteration.

Therefore, md5($input . md5($input)); should be (theoretically at least) as strong as md5($input).

Is this important?

Yes. This is one of the reasons PBKDF2 replaced PBKDF1 in RFC 2898. Consider the inner loops of the two:

PBKDF1:

    T_1 = Hash (P || S),
    T_2 = Hash (T_1),
    ...
    T_c = Hash (T_{c-1})

Where c is the iteration count, P is the password and S is the salt.

PBKDF2:

    U_1 = PRF (P, S || INT (i)),
    U_2 = PRF (P, U_1),
    ...
    U_c = PRF (P, U_{c-1})

Where the PRF is really just an HMAC. But for our purposes here, let's just say that PRF(P, S) = Hash(P || S) (that is, the PRF of two inputs is roughly the same as a hash of the two concatenated together). It's very much not, but for our purposes it is.
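
To make the difference concrete, here is an illustrative PHP sketch of the two inner loops under that simplification. It is not a real RFC 2898 implementation; in particular, real PBKDF2 also XORs all the U_i blocks together, which is omitted here:

    function pbkdf1Chain($password, $salt, $count) {
        $t = sha1($password . $salt, true);      // T_1 = Hash(P || S)
        for ($i = 1; $i < $count; $i++) {
            $t = sha1($t, true);                 // T_c = Hash(T_{c-1}): the password is gone
        }
        return $t;
    }

    function pbkdf2Chain($password, $salt, $count) {
        $u = hash_hmac('sha1', $salt . pack('N', 1), $password, true); // U_1 = PRF(P, S || INT(1))
        for ($i = 1; $i < $count; $i++) {
            $u = hash_hmac('sha1', $u, $password, true);               // U_c = PRF(P, U_{c-1}): the password enters every round
        }
        return $u;
    }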

So PBKDF2 maintains the collision resistance of the underlying hash function, where PBKDF1 does not.

Tying it all together:

We know of secure ways of iterating a hash. In fact:

    $hash = $input;
    $i = 10000;
    do {
        // PHP's hash() needs an algorithm name; 'sha256' is shown for concreteness.
        $hash = hash('sha256', $input . $hash);
    } while ($i-- > 0);

is typically safe.

Now, as to why we would want to iterate the hash in the first place, let's analyze the entropy movement.

A hash takes the infinite set S(∞) and produces a smaller, consistently sized set S(n). The next iteration (assuming the input is passed back in) maps S(∞) onto S(n) again:

    S(∞) -> S(n)
    S(∞) -> S(n)
    S(∞) -> S(n)
    S(∞) -> S(n)
    S(∞) -> S(n)
    S(∞) -> S(n)

Notice that the final output has exactly the same amount of entropy as the first one. Iterating will not "make it more obscured." The entropy is identical. There's no magic source of unpredictability (it's a pseudo-random function, not a random function).

There is, however, a gain to iterating: it makes the hashing process artificially slower. And that's why iterating can be a good idea. In fact, it's the basic principle of most modern password hashing algorithms (the fact that doing something over and over makes it slower).

Slow is good, because it combats the primary security threat: brute forcing. The slower we make our hashing algorithm, the harder attackers have to work to attack the password hashes stolen from us. And that's a good thing!
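
As a quick illustration of the slowdown (timings will vary by machine; the loop is the same input-feedback pattern discussed above):

    $start = microtime(true);
    $hash  = 'password';
    for ($i = 0; $i < 100000; $i++) {
        $hash = hash('sha256', 'password' . $hash); // feed the input back in each round
    }
    printf("100,000 iterations took %.3f seconds\n", microtime(true) - $start);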

Yes, re-hashing reduces the search space, but no, it doesn't matter; the effective reduction is insignificant.

Re-hashing does increase the time it takes to brute-force, but doing it only twice is also suboptimal.

What you really want is to hash the password with PBKDF2, a proven method of secure hashing with a salt and iterations. Check out this SO response.
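
For instance, a minimal store-and-verify sketch with PHP's built-in hash_pbkdf2(); the function names and the 100,000-iteration count here are illustrative, not prescriptive:

    function makeRecord($password) {
        $salt = random_bytes(16);
        $iter = 100000;
        $hash = hash_pbkdf2('sha256', $password, $salt, $iter, 32, true);
        return array('salt' => $salt, 'iter' => $iter, 'hash' => $hash);
    }

    function checkPassword($password, array $record) {
        $probe = hash_pbkdf2('sha256', $password, $record['salt'],
                             $record['iter'], 32, true);
        return hash_equals($record['hash'], $probe); // constant-time comparison
    }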

EDIT: I almost forgot: don't use MD5! Use a modern cryptographic hash such as the SHA-2 family (SHA-256, SHA-384, and SHA-512).

Yes, it reduces the number of possible strings that match the hash.

As you have already mentioned, salted hashes are much better.

An article here, http://websecurity.ro/blog/2007/11/02/md5md5-vs-md5/ , tries to prove why the two are equivalent, but I'm not sure about the logic. Part of their argument is that no software is available to analyse md5(md5(text)), but obviously it's fairly trivial to generate the rainbow tables.

I'm still sticking with my answer that there is a smaller number of md5(md5(text))-type hashes than md5(text) hashes, increasing the chance of collision (even if still at an unlikely probability) and reducing the search space.

I just look at this from a practical standpoint. What is the hacker after? Why, the combination of characters that, when put through the hash function, generates the desired hash.

You are only saving the last hash, therefore, the hacker only has to bruteforce one hash. Assuming you have roughly the same odds of stumbling across the desired hash with each bruteforce step, the number of hashes is irrelevant. You could do a million hash iterations, and it would not increase or reduce security one bit, since at the end of the line there's still only one hash to break, and the odds of breaking it are the same as any hash.

Maybe the previous posters think that the input is relevant; it's not. As long as whatever you put into the hash function generates the desired hash, it will get you through, correct input or incorrect input.

Now, rainbow tables are another story. Since a rainbow table only carries raw passwords, hashing twice may be a good security measure, since a rainbow table that contains every hash of every hash would be too large.

Of course, I'm only considering the example the OP gave, where it's just a plain-text password being hashed. If you include the username or a salt in the hash, it's a different story; hashing twice is entirely unnecessary, since the rainbow table would already be too large to be practical and contain the right hash.

Anyway, not a security expert here, but that's just what I've figured from my experience.

Personally I wouldn't bother with multiple hashes, but I'd make sure to also hash the UserName (or another User ID field) as well as the password so two users with the same password won't end up with the same hash. Also I'd probably throw some other constant string into the input string too for good measure.

    $hashed_password = md5("xxx" . "|" . $user_name . "|" . $plaintext_password);

Most answers are by people without a background in cryptography or security, and they are wrong. Use a salt, if possible one that is unique per record. MD5/SHA/etc. are too fast, the opposite of what you want. PBKDF2 and bcrypt are slower (which is good) but can be defeated with ASICs/FPGAs/GPUs (very affordable nowadays). So a memory-hard algorithm is needed: enter scrypt.

Here's a layman's explanation of salts and speed (but not of memory-hard algorithms).

In general, it provides no additional security to double hash or double encrypt something. If you can break the hash once, you can break it again. It usually doesn't hurt security to do this, though.

In your example of using MD5, as you probably know there are some collision issues. "Double Hashing" doesn't really help protect against this, since the same collisions will still result in the same first hash, which you can then MD5 again to get the second hash.

This does protect against dictionary attacks, like those "reverse MD5-databases", but so does salting.

On a tangent, double encrypting something doesn't provide any additional security because all it does is result in a different key which is a combination of the two keys actually used. So the effort to find the "key" is not doubled because two keys do not actually need to be found. This isn't true for hashing, because the result of the hash is not usually the same length as the original input.

From what I've read, it may actually be recommended to re-hash the password hundreds or thousands of times.

The idea is that if you can make it take more time to encode the password, it's more work for an attacker to run through many guesses to crack the password. That seems to be the advantage to re-hashing — not that it's more cryptographically secure, but it simply takes longer to generate a dictionary attack.

Of course computers get faster all the time, so this advantage diminishes over time (or requires you to increase the iterations).

As several responses in this article suggest, there are some cases where it may improve security and others where it definitely hurts it. There is a better solution that will definitely improve security: instead of doubling the number of times you calculate the hash, double the size of your salt, or double the number of bits used in the hash, or do both! Instead of SHA-256, jump up to SHA-512.

Let us assume you use the hashing algorithm: compute rot13, take the first 10 characters. If you do that twice (or even 2000 times) it is possible to make a function that is faster, but which gives the same result (namely just take the first 10 chars).
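
Here is a quick PHP sketch of that rot13 example, showing the doubled version collapsing to a faster shortcut:

    // "Hash" = rot13, then keep the first 10 characters.
    function weakHash($s) {
        return substr(str_rot13($s), 0, 10);
    }

    // rot13 applied twice is the identity, so any even number of rounds
    // collapses to simply taking the first 10 characters.
    $twice    = weakHash(weakHash('correct horse battery staple'));
    $shortcut = substr('correct horse battery staple', 0, 10);
    var_dump($twice === $shortcut); // bool(true)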

Likewise it may be possible to make a faster function that gives the same output as a repeated hashing function. So your choice of hashing function is very important: as with the rot13 example it is not given that repeated hashing will improve security. If there is no research saying that the algorithm is designed for recursive use, then it is safer to assume that it will not give you added protection.

That said: For all but the simplest hashing functions it will most likely take cryptography experts to compute the faster functions, so if you are guarding against attackers that do not have access to cryptography experts it is probably safer in practice to use a repeated hashing function.

Double hashing makes sense to me only if I hash the password on the client, and then save the hash (with a different salt) of that hash on the server.

That way even if someone hacked his way into the server (thereby ignoring the safety SSL provides), he still can't get to the clear passwords.

Yes he will have the data required to breach into the system, but he wouldn't be able to use that data to compromise outside accounts the user has. And people are known to use the same password for virtually anything.

The only way he could get to the clear passwords is by installing a keylogger on the client - and that's not your problem anymore.

So in short:

  1. The first hashing on the client protects your users in a 'server breach' scenario.
  2. The second hashing on the server serves to protect your system if someone got a hold of your database backup, so he can't use those passwords to connect to your services.

The concern about reducing the search space is mathematically correct, although the search space remains large enough for all practical purposes (assuming you use salts): 2^128. However, since we are talking about passwords, the number of possible 16-character strings (alphanumeric, caps matter, a few symbols thrown in) is roughly 2^98, according to my back-of-the-envelope calculations. So the perceived decrease in the search space is not really relevant.
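
For what it's worth, the 2^98 figure checks out if you assume an alphabet of roughly 70 characters (upper- and lower-case letters, digits, and a handful of symbols); a one-liner to verify:

    // 16-character strings over a ~70-character alphabet:
    // 70^16 possibilities, i.e. about 2^98.
    printf("log2(70^16) = %.1f\n", 16 * log(70, 2)); // ~98.1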

Aside from that, there really is no difference, cryptographically speaking.

Although there is a crypto primitive called a "hash chain", a technique that allows you to do some cool tricks, like disclosing a signature key after it's been used without sacrificing the integrity of the system. Given minimal time synchronization, this allows you to cleanly sidestep the problem of initial key distribution. Basically, you precompute a large set of hashes of hashes, h(h(h(h....(h(k))...))), and use the nth value to sign; after a set interval, you send out the key and sign it using key (n-1). The recipients can now verify that you sent all the previous messages, and no one can fake your signature since the time period for which it is valid has passed.
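
A minimal hash-chain sketch (illustrative only; a real scheme needs message binding and timing rules on top of this):

    $k = random_bytes(32); // secret seed
    $chain = array($k);
    for ($i = 0; $i < 1000; $i++) {
        $chain[] = hash('sha256', end($chain), true);
    }
    $anchor = end($chain); // publish this value

    // Reveal values in reverse order; anyone can check that each revealed
    // value hashes to the previously revealed one (or to the anchor).
    $revealed = $chain[count($chain) - 2];
    var_dump(hash('sha256', $revealed, true) === $anchor); // bool(true)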

Re-hashing hundreds of thousands of times like Bill suggests is just a waste of your CPU. Use a longer key if you are concerned about people breaking 128 bits.

Double hashing is ugly because it's more than likely an attacker has built a table to come up with most hashes. Better is to salt your hashes and mix hashes together. There are also new schemes to "sign" hashes (basically salting), but in a more secure manner.

Yes.

Absolutely do not use multiple iterations of a conventional hash function, like md5(md5(md5(password))) . At best you will be getting a marginal increase in security (a scheme like this offers hardly any protection against a GPU attack; just pipeline it.) At worst, you're reducing your hash space (and thus security) with every iteration you add. In security, it's wise to assume the worst.

Do use a password hash that's been designed by a competent cryptographer to be an effective password hash, and resistant to both brute-force and time-space attacks. These include bcrypt, scrypt, and in some situations PBKDF2. The glibc SHA-256-based hash is also acceptable.

I'm going to go out on a limb and say it's more secure in certain circumstances… don't downvote me yet though!

From a mathematical / cryptographical point of view, it's less secure, for reasons that I'm sure someone else will give you a clearer explanation of than I could.

However , there exist large databases of MD5 hashes, which are more likely to contain the "password" text than the MD5 of it. So by double-hashing you're reducing the effectiveness of those databases.

Of course, if you use a salt then this advantage (disadvantage?) goes away.