==> Building on centiskorch
==> Checking for remote environment...
==> Syncing package to remote host...
sending incremental file list
created directory packages/dns-lexicon
./
.SRCINFO                    1,191 100%    0.00kB/s    0:00:00  (xfr#1, to-chk=3/5)
.nvchecker.toml                89 100%   86.91kB/s    0:00:00  (xfr#2, to-chk=2/5)
PKGBUILD                    1,751 100%    1.67MB/s    0:00:00  (xfr#3, to-chk=1/5)
dns-lexicon-3.18.0-2.log      607 100%  592.77kB/s    0:00:00  (xfr#4, to-chk=0/5)

sent 2,079 bytes  received 138 bytes  4,434.00 bytes/sec
total size is 3,229  speedup is 1.46
==> Running extra-riscv64-build -- -d /home/felix/packages/riscv64-pkg-cache:/var/cache/pacman/pkg -l root16 on remote host...
:: Synchronizing package databases...
 core downloading...
 extra downloading...
:: Starting full system upgrade...
 there is nothing to do
==> Building in chroot for [extra] (riscv64)...
==> Synchronizing chroot copy [/var/lib/archbuild/extra-riscv64/root] -> [root16]...done
==> Making package: dns-lexicon 3.18.0-2 (Sun Jan 5 14:02:08 2025)
==> Retrieving sources...
  -> Cloning lexicon git repo...
Cloning into bare repository '/home/felix/packages/dns-lexicon/lexicon'...
remote: Enumerating objects: 27321, done.
remote: Counting objects: 100% (307/307), done.
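The lines above capture the whole remote-build round trip: the local package directory is rsynced to the builder, and devtools' extra-riscv64-build is then run on that host with a bind-mounted pacman package cache (-d) and a dedicated chroot copy (-l root16). The sketch below reconstructs that driver from the log alone; the plain rsync/ssh calls, the SSH host name, and the remote directory layout are assumptions, and only the extra-riscv64-build arguments are taken verbatim from the output.

#!/usr/bin/env python3
"""Sketch of the sync-then-build driver implied by the log above.

Assumptions: the remote host, directory layout, and use of plain rsync/ssh
are illustrative; only the extra-riscv64-build arguments come from the log.
"""
import subprocess

HOST = "centiskorch"                 # assumed SSH target for the builder
PKG = "dns-lexicon"                  # package directory to build
REMOTE_DIR = f"packages/{PKG}"       # assumed location on the remote host

# 1. Sync the local PKGBUILD directory to the remote host (incremental copy).
subprocess.run(
    ["rsync", "-a", "--delete", f"{PKG}/", f"{HOST}:{REMOTE_DIR}/"],
    check=True,
)

# 2. Run the devtools chroot build remotely, reusing a shared package cache
#    (-d bind mount) and a per-builder chroot copy (-l root16).
subprocess.run(
    [
        "ssh", "-t", HOST,
        f"cd {REMOTE_DIR} && extra-riscv64-build -- "
        "-d /home/felix/packages/riscv64-pkg-cache:/var/cache/pacman/pkg "
        "-l root16",
    ],
    check=True,
)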
remote: Compressing objects: 100% (154/154), done.
remote: Total 27321 (delta 205), reused 177 (delta 148), pack-reused 27014 (from 2)
Receiving objects: 100% (27321/27321), 15.51 MiB | 6.77 MiB/s, done.
Resolving deltas: 100% (21304/21304), done.
==> Validating source files with sha512sums...
    lexicon ... Passed
==> Making package: dns-lexicon 3.18.0-2 (Sun Jan 5 14:02:29 2025)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...
warning: dependency cycle detected:
warning: python-soupsieve will be installed before its python-beautifulsoup4 dependency

Package (20)                     New Version  Net Change  Download Size

extra/libyaml                    0.2.5-3        0.16 MiB
extra/mpdecimal                  4.0.0-2        0.29 MiB
core/python                      3.13.1-1     108.57 MiB
extra/python-cffi                1.17.1-2       1.35 MiB
extra/python-charset-normalizer  3.4.1-1        0.44 MiB
extra/python-filelock            3.16.1-2.1     0.13 MiB
extra/python-idna                3.10-2         0.88 MiB
extra/python-pycparser           2.22-3         1.69 MiB
extra/python-requests-file       2.1.0-2        0.02 MiB
extra/python-soupsieve           2.6-2          0.43 MiB
extra/python-urllib3             2.3.0-1        1.26 MiB
extra/python-zipp                3.21.0-2       0.08 MiB
extra/python-beautifulsoup4      4.12.3-3       1.62 MiB
extra/python-cryptography        44.0.0-1       5.12 MiB
extra/python-dnspython           1:2.6.1-2      3.35 MiB       0.48 MiB
extra/python-importlib-metadata  7.2.1-4.1      0.21 MiB       0.05 MiB
extra/python-pyotp               2.9.0-3        0.10 MiB       0.03 MiB
extra/python-requests            2.32.3-4.1     0.60 MiB
extra/python-tldextract          5.1.3-2        0.44 MiB       0.11 MiB
extra/python-yaml                6.0.2-2        0.91 MiB

Total Download Size:    0.67 MiB
Total Installed Size: 127.65 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-dnspython-1:2.6.1-2-any downloading...
 python-tldextract-5.1.3-2-any downloading...
 python-importlib-metadata-7.2.1-4.1-any downloading...
 python-pyotp-2.9.0-3-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing mpdecimal...
installing python...
Optional dependencies for python
    python-setuptools: for building Python packages using tooling that is usually bundled with Python
    python-pip: for installing Python packages using tooling that is usually bundled with Python
    python-pipx: for installing Python software not packaged on Arch Linux
    sqlite: for a default database integration [installed]
    xz: for lzma [installed]
    tk: for tkinter
installing python-soupsieve...
installing python-beautifulsoup4...
Optional dependencies for python-beautifulsoup4
    python-cchardet: alternative to autodetect character encodings
    python-chardet: to autodetect character encodings
    python-lxml: alternative HTML parser
    python-html5lib: alternative HTML parser
installing python-pycparser...
installing python-cffi...
Optional dependencies for python-cffi
    python-setuptools: "limited api" version checking in cffi.setuptools_ext
installing python-cryptography...
installing libyaml...
installing python-yaml...
installing python-charset-normalizer...
installing python-idna...
installing python-urllib3...
Optional dependencies for python-urllib3
    python-brotli: Brotli support
    python-brotlicffi: Brotli support
    python-h2: HTTP/2 support
    python-pysocks: SOCKS support
    python-zstandard: Zstandard support
installing python-requests...
Optional dependencies for python-requests
    python-chardet: alternative character encoding library
    python-pysocks: SOCKS proxy support
installing python-requests-file...
installing python-filelock...
installing python-tldextract...
installing python-zipp...
installing python-importlib-metadata...
installing python-pyotp...
installing python-dnspython...
Optional dependencies for python-dnspython
    python-cryptography: DNSSEC support [installed]
    python-requests-toolbelt: DoH support
    python-idna: support for updated IDNA 2008 [installed]
    python-curio: async support
    python-trio: async support
    python-sniffio: async support
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (51)                    New Version     Net Change  Download Size

core/dnssec-anchors             20190629-4        0.00 MiB       0.00 MiB
extra/jemalloc                  1:5.3.0-5         6.08 MiB
core/libedit                    20240808_3.1-1    0.25 MiB
extra/libmaxminddb              1.11.0-1          0.04 MiB
extra/liburcu                   0.15.0-1          0.69 MiB
extra/libuv                     1.49.2-1          0.59 MiB
extra/libxslt                   1.1.42-2          0.77 MiB
extra/perl-error                0.17029-7         0.04 MiB
extra/perl-mailtools            2.22-1            0.10 MiB
extra/perl-timedate             2.33-7            0.08 MiB
extra/python-attrs              23.2.0-4          0.54 MiB
extra/python-botocore           1.35.88-1       100.78 MiB       6.90 MiB
extra/python-certifi            2024.12.14-1      0.02 MiB
extra/python-click              8.1.7-4           1.18 MiB
extra/python-dateutil           2.9.0-6.1         1.00 MiB
extra/python-fastjsonschema     2.21.1-1          0.27 MiB
extra/python-iniconfig          2.0.0-6           0.04 MiB
extra/python-isodate            0.7.2-1           0.17 MiB
extra/python-jmespath           1.0.1-5           0.21 MiB       0.04 MiB
extra/python-lark-parser        1.2.2-3           1.24 MiB
extra/python-lxml               5.3.0-2           4.52 MiB
extra/python-markdown-it-py     3.0.0-4.1         0.68 MiB       0.14 MiB
extra/python-mdurl              0.1.2-8           0.06 MiB
extra/python-multidict          6.0.5-4           0.16 MiB
extra/python-packaging          24.2-3            0.66 MiB
extra/python-platformdirs       4.3.6-2           0.24 MiB
extra/python-pluggy             1.5.0-3           0.20 MiB
extra/python-prettytable        3.11.0-2          0.35 MiB       0.06 MiB
extra/python-prompt_toolkit     3.0.48-2          4.39 MiB       0.70 MiB
extra/python-pygments           2.18.0-3         14.14 MiB
extra/python-pyproject-hooks    1.2.0-3           0.10 MiB
extra/python-pytz               2024.2-2          0.15 MiB
extra/python-requests-toolbelt  1.0.0-3           0.43 MiB
extra/python-rich               13.9.4-3          3.13 MiB       0.52 MiB
extra/python-s3transfer         0.10.4-1          0.91 MiB       0.14 MiB
extra/python-six                1.16.0-10         0.12 MiB
extra/python-typing_extensions  4.12.2-3          0.42 MiB
extra/python-wcwidth            0.2.13-3          0.57 MiB
extra/python-wrapt              1.16.0-4          0.25 MiB       0.05 MiB
extra/python-yarl               1.9.4-4           0.31 MiB
extra/bind                      9.20.4-2          6.61 MiB       2.22 MiB
extra/git                       2.47.1-1         27.20 MiB
extra/python-boto3              1.35.88-1         1.53 MiB       0.15 MiB
extra/python-build              1.2.2-3           0.20 MiB
extra/python-installer          0.7.0-10          0.17 MiB
extra/python-localzone          0.9.8-6           0.07 MiB       0.02 MiB
extra/python-poetry-core        1.9.1-1           1.28 MiB
extra/python-pytest             1:8.3.4-1         3.92 MiB
extra/python-softlayer          6.1.4-5           5.04 MiB       0.72 MiB
extra/python-vcrpy              6.0.1-4           0.43 MiB       0.09 MiB
extra/python-zeep               4.3.1-1           1.22 MiB       0.21 MiB

Total Download Size:   11.96 MiB
Total Installed Size: 193.53 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-botocore-1.35.88-1-any downloading...
 bind-9.20.4-2-riscv64 downloading...
 python-softlayer-6.1.4-5-any downloading...
 python-prompt_toolkit-3.0.48-2-any downloading...
 python-rich-13.9.4-3-any downloading...
 python-zeep-4.3.1-1-any downloading...
 python-boto3-1.35.88-1-any downloading...
 python-markdown-it-py-3.0.0-4.1-any downloading...
 python-s3transfer-0.10.4-1-any downloading...
 python-vcrpy-6.0.1-4-any downloading...
 python-prettytable-3.11.0-2-any downloading...
 python-wrapt-1.16.0-4-riscv64 downloading...
 python-jmespath-1.0.1-5-any downloading...
 python-localzone-0.9.8-6-any downloading...
 dnssec-anchors-20190629-4-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing perl-error...
installing perl-timedate...
installing perl-mailtools...
installing git...
Optional dependencies for git
    tk: gitk and git gui
    openssh: ssh transport and crypto
    perl-libwww: git svn
    perl-term-readkey: git svn and interactive.singlekey setting
    perl-io-socket-ssl: git send-email TLS support
    perl-authen-sasl: git send-email TLS support
    perl-mediawiki-api: git mediawiki support
    perl-datetime-format-iso8601: git mediawiki support
    perl-lwp-protocol-https: git mediawiki https support
    perl-cgi: gitweb (web interface) support
    python: git svn & git p4 [installed]
    subversion: git svn
    org.freedesktop.secrets: keyring credential helper
    libsecret: libsecret credential helper [installed]
installing python-packaging...
installing python-pyproject-hooks...
installing python-build...
Optional dependencies for python-build
    python-pip: to use as the Python package installer (default)
    python-uv: to use as the Python package installer
    python-virtualenv: to use virtualenv for build isolation
installing python-installer...
installing python-fastjsonschema...
installing python-typing_extensions...
installing python-lark-parser...
Optional dependencies for python-lark-parser
    python-atomicwrites: for atomic_cache
    python-regex: for regex support
installing python-poetry-core...
installing python-iniconfig...
installing python-pluggy...
installing python-pytest...
installing python-wrapt...
installing python-multidict...
installing python-yarl...
installing python-vcrpy...
installing python-certifi...
installing python-six...
installing python-dateutil...
installing python-jmespath...
installing python-botocore...
Optional dependencies for python-botocore
    python-awscrt
installing python-s3transfer...
Optional dependencies for python-s3transfer
    python-awscrt
installing python-boto3...
Optional dependencies for python-boto3
    python-awscrt: AWS CRT S3 transfers
installing python-localzone...
installing python-wcwidth...
installing python-prettytable...
installing python-click...
installing python-pygments...
installing python-prompt_toolkit...
installing python-mdurl...
installing python-markdown-it-py...
Optional dependencies for python-markdown-it-py
    python-mdit_py_plugins: core plugins
    python-linkify-it-py: linkify extension
installing python-rich...
installing python-softlayer...
installing python-attrs...
installing python-isodate...
installing libxslt...
Optional dependencies for libxslt
    python: Python bindings [installed]
installing python-lxml...
Optional dependencies for python-lxml
    python-beautifulsoup4: support for beautifulsoup parser to parse not well formed HTML [installed]
    python-cssselect: support for cssselect
    python-html5lib: support for html5lib parser
    python-lxml-docs: offline docs
    python-lxml-html-clean: enable htmlclean feature
installing python-platformdirs...
installing python-requests-toolbelt...
installing python-pytz...
installing python-zeep...
installing dnssec-anchors...
installing libedit...
installing libmaxminddb...
Optional dependencies for libmaxminddb
    geoip2-database: IP geolocation databases
installing libuv...
installing jemalloc...
Optional dependencies for jemalloc
    perl: for jeprof [installed]
installing liburcu...
installing bind...
:: Running post-transaction hooks...
(1/5) Creating system user accounts...
Creating group 'named' with GID 40.
Creating user 'named' (BIND DNS Server) with UID 40 and GID 40.
Creating group 'git' with GID 972.
Creating user 'git' (git daemon user) with UID 972 and GID 972.
(2/5) Reloading system manager configuration...
     Skipped: Current root is not booted.
(3/5) Creating temporary files...
(4/5) Arming ConditionNeedsUpdate...
(5/5) Warn about old perl modules
==> Retrieving sources...
==> WARNING: Skipping all source file integrity checks.
==> Extracting sources...
  -> Creating working copy of lexicon git repo...
Cloning into 'lexicon'...
done.
Updating files: 100% (2411/2411), done.
Switched to a new branch 'makepkg'
==> Starting build()...
* Getting build dependencies for wheel...
* Building wheel...
Successfully built dns_lexicon-3.18.0-py3-none-any.whl
==> Starting check()...
============================= test session starts ==============================
platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
rootdir: /build/dns-lexicon/src/lexicon
configfile: pyproject.toml
collected 2422 items / 56 deselected / 2366 selected

tests/providers/test_aliyun.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [  1%]
tests/providers/test_aurora.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [  2%]
tests/providers/test_auto.py .F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [  3%]
tests/providers/test_azure.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [  4%]
tests/providers/test_cloudflare.py F.FFFFFFFFFFFFFFFssFFFFFFFFF            [  5%]
tests/providers/test_cloudns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF               [  7%]
tests/providers/test_cloudxns.py F.FFFFFFFFFFFFFFFssFFFFFFsFF              [  8%]
tests/providers/test_conoha.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [  9%]
tests/providers/test_constellix.py F.FFFFFFFFFFFFFFFssFFFFFFFFF            [ 10%]
tests/providers/test_ddns.py ssssssssssssssssssssssssssss                  [ 11%]
tests/providers/test_digitalocean.py F.FFFFFFFFFFFFFFFssFFFFFFsFF          [ 13%]
tests/providers/test_dinahosting.py F.FFFFFFFFFFFFFsFssFFFFFFFFF           [ 14%]
tests/providers/test_directadmin.py F.FFFFFFFFFFFFFFFssFFFFFFFFF           [ 15%]
tests/providers/test_dnsimple.py F.FFFFFFFFFFFFFFFssFFFFFFFFF              [ 16%]
tests/providers/test_dnsmadeeasy.py F.FFFFFFFFFFFFFFFssFFFFFFsFF           [ 17%]
tests/providers/test_dnspark.py F.FFFFFFFFFsssFFFFFsFF                     [ 18%]
tests/providers/test_dnspod.py F.FFFFFFFFFsssFFFFFsFF                      [ 19%]
tests/providers/test_dnsservices.py F.FFFFFFFFFFFFFFFssFFFFFFFFF           [ 20%]
tests/providers/test_dreamhost.py ...F.FFFFFFFFFFFFFsFssFFFFFFFFF          [ 22%]
tests/providers/test_duckdns.py .ssFsFFFsFFFFFssssssFF.F.sFss              [ 23%]
tests/providers/test_dynu.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                  [ 24%]
tests/providers/test_easydns.py F.FFFFFFFFFFssFFFFFsFF                     [ 25%]
tests/providers/test_easyname.py F.FFFFFFFFFFFFFFFssFFFFFFFFF              [ 26%]
tests/providers/test_euserv.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [ 27%]
tests/providers/test_exoscale.py F.FFFFFFFFFFFFFFFssFFFFFFFFF              [ 29%]
tests/providers/test_flexibleengine.py F.FFFFFFFFFFFsFFsssssFsFFFFF        [ 30%]
tests/providers/test_gandi.py .................ss.........F.FFFFFFFFFFFF   [ 31%]
FFFssFFFFFFFFF                                                             [ 32%]
tests/providers/test_gehirn.py F.FFFFFFFFFFssFFFFFFFF                      [ 33%]
tests/providers/test_glesys.py F.FFFFFFFFFFssFFFFFsFF                      [ 34%]
tests/providers/test_godaddy.py F.FFFFFFFFFFFFFFFssFFFFFFFFF               [ 35%]
tests/providers/test_googleclouddns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF        [ 36%]
tests/providers/test_gransy.py FFFFFFFFFFFFFFFFFssFFFFFFFFF                [ 37%]
tests/providers/test_gratisdns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF             [ 39%]
tests/providers/test_henet.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 40%]
tests/providers/test_hetzner.py F.FFFFFFFFFFFFFFFssFFFFFFFFF               [ 41%]
tests/providers/test_hostingde.py F.FFFFFFFFFFFFFFFssFFFFFFFFF             [ 42%]
tests/providers/test_hover.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 43%]
tests/providers/test_infoblox.py F.FFFFFFFFFFFFFFFssFFFFFFFFF              [ 45%]
tests/providers/test_infomaniak.py F.FFFFFFFFFFFFFFFssFFFFFFFFF            [ 46%]
tests/providers/test_internetbs.py F.FFFFFFFFFFFFFFFssFFFFFFFFF            [ 47%]
tests/providers/test_inwx.py .................ss.........                  [ 48%]
tests/providers/test_joker.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 49%]
tests/providers/test_linode.py F.FFFFFFFFFFFFFsFssFFFFFFFFF                [ 51%]
tests/providers/test_linode4.py F.FFFFFFFFFFFFFsFssFFFFFFFFF               [ 52%]
tests/providers/test_localzone.py ss...............ss.........             [ 53%]
tests/providers/test_luadns.py F.FsFFFFFFFFFFFFFssFFFFFFsFF                [ 54%]
tests/providers/test_memset.py F.FFFFFFFFFFFFFFFssFFFFFFsFF                [ 55%]
tests/providers/test_misaka.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [ 56%]
tests/providers/test_mythicbeasts.py F.FFFFFFFFFFFFFFFssFFFFFFFFF          [ 58%]
tests/providers/test_namecheap.py ssssssssssssssssssssssssssssssssssssss   [ 59%]
ssssssssssssssssss                                                         [ 60%]
tests/providers/test_namecom.py F.FFFFFFFFFFFFFFFFFFFFFssFFFFFFFFFFFFFF.   [ 62%]
                                                                           [ 62%]
tests/providers/test_namesilo.py F.FFFFFFFFFFFFFFFssFFFFFFsFF              [ 63%]
tests/providers/test_netcup.py F.FFFFFFFFFFFFFsFssFFFFFFFFF                [ 64%]
tests/providers/test_nfsn.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                  [ 65%]
tests/providers/test_njalla.py F.FFFFFFsFFFFFsFFssFFFFFFFFF                [ 66%]
tests/providers/test_nsone.py F.FFFFFFFFFFFFFsFssFFFFFFsFF                 [ 68%]
tests/providers/test_onapp.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 69%]
tests/providers/test_online.py F.FFFFFFFFFFsFFFFssFFFFFFFFF                [ 70%]
tests/providers/test_ovh.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                   [ 71%]
tests/providers/test_plesk.py F.FFFFFFFFFFFFFsFssFFFFFFFFF                 [ 72%]
tests/providers/test_pointhq.py F.FFFFFFFFFFFFFsFssFFFFFFsFF               [ 74%]
tests/providers/test_porkbun.py F.FFFFFFFFFFFFFFFssFFFFFFFFF               [ 75%]
tests/providers/test_powerdns.py F.FFFFFFFFFFFFFFFssFFFFFFsFF              [ 76%]
tests/providers/test_rackspace.py F.FFFFFFFFFFFFFFFssFFFFFFFFF             [ 77%]
tests/providers/test_rage4.py F.FFFFFFFFFFFFFFFssFFFFFssFs                 [ 78%]
tests/providers/test_rcodezero.py F.FFFFFFFFFFFFFFFssFFFFFFFFF             [ 79%]
tests/providers/test_route53.py FF..FFFFFFFFFFFFFFFssFFFFFFFFF             [ 81%]
tests/providers/test_safedns.py F.FsFFFFFFFFFFFsFssFFFFFFFFF               [ 82%]
tests/providers/test_sakuracloud.py F.FFFFFFFFsFssFFFFFFFF                 [ 83%]
tests/providers/test_softlayer.py F.FFFFFFFFFFssFFFFFFFF                   [ 84%]
tests/providers/test_timeweb.py F.ssFFFFFFFFFFFsFssFFFFFFFFF               [ 85%]
tests/providers/test_transip.py F.FFFFFFFFFFFFFFFssFFFFFFFFF               [ 86%]
tests/providers/test_ultradns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF              [ 87%]
tests/providers/test_valuedomain.py ssssssssssssssssssssssssssss           [ 88%]
tests/providers/test_vercel.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [ 90%]
tests/providers/test_vultr.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 91%]
tests/providers/test_webgo.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 92%]
tests/providers/test_wedos.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                 [ 93%]
tests/providers/test_yandex.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [ 94%]
tests/providers/test_yandexcloud.py F.FFFFFFFFFFFFFFFssFFFFFFFFF           [ 96%]
tests/providers/test_zilore.py F.FFFFFFFFFFFFFFFssFFFFFFFFF                [ 97%]
tests/providers/test_zonomi.py F.FFFFFFFFFFssFFFFFsFF                      [ 98%]
tests/test_client.py ............                                          [ 98%]
tests/test_config.py .........                                             [ 99%]
tests/test_library.py ............                                         [ 99%]
tests/test_output.py .....                                                 [ 99%]
tests/test_parser.py .....                                                 [100%]

=================================== FAILURES ===================================
________________ AliyunProviderTests.test_provider_authenticate ________________

self = >

???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate
    ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
    ???
src/lexicon/_private/providers/aliyun.py:170: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...mp=2025-01-05T19%3A03%3A16%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=S4h3tEl9NrSrwLivfNQ84%2BC%2FXc0%3D'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn: a connection from one of our connection pools
        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers: Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
        :param retries: Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.

            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.

            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param timeout: If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
        :param chunked: If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
        :param response_conn: Set this to ``None`` if you will handle releasing the
            connection or set the connection to have the response release it.
        :param preload_content: If True, the response's body will be preloaded
            during construction.
        :param decode_content: If True, will attempt to decode the body based on the
            'content-encoding' header.
        :param enforce_content_length: Enforce content length checking. Body returned
            by server must match value of Content-Length header, if present.
            Otherwise, raise error.
        """
        self.num_requests += 1

        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)

        try:
            # Trigger any extra validation we need to do.
            try:
                self._validate_conn(conn)
            except (SocketTimeout, BaseSSLError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
                raise

        # _validate_conn() starts the connection to an HTTPS proxy
        # so we need to wrap errors with 'ProxyError' here too.
        except (
            OSError,
            NewConnectionError,
            TimeoutError,
            BaseSSLError,
            CertificateError,
            SSLError,
        ) as e:
            new_e: Exception = e
            if isinstance(e, (BaseSSLError, CertificateError)):
                new_e = SSLError(e)
            # If the connection didn't successfully connect to it's proxy
            # then there
            if isinstance(
                new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
            ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
                new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
            raise new_e

        # conn.request() calls http.client.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        try:
            conn.request(
                method,
                url,
                body=body,
                headers=headers,
                chunked=chunked,
                preload_content=preload_content,
                decode_content=decode_content,
                enforce_content_length=enforce_content_length,
            )

        # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
        # legitimately able to close the connection after sending a valid response.
        # With this behaviour, the received response is still readable.
        except BrokenPipeError:
            pass
        except OSError as e:
            # MacOS/Linux
            # EPROTOTYPE and ECONNRESET are needed on macOS
            # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
            # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
            if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
                raise

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = >

???

tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate
    ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
    ???
src/lexicon/_private/providers/aliyun.py:170: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A16%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=iJ9C9WM835Bsp5O79JDbN4eNZVU%3D'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        ...
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = >

???

tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate
    ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
    ???
src/lexicon/_private/providers/aliyun.py:170: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A16%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=zD1phRw3XSE%2BEffLflBffKqeAiI%3D'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        ...
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = >

???

tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate
    ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
    ???
src/lexicon/_private/providers/aliyun.py:170: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=szCG64Nlnzm7L3VmzP3LdJPVYNo%3D'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        ...
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=eUlIU1hIjDcUYbeTbPTdFKMKdJI%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=xg%2BOvg6rVfEqd3ImFYuJb7ATRSs%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
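Every failure above is the same crash: urllib3's connection pool logs the response with response.version_string (connectionpool.py:551 in the tracebacks), but during cassette replay the object it receives is vcrpy's VCRHTTPResponse, which does not define that attribute, so authenticate() never completes for any of these provider tests. A quick way to confirm the mismatch inside the build chroot is a one-off check along these lines (a hypothetical diagnostic sketch, not part of the PKGBUILD or the test suite; it only reports installed versions and whether the replay stub carries the attribute):

    # check_vcr_urllib3.py -- diagnostic sketch, not shipped with dns-lexicon
    from importlib.metadata import version

    from vcr.stubs import VCRHTTPResponse

    print("urllib3:", version("urllib3"))
    print("vcrpy:  ", version("vcrpy"))
    # A False here matches the AttributeError seen throughout this log.
    print("VCRHTTPResponse defines version_string:",
          hasattr(VCRHTTPResponse, "version_string"))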
_ AliyunProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
    response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AliyunProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
    response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AliyunProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
    response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
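If the cassette-based suites need to run against this urllib3 before a fixed vcrpy is available in the repositories, one possible stopgap is to give the replay stub the attribute urllib3 asks for. The following is only a sketch of a test-side shim (the conftest.py location and the HTTP/1.1 fallback are assumptions, and it is not something this package build applies):

    # conftest.py -- hypothetical local shim, not part of dns-lexicon or vcrpy
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 logs response.version_string (see the tracebacks above); older
        # vcrpy stubs predate it, so derive it from the numeric HTTP version.
        VCRHTTPResponse.version_string = property(
            lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", 11), "HTTP/1.1"
            )
        )

The cleaner packaging-level fixes would be updating python-vcrpy to a release that already provides version_string, or temporarily constraining urllib3, rather than patching the stub in the test tree.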
_ AliyunProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
    response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AliyunProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
    response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AliyunProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self =

    ???

tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate
    ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
    ???
src/lexicon/_private/providers/aliyun.py:170: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A18%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=KNzyhzpNeE3q62zxhh15fycsnC4%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
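Every Aliyun test in this run dies in the same place: the integration tests replay recorded cassettes through vcrpy, and the urllib3 in this chroot (2.3 or newer, judging by the response.version_string access at connectionpool.py:551) formats that attribute into its debug log line, while vcrpy's VCRHTTPResponse stand-in never defines it, so authenticate() fails before any provider logic runs. A minimal sketch of a local workaround, assuming vcrpy still exports VCRHTTPResponse from vcr.stubs and that nothing but this log call reads the attribute, would be to give the class a default from the test suite's conftest.py until a fixed vcrpy is packaged:

    # conftest.py -- hypothetical local shim, not the upstream fix.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Only urllib3's log.debug('%s://%s:%s "%s %s %s" %s %s', ...) call
        # reads this during cassette replay, so a static value is enough.
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumption: recorded responses are HTTP/1.1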
_ AliyunProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A19%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=NhL8myHumP4VmUfmbIBkpVRvMU0%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A19%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=RRdGVREKGmS81822egJXylosGis%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A19%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=bt547ore9WU61qfJ6xauhddenYo%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A19%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=z25NYXh287YW5%2F43AMQgOoW290M%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
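If patching a third-party class from conftest.py feels too invasive for a packaging check(), the same probe can instead deselect only the cassette-backed provider tests so the rest of the suite still gates the build. A sketch, assuming all affected tests live under tests/providers/ as the paths in these tracebacks suggest:

    # conftest.py sketch: skip VCR replay tests while vcrpy lags behind urllib3.
    import pytest
    from vcr.stubs import VCRHTTPResponse

    def pytest_collection_modifyitems(config, items):
        if hasattr(VCRHTTPResponse, "version_string"):
            return  # vcrpy already provides the attribute; run everything
        skip = pytest.mark.skip(
            reason="vcrpy lacks VCRHTTPResponse.version_string needed by urllib3 >= 2.3"
        )
        for item in items:
            if "tests/providers/" in str(item.path):
                item.add_marker(skip)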
_ AliyunProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...mp=2025-01-05T19%3A03%3A20%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=4LjLz%2F8YIRK5PJ%2BXrtKOvQm9TdU%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A20%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=bzDFsCFxZl5yWhlRL0nqCBBQwhE%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...mp=2025-01-05T19%3A03%3A20%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=EEVT0thscl1R1Z3d%2FGqKG%2Bd6KYI%3D' > response.version_string, E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
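Before patching or skipping anything it is worth confirming, inside the root16 chroot, that this is purely a urllib3/vcrpy version skew and not something Aliyun-specific; any provider test that replays a cassette should raise the identical AttributeError. A quick probe (standard-library importlib.metadata; only the distribution names urllib3 and vcrpy are assumed):

    # Print the library combination the failing tests actually see.
    from importlib.metadata import version
    from vcr.stubs import VCRHTTPResponse

    print("urllib3:", version("urllib3"))
    print("vcrpy:", version("vcrpy"))
    print("VCRHTTPResponse.version_string present:", hasattr(VCRHTTPResponse, "version_string"))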
_ AliyunProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ???
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
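Every one of these failures ends in the same place: urllib3's connection pool logs each completed request with response.version_string (visible in the _make_request source above), an attribute that recent urllib3 (2.3 and later) exposes on its responses, while the VCRHTTPResponse stub that vcrpy substitutes during cassette replay predates it. The problem is in the test tooling rather than in dns-lexicon itself; newer vcrpy releases add the attribute to the stub. A minimal reproduction sketch, independent of lexicon and using a hypothetical cassette path, would be:

# repro.py -- hypothetical sketch; assumes example.yaml holds a recorded
# interaction for this URL. Replaying it through vcrpy reaches the same
# log.debug() call in urllib3 >= 2.3 and should fail with the identical
# AttributeError while the installed vcrpy still lacks version_string.
import vcr
import requests

with vcr.use_cassette("tests/fixtures/example.yaml", record_mode="none"):
    requests.get("https://example.com/")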
_ AliyunProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
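Until a fixed vcrpy lands in the repositories, one way to let the rest of the suite run is a small compatibility shim in a local conftest.py. This is only a sketch: it assumes the stub is importable as vcr.stubs.VCRHTTPResponse (as in current vcrpy) and relies on the attribute being used solely in the debug log line quoted above, so a fixed value is harmless.

# conftest.py -- hypothetical local workaround, not part of dns-lexicon upstream.
# urllib3 >= 2.3 logs response.version_string after every request; older vcrpy
# replay stubs predate the attribute, so patch in a harmless default.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # Only interpolated into urllib3's '%s://%s:%s "%s %s %s" %s %s' debug line.
    VCRHTTPResponse.version_string = "HTTP/1.1"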
_ AliyunProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aliyun.py:46: in authenticate
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun
src/lexicon/_private/providers/aliyun.py:170: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
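If patching the stub is not wanted, another option (again just a sketch, with hypothetical names and a path check that assumes the tests live under tests/providers/) is to skip the cassette-backed provider tests whenever the installed vcrpy still lacks the attribute, so the remainder of the test run can complete:

# conftest.py -- hypothetical guard: skip the recorded-provider tests when the
# installed vcrpy predates urllib3's version_string attribute.
import pytest
from vcr.stubs import VCRHTTPResponse

_VCR_TOO_OLD = not hasattr(VCRHTTPResponse, "version_string")

def pytest_collection_modifyitems(config, items):
    if not _VCR_TOO_OLD:
        return
    skip = pytest.mark.skip(reason="vcrpy stub lacks version_string (urllib3 >= 2.3)")
    for item in items:
        if "tests/providers/" in str(item.fspath):
            item.add_marker(skip)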
________________ AuroraProviderTests.test_provider_authenticate ________________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aurora.py:40: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aurora.py:40: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
   [identical requests/urllib3 frames and _make_request source as in the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/aurora.py:40: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? 
tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
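These AuroraProviderTests failures all collapse to the same incompatibility: this urllib3 logs response.version_string for every completed request (connectionpool.py:551), while the VCRHTTPResponse stub that vcrpy substitutes when replaying the recorded cassettes does not define that attribute. The snippet below is a hypothetical conftest.py-level shim, not something taken from this build or from either upstream project; it assumes vcrpy exposes VCRHTTPResponse from vcr.stubs and that the stub carries an http.client-style integer version attribute, falling back to HTTP/1.1 otherwise.

    # conftest.py -- hypothetical compatibility shim (not part of this build log)
    try:
        from vcr.stubs import VCRHTTPResponse
    except ImportError:
        VCRHTTPResponse = None  # vcrpy not installed; nothing to patch

    if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):

        def _version_string(self):
            # Assumption: the stub mirrors http.client's integer `version`
            # attribute (10 or 11); fall back to "HTTP/1.1" when it is absent.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

        # Give the replayed response the attribute newer urllib3 expects to log.
        VCRHTTPResponse.version_string = property(_version_string)

Bumping python-vcrpy to a release that ships its own version_string support, or holding urllib3 back in the build chroot, would make a shim like this unnecessary.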
[ The identical traceback repeats for each of the following AuroraProviderTests cases: the
  cassette-recorded GET /zones request is replayed, urllib3's _make_request() reaches the same
  log.debug() call, and AttributeError: 'VCRHTTPResponse' object has no attribute
  'version_string' is raised from /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551.
  Only the originating test line differs; the duplicated listings are elided: ]

  AuroraProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record    tests/providers/integration_tests.py:313
  AuroraProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record               tests/providers/integration_tests.py:294
  AuroraProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched    tests/providers/integration_tests.py:541
  AuroraProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all                  tests/providers/integration_tests.py:521
  AuroraProviderTests.test_provider_when_calling_list_records_after_setting_ttl                                 tests/providers/integration_tests.py:211
  AuroraProviderTests.test_provider_when_calling_list_records_should_handle_record_sets                         tests/providers/integration_tests.py:507
  AuroraProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record        tests/providers/integration_tests.py:199
  AuroraProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record        tests/providers/integration_tests.py:185
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...JrN3hVMEt6VlhnaEs5bFRqZE5aSVVBc1VMdzQreEtvZWlWQ3F1SVhqdnM9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...dOQ0d3U3o1TmdLVENIaVVxci8yQUVXcWxlNGZZSE15WnhqWXlQYlNRQjA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? 
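Every failure above, and every one that follows, bottoms out in the same frame: urllib3's connection pool reads response.version_string while assembling its debug log line, and the replayed VCRHTTPResponse stub that vcrpy substitutes for the real response does not define that attribute. A minimal sketch of the failure mode, using a hypothetical StubResponse stand-in rather than vcrpy's actual class:

import logging

log = logging.getLogger("urllib3.connectionpool")


class StubResponse:
    """Stand-in for a replayed response object that lacks version_string."""
    status = 200
    length_remaining = 0


resp = StubResponse()

try:
    # Mirrors the failing frame: the attribute is read while the log
    # arguments are evaluated, before log.debug itself is entered.
    log.debug("%s %s %s", resp.version_string, resp.status, resp.length_remaining)
except AttributeError as exc:
    print(exc)  # 'StubResponse' object has no attribute 'version_string'

Because the attribute access happens while the log arguments are built, the error is raised regardless of the configured log level; it cannot be silenced by turning debug logging off.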
_ AuroraProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240: identical traceback
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AuroraProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

tests/providers/integration_tests.py:251: identical traceback
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AuroraProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275: identical traceback
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AuroraProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259: identical traceback
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
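Since all of these tests fail on the same missing attribute rather than on any provider logic, one way to unblock the test run is to backfill the attribute before pytest starts. The following is an untested sketch for a conftest.py, assuming the stub class is importable as vcr.stubs.VCRHTTPResponse (as in current vcrpy); upgrading python-vcrpy to a release that ships its own urllib3 compatibility fix would be cleaner than carrying a patch like this in the PKGBUILD.

# conftest.py (sketch): backfill version_string on vcrpy's stubbed response
# for vcrpy releases that predate urllib3's version_string logging.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    def _version_string(self):
        # urllib3 only uses this attribute for its debug log line; derive it
        # from the http.client-style numeric version if the stub carries one.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/?")

    VCRHTTPResponse.version_string = property(_version_string)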
_________________ AutoProviderTests.test_provider_authenticate _________________

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/auto.py:242: in authenticate
src/lexicon/_private/providers/ovh.py:90: in authenticate
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
    (full _make_request source and docstring repeated verbatim, as above; here
    the stubbed request is GET /1.0/auth/time via the ovh provider)
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126: identical traceback
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AutoProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? 
src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? 
tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_after_setting_ttl __ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
_ AutoProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> src/lexicon/_private/providers/auto.py:242: in authenticate -> src/lexicon/_private/providers/ovh.py:90: in authenticate -> requests/sessions.py -> requests/adapters.py:667 -> urllib3/connectionpool.py:787: in urlopen -> connectionpool.py:551: in _make_request (GET /1.0/auth/time; same _make_request listing as the traceback above)
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ AutoProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259 -> same call chain and _make_request listing as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

____________________ AzureTests.test_provider_authenticate _____________________
tests/providers/integration_tests.py:106 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> src/lexicon/_private/providers/azure.py:266: in authenticate -> requests/api.py:115 -> requests/sessions.py -> requests/adapters.py:667 -> urllib3/connectionpool.py:787: in urlopen -> connectionpool.py:551: in _make_request (POST /placeholder_auth_tenant_id/oauth2/token; same _make_request listing as above)
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
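The mismatch is easy to confirm from the build chroot's Python before patching anything; the probe below is a sketch under the assumption that the VCRHTTPResponse named in the error comes from the vcrpy distribution and that urllib3 is the other side of the incompatibility.

# Small probe, assuming the same Python environment that ran this test suite.
from importlib.metadata import version

from vcr.stubs import VCRHTTPResponse

print("urllib3:", version("urllib3"), " vcrpy:", version("vcrpy"))
# The stub that vcrpy substitutes for urllib3's response lacks the attribute
# read at connectionpool.py:551, which is the AttributeError repeated above;
# a vcrpy build that defines version_string would print True here.
print("version_string defined on stub:", hasattr(VCRHTTPResponse, "version_string"))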
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? 
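Every Azure test fails at the same point: urllib3 2.3's connection pool logs response.version_string at connectionpool.py:551, an attribute the vcrpy playback stub (VCRHTTPResponse) used by these cassette-based tests does not define. The following probe is illustrative only and is not part of the build output; it simply checks both classes for the attribute using the packages installed in the chroot:

    # probe_version_string.py -- illustrative only, not part of the build log
    import urllib3
    from urllib3.response import HTTPResponse
    from vcr.stubs import VCRHTTPResponse

    print("urllib3", urllib3.__version__)
    # urllib3 >= 2.3 reads response.version_string when logging a completed request
    print("urllib3 HTTPResponse defines version_string:",
          hasattr(HTTPResponse, "version_string"))
    # vcrpy's playback stub stands in for that response during cassette replay
    print("VCRHTTPResponse defines version_string:",
          hasattr(VCRHTTPResponse, "version_string"))

If the second check prints False while the first prints True, the installed python-vcrpy predates urllib3's version_string attribute, which matches the AttributeError repeated below.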
_ AzureTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
[identical requests -> urllib3 _make_request traceback elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
[identical requests -> urllib3 _make_request traceback elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
[identical requests -> urllib3 _make_request traceback elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
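Until the chroot picks up a python-vcrpy release that knows about version_string, one possible local shim is to backfill the attribute from the test suite's conftest.py. This is a sketch under the assumption that the stub otherwise behaves like http.client's response and exposes a numeric version; upgrading python-vcrpy remains the cleaner fix:

    # conftest.py -- sketch of a compatibility shim, not the packaged fix
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        @property
        def version_string(self):
            # Map the http.client-style numeric version (10/11) back to the
            # string form urllib3 >= 2.3 expects to log.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/?")

        VCRHTTPResponse.version_string = version_string

With such a shim in place, the log.debug call at connectionpool.py:551 should no longer raise, and the cassette-driven provider tests can proceed past authentication.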
_ AzureTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
[identical requests -> urllib3 _make_request traceback elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
[identical requests -> urllib3 _make_request traceback elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
[identical requests -> urllib3 _make_request traceback elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ???
tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/azure.py:266: in authenticate ???
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
        :param response_conn:
            Set this to ``None`` if you will handle releasing the connection or
            set the connection to have the response release it.

        :param preload_content:
            If True, the response's body will be preloaded during construction.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.

        :param enforce_content_length:
            Enforce content length checking. Body returned by server must match
            value of Content-Length header, if present. Otherwise, raise error.
        """
        self.num_requests += 1

        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)

        try:
            # Trigger any extra validation we need to do.
            try:
                self._validate_conn(conn)
            except (SocketTimeout, BaseSSLError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
                raise

        # _validate_conn() starts the connection to an HTTPS proxy
        # so we need to wrap errors with 'ProxyError' here too.
        except (
            OSError,
            NewConnectionError,
            TimeoutError,
            BaseSSLError,
            CertificateError,
            SSLError,
        ) as e:
            new_e: Exception = e
            if isinstance(e, (BaseSSLError, CertificateError)):
                new_e = SSLError(e)
            # If the connection didn't successfully connect to it's proxy
            # then there
            if isinstance(
                new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
            ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
                new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
            raise new_e

        # conn.request() calls http.client.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        try:
            conn.request(
                method,
                url,
                body=body,
                headers=headers,
                chunked=chunked,
                preload_content=preload_content,
                decode_content=decode_content,
                enforce_content_length=enforce_content_length,
            )

        # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
        # legitimately able to close the connection after sending a valid response.
        # With this behaviour, the received response is still readable.
        except BrokenPipeError:
            pass
        except OSError as e:
            # MacOS/Linux
            # EPROTOTYPE and ECONNRESET are needed on macOS
            # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
            # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
            if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
                raise

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = >

???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/azure.py:266: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token'
body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    [... signature, docstring, and body of urllib3's _make_request repeated verbatim, identical to the dump shown above ...]
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
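Every failure in this run has the same shape: urllib3's _make_request() (connectionpool.py:551, apparently a urllib3 2.3+ build, which added version_string) logs response.version_string, but during cassette playback the response object is vcrpy's VCRHTTPResponse, which does not provide that attribute, so the AttributeError fires before the recorded response ever reaches the provider code. A minimal local shim along the following lines would likely unblock the suite until vcrpy ships its own fix; the vcr.stubs import path and the mapping from the numeric HTTP version are assumptions on my part, not the upstream solution.

    # conftest.py -- hypothetical local shim, not the upstream vcrpy fix.
    # Assumes the only incompatibility is the missing attribute read by urllib3's
    # log.debug() call, and that the stub class lives at vcr.stubs.VCRHTTPResponse.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):

        def _version_string(self) -> str:
            # http.client reports the HTTP version as an int (10 or 11);
            # translate it to the string urllib3 expects to log.
            version = getattr(self, "version", None)
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(version, "HTTP/?")

        VCRHTTPResponse.version_string = property(_version_string)

Dropping that into the test root only papers over the logging call; pinning urllib3 below 2.3 in the build chroot, or waiting for a vcrpy release that adds the attribute itself, would be the cleaner options.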
_ AzureTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = >

???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/azure.py:266: in authenticate
    ???
    [... requests/urllib3 frames, request locals, and _make_request source identical to the first failure above ...]
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = >

???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/azure.py:266: in authenticate
    ???
    [... requests/urllib3 frames, request locals, and _make_request source identical to the first failure above ...]
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self = >

???
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/azure.py:266: in authenticate
    ???
    [... requests/urllib3 frames, request locals, and _make_request source identical to the first failure above ...]
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = >

???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/azure.py:266: in authenticate
    ???
    [... requests/urllib3 frames, request locals, and _make_request source identical to the first failure above ...]
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
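If it is worth confirming the version pairing inside the chroot before patching or pinning anything, a throwaway check like this (purely illustrative, not part of the package) reports whether the installed vcrpy already exposes the attribute urllib3 wants to log:

    # Hypothetical one-off check for the build environment.
    from importlib.metadata import version
    from vcr.stubs import VCRHTTPResponse

    print("urllib3:", version("urllib3"), "vcrpy:", version("vcrpy"))
    # hasattr on the class is only a heuristic: a fixed vcrpy could also set
    # the attribute per instance rather than as a class-level property.
    print("version_string present:", hasattr(VCRHTTPResponse, "version_string"))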
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _____ AzureTests.test_provider_when_calling_list_records_after_setting_ttl _____ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? 
_ AzureTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen

method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token'
body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)

E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
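The remaining Azure tests fail with the identical frame, so the failure count here tracks the number of cassette-replayed requests rather than distinct problems. When triaging this kind of breakage inside the chroot, a quick probe of both sides of the mismatch is usually enough to confirm a version skew; the vcr.stubs import path below is an assumption about vcrpy's layout, the rest uses standard library and urllib3 public modules:

# Hypothetical version probe for the build chroot: a urllib3 new enough to
# define version_string paired with a vcrpy stub that lacks it reproduces
# exactly the AttributeError seen in these tests.
from importlib.metadata import version

from urllib3.response import BaseHTTPResponse
from vcr.stubs import VCRHTTPResponse  # assumed vcrpy module layout

print("urllib3", version("urllib3"), "/ vcrpy", version("vcrpy"))
print("urllib3 response has version_string:", hasattr(BaseHTTPResponse, "version_string"))
print("vcrpy stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))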
_ AzureTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
___ AzureTests.test_provider_when_calling_update_record_should_modify_record ___

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
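If neither dependency can move for this rebuild, a pragmatic stopgap is to skip the cassette-driven provider suites during check() until the urllib3/vcrpy pair in the repositories agrees again. A sketch of a conftest hook for that follows; the "azure" keyword is a placeholder for whichever providers' cassettes fail and is not taken from the package:

# conftest.py -- hypothetical stopgap, not part of dns-lexicon: skip
# cassette-replay tests while the packaged vcrpy lacks version_string support.
import pytest


def pytest_collection_modifyitems(config, items):
    skip_vcr = pytest.mark.skip(
        reason="vcrpy stub lacks version_string required by urllib3 >= 2.3"
    )
    for item in items:
        # Placeholder selector: widen or narrow to the failing provider suites.
        if "azure" in item.nodeid.lower():
            item.add_marker(skip_vcr)

Deselecting on the command line with pytest's -k or --deselect options would achieve the same result without touching the source tree.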
_ AzureTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
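Aside: the Retry and Timeout values in the locals above are ordinary urllib3 objects that requests builds internally for each send. Purely as a point of reference, here is a minimal sketch of constructing the same defaults directly against urllib3's public API; the example.org URL is an illustration only.

from urllib3 import PoolManager
from urllib3.util.retry import Retry
from urllib3.util.timeout import Timeout

# Mirror the values shown in the traceback locals: no retries, no explicit deadlines.
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None)

# PoolManager applies these defaults to every request it issues.
http = PoolManager(retries=retries, timeout=timeout)
resp = http.request("GET", "https://example.org/")
print(resp.status)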
_ AzureTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = <...>

>   ???
tests/providers/integration_tests.py:259: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/azure.py:266: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <...>
conn = <...>
method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token'
body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = <...>
preload_content = False, decode_content = False, enforce_content_length = True

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
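For orientation, the failing POST above is the Azure provider's OAuth2 client-credentials token request. Only the request path is visible in the log, so the login.microsoftonline.com host below is an assumption, and the credential values are the cassette placeholders; treat this as a rough sketch of the call, not lexicon's actual implementation.

import requests

# Hypothetical reconstruction of the token request seen in the locals above.
# Host is assumed (not visible in the log); credentials are the recorded placeholders.
tenant_id = "placeholder_auth_tenant_id"
resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "placeholder_auth_client_id",
        "client_secret": "placeholder_auth_client_secret",
        "resource": "https://management.azure.com",
    },
    timeout=30,
)
resp.raise_for_status()
token = resp.json().get("access_token")  # standard OAuth2 token response field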
______________ CloudflareProviderTests.test_provider_authenticate ______________

self = <...>

>   ???
tests/providers/integration_tests.py:106: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/cloudflare.py:56: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/cloudflare.py:210: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <...>
conn = <...>
method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = <...>
preload_content = False, decode_content = False, enforce_content_length = True

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
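Every failure in this run is the same incompatibility: urllib3's connection pool now logs response.version_string, and the VCRHTTPResponse stub that vcrpy substitutes during cassette playback does not define that attribute. The proper fix is a vcrpy release that knows about the attribute; purely as an illustration of a local stopgap, a conftest.py shim along the following lines could patch it in. The vcr.stubs import path and the version-to-string mapping are assumptions, not the change shipped in this package.

# conftest.py -- illustrative stopgap only, not the packaged fix.
from vcr.stubs import VCRHTTPResponse  # assumed import path for the stub class


def _version_string(self) -> str:
    # urllib3 reads response.version_string for its debug log line; map the
    # integer HTTP version the stub records (10/11) to the string form.
    return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")


if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = property(_version_string)  # type: ignore[attr-defined]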
_ CloudflareProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = <...>

>   ???
tests/providers/integration_tests.py:126: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/cloudflare.py:56: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/cloudflare.py:210: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapte