This commit is contained in:
parent 4fa654e177
commit c4e6ea37a8
674 LICENSE Normal file
@@ -0,0 +1,674 @@
                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Use with the GNU Affero General Public License.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

  Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

  If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
|
||||
|
||||
Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
15. Disclaimer of Warranty.
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
16. Limitation of Liability.
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
17. Interpretation of Sections 15 and 16.
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) <year> <name of author>
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
<program> Copyright (C) <year> <name of author>
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<https://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<https://www.gnu.org/licenses/why-not-lgpl.html>.
|
123
README.md
@@ -1,2 +1,121 @@
# NETFLIX-DL-6.1.0

Scripts to download Netflix titles in HDR, HEVC, HPL and MPL.

<p align="center">
<img width="200" src="https://www.freepnglogos.com/uploads/netflix-logo-0.png">
</p>

<h1> Netflix-DL 6.0 |HDR-HEVC-MPL-HPL Working| ! <img src = "https://raw.githubusercontent.com/MartinHeinz/MartinHeinz/master/wave.gif" width = 30px> </h1>
<p align='center'>
</p>

# Use Python 3.9.8 & install

## Quick start

```
install.requirements.bat
```

```
For Chrome, install the following addon:
https://chrome.google.com/webstore/detail/get-cookiestxt/bgaddhkoddajcdgocldbbfleckgcbcid?hl=en
```

```
Export cookies.txt from the Netflix site (you need to be logged in in the
browser) and save it as cookies.txt in
\configs\Cookies
```

```
Now edit config.py at the following path:
\configs\config.py
lines 108-109:
    "email": "xxx",
    "password": "xxxx",
Put your login details there.
```

```
Now enjoy using it.
```

Examples

```
netflix.pyc 81478916 -q 1080 --hdr
netflix.pyc 81478916 -q 1080 --hevc
netflix.pyc 81478916 -q 1080 --main
netflix.pyc 81478916 -q 1080 --high
```

USAGE:

```
-h, --help          Show the full help for all parameters and exit
-q <number>         Video resolution; the highest (1080) is selected by default.
                    Options: 480, 720, 1080, etc.
-o <directory path>
                    Temporary download folder
-f <directory path>
                    Output folder for the muxed mkv; if not specified, the
                    temporary download folder is used
-s <number>         Season number; if not specified, all seasons are downloaded
-e <number>         Episode number; if not specified, the whole season is
                    downloaded.
                    "-e 1" downloads episode 1;
                    "-e 1-7" downloads episodes 1-7;
                    "-e 2,5" downloads episodes 2 and 5
-p, --prompt
                    Interactively prompt for yes/no before downloading
--AD <language code>, --alang <language code>
                    Audio track language; by default the highest-bitrate track
                    in the original language (Original) is downloaded.
                    Language codes are listed in "/helpers/Muxer.py"
--slang <language code>
                    Subtitle language; by default subtitles in all languages are
                    downloaded. For example, "--slang zhoS zhoT" selects
                    Simplified and Traditional Chinese subtitles
--flang <language code>
                    Forced-subtitle language (Force Subtitle)
--all-audios
                    Download audio tracks in all languages
--all-forced
                    Download forced subtitles in all languages
--audio-bitrate <number>
                    Audio bitrate; the highest-bitrate track is downloaded by
                    default. Options: 128, 256, 448, etc.
--aformat-2c, --audio-format-2ch
                    Download 2.0-channel audio tracks
--aformat-51ch, --audio-format-51ch
                    Download 5.1-channel audio tracks
--keep              Keep the original stream files after muxing the mkv; they
                    are deleted by default
-keys, --license
                    Only print the Widevine keys to the console and exit
--no-aria2c         Use the Python downloader instead of calling aria2c
                    (aria2c is used by default).
                    This parameter is not recommended
--nv                Do not download the video (Video)
--na                Do not download audio (Audio)
--ns                Do not download subtitles (Subtitle)

Additional manifest parameters (Manifest):
--main              Request H.264 Main
--high              Request H.264 High
--hevc              Request H.265
--hdr               Request H.265 HDR
--check             Compare the quality of H.264 Main/H.264 High
```
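The `-e` episode-selection syntax described above ("1", "1-7", "2,5") can be parsed with a few lines of Python. This is an illustrative sketch only; `parse_episodes` is a hypothetical helper, not the tool's actual implementation:

```python
def parse_episodes(spec: str) -> list:
    """Parse an episode spec like "1", "1-7" or "2,5" into a sorted list."""
    episodes = set()
    for part in spec.split(","):
        if "-" in part:
            # A range such as "1-7" expands to every episode in it.
            start, end = part.split("-")
            episodes.update(range(int(start), int(end) + 1))
        else:
            episodes.add(int(part))
    return sorted(episodes)


print(parse_episodes("1-7"))  # [1, 2, 3, 4, 5, 6, 7]
print(parse_episodes("2,5"))  # [2, 5]
```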
<p align="center">
<img width="200" src="https://github.com/Kathryn-Jie/Kathryn-Jie/blob/main/kathryn.png">
</p>

<h1> Hello Fellow < Developers/ >! </h1>
<p align='center'>
</p>

<div size='20px'> Hi! My name is WVDUMP. I am leaking these scripts to punish a few idiots :smile:
</div>

<h2> About Me </h2>

<img width="55%" align="right" alt="Github" src="https://raw.githubusercontent.com/onimur/.github/master/.resources/git-header.svg" />

- 👯 Sharing is caring

- ⚡ CDM L1: buy it from wvfuck@protonmail.com ⚡

<br>
<br>
<br>
1
configs/Cookies/cookies.txt
Normal file
@@ -0,0 +1 @@
{\rtf1}
0
configs/Tokens/test.txt
Normal file
0
configs/Tokens/txt.txt
Normal file
0
configs/__init__.py
Normal file
163
configs/config.py
Normal file
@@ -0,0 +1,163 @@
import sys, os, random, string, platform
from os.path import dirname
from os.path import join
from pywidevine.cdm import cdm, deviceconfig

dirPath = dirname(dirname(__file__)).replace("\\", "/")


class utils:
    def __init__(self):
        self.dir = dirPath

    def random_hex(self, length: int) -> str:
        """Return a random uppercase hex string of the given length."""
        return "".join(random.choice("0123456789ABCDEF") for _ in range(length))


utils_ = utils()

#####################################(DEVICES)#####################################

devices_dict = {
    "android_general": deviceconfig.device_android_general,
}

DEVICES = {
    "NETFLIX-MANIFEST": devices_dict["android_general"],
    "NETFLIX-LICENSE": devices_dict["android_general"],
}

#####################################(MUXER)#####################################

MUXER = {
    "muxer_file": f"{dirPath}/bin/muxer.json",
    "mkv_folder": None,
    "DEFAULT": False,  # use the plain naming scheme, e.g. Stranger Things S01E01 [1080p].mkv
    "AUDIO": "",  # default audio language
    "SUB": "",  # default subtitle language, e.g. "eng" or "spa"
    "GROUP": "WVDUMP",  # group name; can also be set with "--gr LOL" on the ripping command
    "noTitle": False,  # remove episode titles, e.g. (The Witcher S01E01) instead of (The Witcher S01E01 The End's Beginning)
    "scheme": "p2p",  # naming scheme; can also be set with "--muxscheme repack" on the ripping command
    "schemeslist": {
        "p2p": "{t}.{r}.{s}.WEB-DL.{ac}.{vc}-{gr}",
        "test": "{t}.{r}.{s}.WEB-DL-{gr}",
    },
    "EXTRAS": [],  # extra mkvmerge.exe commands
    "FPS24": [],
}

#####################################(PATHS)#####################################

PATHS = {
    "DL_FOLDER": f"{dirPath}",
    "DIR_PATH": f"{dirPath}",
    "BINARY_PATH": f"{dirPath}/bin",
    "COOKIES_PATH": f"{dirPath}/configs/Cookies",
    "KEYS_PATH": f"{dirPath}/configs/KEYS",
    "TOKENS_PATH": f"{dirPath}/configs/Tokens",
    "JSON_PATH": f"{dirPath}/json",
    "LOGA_PATH": f"{dirPath}/helpers/bin/tools/aria2c",
}

ARIA2C = {
    "enable_logging": False,  # True
}

SETTINGS = {
    "skip_video_demux": [],
}

#####################################(VPN)#####################################

VPN = {
    "proxies": None,  # "",
    "nordvpn": {
        "port": "80",
        "email": "xxx",
        "passwd": "xxx",
        "http": "http://{email}:{passwd}@{ip}:{port}",
    },
    "private": {
        "port": "8080",
        "email": "",
        "passwd": "",
        "http": "http://{email}:{passwd}@{ip}:{port}",
    },
}

#####################################(BIN)#####################################

BIN = {
    "mp4decrypt_moded": f"{dirPath}/helpers/bin/tools/mp4decrypt.exe",
    "mp4dump": f"{dirPath}/helpers/bin/tools/mp4dump.exe",
    "ffmpeg": f"{dirPath}/helpers/bin/tools/ffmpeg.exe",
    "ffprobe": f"{dirPath}/helpers/bin/tools/ffprobe.exe",
    "MediaInfo": f"{dirPath}/helpers/bin/tools/MediaInfo.exe",
    "mkvmerge": f"{dirPath}/helpers/bin/tools/mkvmerge.exe",
    "aria2c": f"{dirPath}/helpers/bin/tools/aria2c.exe",
}

#####################################(Config)#####################################

Config = {}

Config["NETFLIX"] = {
    "cookies_file": f"{dirPath}/configs/Cookies/cookies_nf.txt",
    "cookies_txt": f"{dirPath}/configs/Cookies/cookies.txt",
    "keys_file": f"{dirPath}/configs/KEYS/netflix.keys",
    "token_file": f"{dirPath}/configs/Tokens/netflix_token.json",
    "email": "xxx",
    "password": "xxx",
    "manifest_language": "en-US",
    "metada_language": "en",
    "manifestEsn": "NFCDIE-03-{}".format(utils().random_hex(30)),
    "androidEsn": "NFANDROID1-PRV-P-GOOGLEPIXEL=4=XL-12995-" + utils_.random_hex(64),
}

#####################################(DIRS & FILES)##############################

def make_dirs():
    FILES = []

    DIRS = [
        f"{dirPath}/configs/Cookies",
        f"{dirPath}/configs/Tokens",
        f"{dirPath}/helpers/bin/tools/aria2c",
    ]

    for dirs in DIRS:
        if not os.path.exists(dirs):
            os.makedirs(dirs)

    for files in FILES:
        if not os.path.isfile(files):
            with open(files, "w") as f:
                f.write("\n")

make_dirs()

#####################################(tool)#####################################

class tool:
    def config(self, service):
        return Config[service]

    def bin(self):
        return BIN

    def vpn(self):
        return VPN

    def paths(self):
        return PATHS

    def muxer(self):
        return MUXER

    def devices(self):
        return DEVICES

    def aria2c(self):
        return ARIA2C

    def video_settings(self):
        return SETTINGS
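The `tool` class at the end of `configs/config.py` is a thin accessor over the module-level settings dicts; the rest of the repo (e.g. `helpers/Muxer.py`) reads configuration through it instead of importing the dicts directly. A minimal standalone sketch of that pattern, with tiny stand-in dicts inlined here for illustration rather than imported from the real config:

```python
# Stand-in settings dicts (the real file defines MUXER, PATHS, BIN, Config, ...).
MUXER = {"GROUP": "WVDUMP", "scheme": "p2p"}
Config = {"NETFLIX": {"manifest_language": "en-US"}}


class tool:
    """Accessor over module-level settings, mirroring configs/config.py."""

    def config(self, service):
        # e.g. tool().config("NETFLIX")["manifest_language"]
        return Config[service]

    def muxer(self):
        return MUXER


print(tool().config("NETFLIX")["manifest_language"])  # en-US
print(tool().muxer()["GROUP"])  # WVDUMP
```

Keeping the dicts behind one class means callers depend on a single import (`from configs.config import tool`) rather than on each settings dict by name.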
0
configs/keys/key.txt
Normal file
0
configs/keys/netflix.keys
Normal file
632
helpers/Muxer.py
Normal file
@@ -0,0 +1,632 @@
import re, os, sys, subprocess, contextlib, json, glob
from configs.config import tool
from helpers.ripprocess import ripprocess
from pymediainfo import MediaInfo
import logging


class Muxer(object):
    def __init__(self, **kwargs):
        self.logger = logging.getLogger(__name__)
        self.CurrentName_Original = kwargs.get("CurrentName", None)
        self.CurrentName = kwargs.get("CurrentName", None)
        self.SeasonFolder = kwargs.get("SeasonFolder", None)
        self.CurrentHeigh = kwargs.get("CurrentHeigh", None)
        self.CurrentWidth = kwargs.get("CurrentWidth", None)
        self.source_tag = kwargs.get("Source", None)
        self.AudioProfile = self.get_audio_id()  # kwargs.get("AudioProfile", None)
        self.VideoProfile = self.get_video_id()  # kwargs.get("VideoProfile", None)
        self.mkvmerge = tool().bin()["mkvmerge"]
        self.merge = []
        self.muxer_settings = tool().muxer()

        ##############################################################################
        self.packer = kwargs.get("group", None)
        self.extra_output_folder = self.packer["EXTRA_FOLDER"]
        self.Group = (
            self.packer["GROUP"]
            if self.packer["GROUP"]
            else self.muxer_settings["GROUP"]
        )
        self.muxer_scheme = (
            self.packer["SCHEME"]
            if self.packer["SCHEME"]
            else self.muxer_settings["scheme"]
        )

        self.scheme = self.muxer_settings["schemeslist"][self.muxer_scheme]
        self.Extras = self.muxer_settings["EXTRAS"]
        self.fps24 = True if self.source_tag in self.muxer_settings["FPS24"] else False
        self.default_mux = True if self.muxer_settings["DEFAULT"] else False
        self.PrepareMuxer()

    def is_extra_folder(self):
        extra_folder = None
        if self.extra_output_folder:
            if not os.path.isabs(self.extra_output_folder):
                raise ValueError(
                    "Error: you should provide a full path dir: {}.".format(
                        self.extra_output_folder
                    )
                )
            if not os.path.exists(self.extra_output_folder):
                try:
                    os.makedirs(self.extra_output_folder)
                except Exception as e:
                    raise ValueError(
                        "Error when creating folder dir [{}]: {}.".format(
                            e, self.extra_output_folder
                        )
                    )
            extra_folder = self.extra_output_folder
            return extra_folder

        if self.muxer_settings["mkv_folder"]:
            if not os.path.isabs(self.muxer_settings["mkv_folder"]):
                raise ValueError(
                    "Error: you should provide a full path dir: {}.".format(
                        self.muxer_settings["mkv_folder"]
                    )
                )
            if not os.path.exists(self.muxer_settings["mkv_folder"]):
                try:
                    os.makedirs(self.muxer_settings["mkv_folder"])
                except Exception as e:
                    raise ValueError(
                        "Error when creating folder dir [{}]: {}.".format(
                            e, self.muxer_settings["mkv_folder"]
                        )
                    )
            extra_folder = self.muxer_settings["mkv_folder"]
            return extra_folder

        return extra_folder

    def PrepareMuxer(self):
        if self.muxer_settings["noTitle"]:
            self.CurrentName = self.noTitle()

        extra_folder = self.is_extra_folder()

        if extra_folder:
            self.SeasonFolder = extra_folder
        else:
            if not self.default_mux:
                if self.SeasonFolder:
                    self.SeasonFolder = self.setFolder()

        return

    def SortFilesBySize(self):
        file_list = []
        audio_tracks = (
            glob.glob(f"{self.CurrentName_Original}*.eac3")
            + glob.glob(f"{self.CurrentName_Original}*.ac3")
            + glob.glob(f"{self.CurrentName_Original}*.aac")
            + glob.glob(f"{self.CurrentName_Original}*.m4a")
            + glob.glob(f"{self.CurrentName_Original}*.dts")
        )

        if audio_tracks == []:
            raise FileNotFoundError("no audio files found")

        for file in audio_tracks:
            file_list.append({"file": file, "size": os.path.getsize(file)})

        file_list = sorted(file_list, key=lambda k: int(k["size"]))
        return file_list[-1]["file"]

    def GetVideoFile(self):
        videofiles = [
            "{} [{}p]_Demuxed.mp4",
            "{} [{}p] [UHD]_Demuxed.mp4",
            "{} [{}p] [VP9]_Demuxed.mp4",
            "{} [{}p] [HIGH]_Demuxed.mp4",
            "{} [{}p] [HEVC]_Demuxed.mp4",
            "{} [{}p] [HDR]_Demuxed.mp4",
            "{} [{}p] [HDR-DV]_Demuxed.mp4",
        ]

        for videofile in videofiles:
            filename = videofile.format(self.CurrentName_Original, self.CurrentHeigh)
            if os.path.isfile(filename):
                return filename

        return None

    def get_video_id(self):
        video_file = self.GetVideoFile()
        if not video_file:
            raise ValueError("No video file in dir...")

        media_info = MediaInfo.parse(video_file)
        track = [track for track in media_info.tracks if track.track_type == "Video"][0]

        if track.format == "AVC":
            if track.encoding_settings:
                return "x264"
            return "H.264"
        elif track.format == "HEVC":
            if track.commercial_name == "HDR10" and track.color_primaries:
                return "HDR.HEVC"
            if track.commercial_name == "HEVC" and track.color_primaries:
                return "HEVC"

            return "DV.HEVC"

        return None

    def get_audio_id(self):
        audio_id = None
        media_info = MediaInfo.parse(self.SortFilesBySize())
        track = [track for track in media_info.tracks if track.track_type == "Audio"][0]

        if track.format == "E-AC-3":
            audioCodec = "DDP"
        elif track.format == "AC-3":
            audioCodec = "DD"
        elif track.format == "AAC":
            audioCodec = "AAC"
        elif track.format == "DTS":
            audioCodec = "DTS"
        elif "DTS" in track.format:
            audioCodec = "DTS"
        else:
            audioCodec = "DDP"

        if track.channel_s == 8:
            channels = "7.1"
        elif track.channel_s == 6:
            channels = "5.1"
        elif track.channel_s == 2:
            channels = "2.0"
        elif track.channel_s == 1:
            channels = "1.0"
        else:
            channels = "5.1"

        audio_id = (
            f"{audioCodec}{channels}.Atmos"
            if "Atmos" in track.commercial_name
            else f"{audioCodec}{channels}"
        )

        return audio_id

    def Heigh(self):
        try:
            Width = int(self.CurrentWidth)
            Heigh = int(self.CurrentHeigh)
        except Exception:
            return self.CurrentHeigh

        res1080p = "1080p"
        res720p = "720p"
        sd = ""

        if Width >= 3840:
            return "2160p"

        if Width >= 2560:
            return "1440p"

        if Width > 1920:
            if Heigh > 1440:
                return "2160p"
            return "1440p"

        if Width == 1920:
            return res1080p
        elif Width == 1280:
            return res720p

        if Width >= 1400:
            return res1080p

        if Width < 1400 and Width >= 1100:
            return res720p

        if Heigh == 1080:
            return res1080p
        elif Heigh == 720:
            return res720p

        if Heigh >= 900:
            return res1080p

        if Heigh < 900 and Heigh >= 700:
            return res720p

        return sd

    def noTitle(self):
        regex = re.compile("(.*) [S]([0-9]+)[E]([0-9]+)")
        if regex.search(self.CurrentName):
            return regex.search(self.CurrentName).group(0)

        return self.CurrentName

    def Run(self, command):
        self.logger.debug("muxing command: {}".format(command))

        def unbuffered(proc, stream="stdout"):
            newlines = ["\n", "\r\n", "\r"]
            stream = getattr(proc, stream)
            with contextlib.closing(stream):
                while True:
                    out = []
                    last = stream.read(1)
                    # Don't loop forever
                    if last == "" and proc.poll() is not None:
                        break
                    while last not in newlines:
                        # Don't loop forever
                        if last == "" and proc.poll() is not None:
                            break
                        out.append(last)
                        last = stream.read(1)
                    out = "".join(out)
                    yield out

        proc = subprocess.Popen(
            command,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            bufsize=1,
            universal_newlines=True,
        )
        self.logger.info("\nStart Muxing...")
        for line in unbuffered(proc):
            if "Progress:" in line:
                sys.stdout.write("\r%s" % (line))
                sys.stdout.flush()
            elif "Multiplexing" in line:
                sys.stdout.write("\r%s" % (line.replace("Multiplexing", "Muxing")))
                sys.stdout.flush()
            elif "Error" in line:
                sys.stdout.write("\r%s" % (line))
                sys.stdout.flush()

        self.logger.info("")

    def setName(self):
        outputVideo = (
            self.scheme.replace(
                "{t}", ripprocess().CleanMyFileNamePlease(self.CurrentName)
            )
            .replace("{r}", self.Heigh())
            .replace("{s}", self.source_tag)
            .replace("{ac}", self.AudioProfile)
            .replace("{vc}", self.VideoProfile)
            .replace("{gr}", self.Group)
        )

        for i in range(10):
            outputVideo = re.sub(r"(\.\.)", ".", outputVideo)

        if self.SeasonFolder:
            outputVideo = os.path.join(os.path.abspath(self.SeasonFolder), outputVideo)
            outputVideo = outputVideo.replace("\\", "/")

        return f"{outputVideo}.mkv"

    def setFolder(self):
        folder = (
            self.scheme.replace(
                "{t}", ripprocess().CleanMyFileNamePlease(self.SeasonFolder)
            )
            .replace("{r}", self.Heigh())
            .replace("{s}", self.source_tag)
            .replace("{ac}", self.AudioProfile)
            .replace("{vc}", self.VideoProfile)
            .replace("{gr}", self.Group)
        )

        for i in range(10):
            folder = re.sub(r"(\.\.)", ".", folder)

        return folder

    def LanguageList(self):
        LanguageList = [
            ["English", "eng", "eng", "English"],
            ["Afrikaans", "af", "afr", "Afrikaans"],
            ["Arabic", "ara", "ara", "Arabic"],
            ["Arabic (Syria)", "araSy", "ara", "Arabic Syria"],
            ["Arabic (Egypt)", "araEG", "ara", "Arabic Egypt"],
            ["Arabic (Kuwait)", "araKW", "ara", "Arabic Kuwait"],
            ["Arabic (Lebanon)", "araLB", "ara", "Arabic Lebanon"],
            ["Arabic (Algeria)", "araDZ", "ara", "Arabic Algeria"],
            ["Arabic (Bahrain)", "araBH", "ara", "Arabic Bahrain"],
            ["Arabic (Iraq)", "araIQ", "ara", "Arabic Iraq"],
            ["Arabic (Jordan)", "araJO", "ara", "Arabic Jordan"],
            ["Arabic (Libya)", "araLY", "ara", "Arabic Libya"],
            ["Arabic (Morocco)", "araMA", "ara", "Arabic Morocco"],
            ["Arabic (Oman)", "araOM", "ara", "Arabic Oman"],
            ["Arabic (Saudi Arabia)", "araSA", "ara", "Arabic Saudi Arabia"],
            ["Arabic (Tunisia)", "araTN", "ara", "Arabic Tunisia"],
            [
                "Arabic (United Arab Emirates)",
                "araAE",
                "ara",
                "Arabic United Arab Emirates",
            ],
            ["Arabic (Yemen)", "araYE", "ara", "Arabic Yemen"],
            ["Armenian", "hye", "arm", "Armenian"],
            ["Assamese", "asm", "asm", "Assamese"],
            ["Bengali", "ben", "ben", "Bengali"],
            ["Basque", "eus", "baq", "Basque"],
            ["British English", "enGB", "eng", "British English"],
            ["Bulgarian", "bul", "bul", "Bulgarian"],
            ["Cantonese", "None", "chi", "Cantonese"],
            ["Catalan", "cat", "cat", "Catalan"],
            ["Croatian", "hrv", "hrv", "Croatian"],
            ["Czech", "ces", "cze", "Czech"],
            ["Danish", "dan", "dan", "Danish"],
            ["Dutch", "nld", "dut", "Dutch"],
            ["Estonian", "est", "est", "Estonian"],
            ["Filipino", "fil", "fil", "Filipino"],
            ["Finnish", "fin", "fin", "Finnish"],
            ["Flemish", "nlBE", "dut", "Flemish"],
            ["French", "fra", "fre", "French"],
            ["French Canadian", "caFra", "fre", "French Canadian"],
            ["Canadian French", "caFra", "fre", "Canadian French"],
            ["German", "deu", "ger", "German"],
            ["Greek", "ell", "gre", "Greek"],
            ["Gujarati", "guj", "guj", "Gujarati"],
            ["Hebrew", "heb", "heb", "Hebrew"],
            ["Hungarian", "hun", "hun", "Hungarian"],
            ["Icelandic", "isl", "ice", "Icelandic"],
            ["Indonesian", "ind", "ind", "Indonesian"],
            ["Italian", "ita", "ita", "Italian"],
            ["Japanese", "jpn", "jpn", "Japanese"],
            ["Kannada (India)", "kan", "kan", "Kannada (India)"],
            ["Khmer", "khm", "khm", "Khmer"],
            ["Klingon", "tlh", "tlh", "Klingon"],
            ["Korean", "kor", "kor", "Korean"],
            ["Lithuanian", "lit", "lit", "Lithuanian"],
            ["Latvian", "lav", "lav", "Latvian"],
            ["Malay", "msa", "may", "Malay"],
            ["Malayalam", "mal", "mal", "Malayalam"],
            ["Mandarin", "None", "chi", "Mandarin"],
            ["Mandarin (Putonghua)", "zho", "chi", "Mandarin (Putonghua)"],
            ["Mandarin Chinese (Simplified)", "zh-Hans", "chi", "Simplified"],
            ["Mandarin Chinese (Traditional)", "zh-Hant", "chi", "Traditional"],
            ["Yue Chinese", "yue", "chi", "(Yue Chinese)"],
            ["Manipuri", "mni", "mni", "Manipuri"],
            ["Marathi", "mar", "mar", "Marathi"],
            ["No Dialogue", "zxx", "zxx", "No Dialogue"],
            ["Norwegian", "nor", "nor", "Norwegian"],
            ["Norwegian Bokmal", "nob", "nob", "Norwegian Bokmal"],
            ["Persian", "fas", "per", "Persian"],
            ["Polish", "pol", "pol", "Polish"],
            ["Portuguese", "por", "por", "Portuguese"],
            ["Brazilian Portuguese", "brPor", "por", "Brazilian Portuguese"],
            ["Punjabi", "pan", "pan", "Punjabi"],
            ["Panjabi", "pan", "pan", "Panjabi"],
            ["Romanian", "ron", "rum", "Romanian"],
            ["Russian", "rus", "rus", "Russian"],
            ["Serbian", "srp", "srp", "Serbian"],
            ["Sinhala", "sin", "sin", "Sinhala"],
            ["Slovak", "slk", "slo", "Slovak"],
            ["Slovenian", "slv", "slv", "Slovenian"],
            ["Spanish", "spa", "spa", "Spanish"],
            ["European Spanish", "euSpa", "spa", "European Spanish"],
            ["Swedish", "swe", "swe", "Swedish"],
            ["Hindi", "hin", "hin", "Hindi"],
            ["Tamil", "tam", "tam", "Tamil"],
            ["Telugu", "tel", "tel", "Telugu"],
            ["Thai", "tha", "tha", "Thai"],
            ["Tagalog", "tgl", "tgl", "Tagalog"],
            ["Turkish", "tur", "tur", "Turkish"],
            ["Ukrainian", "ukr", "ukr", "Ukrainian"],
            ["Urdu", "urd", "urd", "Urdu"],
            ["Vietnamese", "vie", "vie", "Vietnamese"],
            ["Simplified Chinese", "zhoS", "chi", "Chinese Simplified"],
            ["Traditional Chinese", "zhoT", "chi", "Chinese Traditional"],
        ]

        return LanguageList

    def ExtraLanguageList(self):
        ExtraLanguageList = [
            ["Polish - Dubbing", "pol", "pol", "Polish - Dubbing"],
            ["Polish - Lektor", "pol", "pol", "Polish - Lektor"],
        ]

        return ExtraLanguageList

    def AddChapters(self):
        if os.path.isfile(self.CurrentName_Original + " Chapters.txt"):
            self.merge += [
                "--chapter-charset",
                "UTF-8",
                "--chapters",
                self.CurrentName_Original + " Chapters.txt",
            ]

        return

    def AddVideo(self):
        inputVideo = None

        videofiles = [
            "{} [{}p]_Demuxed.mp4",
            "{} [{}p] [UHD]_Demuxed.mp4",
            "{} [{}p] [VP9]_Demuxed.mp4",
            "{} [{}p] [HIGH]_Demuxed.mp4",
            "{} [{}p] [HEVC]_Demuxed.mp4",
            "{} [{}p] [HDR]_Demuxed.mp4",
            "{} [{}p] [HDR-DV]_Demuxed.mp4",
        ]

        for videofile in videofiles:
            filename = videofile.format(self.CurrentName_Original, self.CurrentHeigh)
            if os.path.isfile(filename):
                inputVideo = filename
                break

        if not inputVideo:
            self.logger.info("cannot find video file.")
            exit(-1)

        if self.default_mux:
            outputVideo = (
                re.compile("|".join([".h264", ".h265", ".vp9", ".mp4"])).sub(
                    "", inputVideo
                )
                + ".mkv"
            )
            if self.SeasonFolder:
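    # Illustration of the scheme substitution performed by setName() and
    # setFolder() above. All values below are hypothetical, not taken from
    # a real rip:
    #
    #   scheme = "{t}.{r}.{s}.WEB-DL.{ac}.{vc}-{gr}"
    #     {t}  -> "Stranger.Things.S01E01"   (cleaned title)
    #     {r}  -> "1080p"                    (from Heigh())
    #     {s}  -> "NF"                       (source tag)
    #     {ac} -> "DDP5.1"                   (from get_audio_id())
    #     {vc} -> "x264"                     (from get_video_id())
    #     {gr} -> "WVDUMP"                   (group)
    #
    #   result: "Stranger.Things.S01E01.1080p.NF.WEB-DL.DDP5.1.x264-WVDUMP.mkv"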
|
||||
outputVideo = os.path.join(
|
||||
os.path.abspath(self.SeasonFolder), outputVideo
|
||||
)
|
||||
outputVideo = outputVideo.replace("\\", "/")
|
||||
else:
|
||||
outputVideo = self.setName()
|
||||
|
||||
self.outputVideo = outputVideo
|
||||
|
||||
if self.fps24:
|
||||
self.merge += [
|
||||
self.mkvmerge,
|
||||
"--output",
|
||||
outputVideo,
|
||||
"--default-duration",
|
||||
"0:24000/1001p",
|
||||
"--language",
|
||||
"0:und",
|
||||
"--default-track",
|
||||
"0:yes",
|
||||
"(",
|
||||
inputVideo,
|
||||
")",
|
||||
]
|
||||
else:
|
||||
self.merge += [
|
||||
self.mkvmerge,
|
||||
"--output",
|
||||
outputVideo,
|
||||
"--title",
|
||||
'',
|
||||
"(",
|
||||
inputVideo,
|
||||
")",
|
||||
]
|
||||
|
||||
return
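For a 23.976 fps source, the fps24 branch above produces an mkvmerge argument list of roughly the following shape. This is a stand-alone sketch, not the class itself; the function name and paths are illustrative:

```python
def build_video_args(mkvmerge, output_path, input_path, fps24=True):
    """Sketch of the mkvmerge argv that AddVideo assembles (names illustrative)."""
    args = [mkvmerge, "--output", output_path]
    if fps24:
        # force 24000/1001 frame duration, unset language, mark as default track
        args += ["--default-duration", "0:24000/1001p",
                 "--language", "0:und",
                 "--default-track", "0:yes"]
    else:
        args += ["--title", ""]
    # mkvmerge groups per-file options with "( file )" parentheses
    return args + ["(", input_path, ")"]
```

The resulting list can be passed straight to a subprocess runner such as the class's `self.Run`.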

    def AddAudio(self):

        audiofiles = [
            "{} {}.ac3",
            "{} {} - Audio Description.ac3",
            "{} {}.eac3",
            "{} {} - Audio Description.eac3",
            "{} {}.aac",
            "{} {} - Audio Description.aac",
            "{} {}.m4a",
            "{} {} - Audio Description.m4a",
        ]

        for (audio_language, subs_language, language_id, language_name) in (
            self.LanguageList() + self.ExtraLanguageList()
        ):
            for audiofile in audiofiles:
                filename = audiofile.format(self.CurrentName_Original, audio_language)
                if os.path.isfile(filename):
                    self.merge += [
                        "--language",
                        f"0:{language_id}",
                        "--track-name",
                        "0:Audio Description"
                        if "Audio Description" in filename
                        else f"0:{language_name}",
                        "--default-track",
                        "0:yes"
                        if subs_language == self.muxer_settings["AUDIO"]
                        else "0:no",
                        "(",
                        filename,
                        ")",
                    ]

        return
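AddAudio and AddVideo both walk a list of filename templates and keep the first candidate that exists on disk. That probe order can be factored into a small pure helper; this is a sketch with the existence check injected as a parameter (so it is testable without touching the filesystem), not code from the repo:

```python
def first_existing(templates, name, tag, exists):
    """Return the first formatted template for which `exists` is true, else None."""
    for template in templates:
        candidate = template.format(name, tag)
        if exists(candidate):
            return candidate
    return None

# In the muxer the predicate would simply be os.path.isfile, e.g.:
# first_existing(["{} {}.ac3", "{} {}.eac3"], "Show S01E01", "English", os.path.isfile)
```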

    def AddSubtitles(self):

        srts = [
            "{} {}.srt",
        ]
        forceds = [
            "{} forced-{}.srt",
        ]
        sdhs = [
            "{} sdh-{}.srt",
        ]

        for (
            audio_language,
            subs_language,
            language_id,
            language_name,
        ) in self.LanguageList():
            for subtitle in srts:
                filename = subtitle.format(self.CurrentName_Original, subs_language)
                if os.path.isfile(filename):
                    self.merge += [
                        "--language",
                        f"0:{language_id}",
                        "--track-name",
                        f"0:{language_name}",
                        "--forced-track",
                        "0:no",
                        "--default-track",
                        "0:yes"
                        if subs_language == self.muxer_settings["SUB"]
                        else "0:no",
                        "--compression",
                        "0:none",
                        "(",
                        filename,
                        ")",
                    ]

            for subtitle in forceds:
                filename = subtitle.format(self.CurrentName_Original, subs_language)
                if os.path.isfile(filename):
                    self.merge += [
                        "--language",
                        f"0:{language_id}",
                        "--track-name",
                        "0:Forced",
                        "--forced-track",
                        "0:yes",
                        "--default-track",
                        "0:no",
                        "--compression",
                        "0:none",
                        "(",
                        filename,
                        ")",
                    ]

            for subtitle in sdhs:
                filename = subtitle.format(self.CurrentName_Original, subs_language)
                if os.path.isfile(filename):
                    self.merge += [
                        "--language",
                        f"0:{language_id}",
                        "--track-name",
                        "0:SDH",
                        "--forced-track",
                        "0:no",
                        "--default-track",
                        "0:no",
                        "--compression",
                        "0:none",
                        "(",
                        filename,
                        ")",
                    ]

        return

    def startMux(self):
        self.AddVideo()
        self.AddAudio()
        self.AddSubtitles()
        self.AddChapters()
        if not os.path.isfile(self.outputVideo):
            self.Run(self.merge + self.Extras)

        return self.outputVideo
helpers/Parsers/Netflix/MSLClient.py (new file, 551 lines)
@@ -0,0 +1,551 @@
import base64
import json
import logging
import os
import random
import re
import time
import traceback
from datetime import datetime

import requests
from Cryptodome.Cipher import AES, PKCS1_OAEP
from Cryptodome.Hash import HMAC, SHA256
from Cryptodome.PublicKey import RSA
from Cryptodome.Util import Padding
from pywidevine.cdm import cdm, deviceconfig

from configs.config import tool


class MSLClient:
    def __init__(self, profiles=None, wv_keyexchange=True, proxies=None):
        self.session = requests.session()
        self.logger = logging.getLogger(__name__)
        if proxies:
            self.session.proxies.update(proxies)

        self.nf_endpoints = {
            "manifest": "https://www.netflix.com/nq/msl_v1/cadmium/pbo_manifests/^1.0.0/router",
            "license": "https://www.netflix.com/nq/msl_v1/cadmium/pbo_licenses/^1.0.0/router",
        }

        self.config = tool().config("NETFLIX")
        self.email = self.config["email"]
        self.password = self.config["password"]
        self.device = tool().devices()["NETFLIX-MANIFEST"]
        self.save_rsa_location = self.config["token_file"]
        self.languages = self.config["manifest_language"]
        self.license_path = None

        if os.path.isfile(self.save_rsa_location):
            with open(self.save_rsa_location, "r") as f:
                self.generatePrivateKey = RSA.importKey(json.loads(f.read())["RSA_KEY"])
        else:
            self.generatePrivateKey = RSA.generate(2048)

        if wv_keyexchange:
            self.wv_keyexchange = True
            self.cdm = cdm.Cdm()
        else:
            self.wv_keyexchange = False
            self.cdm = None
        self.cdm_session = None

        self.manifest_challenge = ""  # set desired wv data to override the key-exchange data

        self.profiles = profiles
        self.logger.debug("Using profiles: {}".format(self.profiles))

        esn = self.config["androidEsn"]
        if esn is None:
            self.logger.error(
                "android esn not found, set esn with cdm systemID in config.py"
            )
            exit(-1)
        self.esn = esn

        self.logger.debug("Using esn: " + self.esn)

        self.messageid = random.randint(0, 2 ** 52)
        self.session_keys = {}
        self.header = {
            "sender": self.esn,
            "handshake": True,
            "nonreplayable": 2,
            "capabilities": {"languages": [], "compressionalgos": []},
            "recipient": "Netflix",
            "renewable": True,
            "messageid": self.messageid,
            "timestamp": time.time(),
        }

        self.setRSA()

    def get_header_extra(self):
        if self.wv_keyexchange:
            self.cdm_session = self.cdm.open_session(
                None,
                deviceconfig.DeviceConfig(self.device),
                b"\x0A\x7A\x00\x6C\x38\x2B",
                True,
            )
            wv_request = base64.b64encode(
                self.cdm.get_license_request(self.cdm_session)
            ).decode("utf-8")

            self.header["keyrequestdata"] = [
                {
                    "scheme": "WIDEVINE",
                    "keydata": {"keyrequest": wv_request},
                }
            ]
        else:
            self.header["keyrequestdata"] = [
                {
                    "scheme": "ASYMMETRIC_WRAPPED",
                    "keydata": {
                        "publickey": base64.b64encode(
                            self.generatePrivateKey.publickey().exportKey("DER")
                        ).decode("utf8"),
                        "mechanism": "JWK_RSA",
                        "keypairid": "rsaKeypairId",
                    },
                }
            ]

        return self.header

    def setRSA(self):
        if os.path.isfile(self.save_rsa_location):
            master_token = self.load_tokens()
            expires = master_token["expiration"]
            valid_until = datetime.utcfromtimestamp(int(expires))
            # compare in UTC to match utcfromtimestamp above
            present_time = datetime.utcnow()

            difference = valid_until - present_time
            difference = difference.total_seconds() / 60 / 60
            if difference < 10:
                self.logger.debug("rsa file found, expires soon")
                self.session_keys["session_keys"] = self.generate_handshake()
            else:
                self.logger.debug("rsa file found")
                self.session_keys["session_keys"] = {
                    "mastertoken": master_token["mastertoken"],
                    "sequence_number": master_token["sequence_number"],
                    "encryption_key": master_token["encryption_key"],
                    "sign_key": master_token["sign_key"],
                }
        else:
            self.logger.debug("rsa file not found")
            self.session_keys["session_keys"] = self.generate_handshake()
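The freshness check in setRSA reduces to "hours between two epoch timestamps, renew when fewer than ten remain". A minimal stand-alone sketch of that arithmetic (function name is illustrative, not from the repo):

```python
def hours_until(expiration_ts, now_ts):
    """Hours from `now_ts` until `expiration_ts`, both in seconds since the epoch."""
    return (expiration_ts - now_ts) / 3600

# A token expiring in two hours falls under a 10-hour renewal threshold:
needs_renewal = hours_until(7200, 0) < 10
```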

    def load_playlist(self, viewable_id):
        payload = {
            "version": 2,
            "url": "/manifest",  # "/licensedManifest"
            "id": int(time.time()),
            "languages": self.languages,
            "params": {
                # "challenge": self.manifest_challenge,
                "type": "standard",
                "viewableId": viewable_id,
                "profiles": self.profiles,
                "flavor": "STANDARD",  # 'PRE_FETCH'
                "drmType": "widevine",
                "usePsshBox": True,
                "useHttpsStreams": True,
                "supportsPreReleasePin": True,
                "supportsWatermark": True,
                "supportsUnequalizedDownloadables": True,
                "requestEligibleABTests": True,
                "isBranching": False,
                "isNonMember": False,
                "isUIAutoPlay": False,
                "imageSubtitleHeight": 1080,
                "uiVersion": "shakti-v4bf615c3",
                "uiPlatform": "SHAKTI",
                "clientVersion": "6.0026.291.011",
                "desiredVmaf": "plus_lts",  # phone_plus_exp
                "showAllSubDubTracks": True,
                # "preferredTextLocale": "ar",
                # "preferredAudioLocale": "ar",
                # "maxSupportedLanguages": 2,
                "preferAssistiveAudio": False,
                "deviceSecurityLevel": "3000",
                "licenseType": "standard",
                "titleSpecificData": {
                    str(viewable_id): {"unletterboxed": True}
                },
                "videoOutputInfo": [
                    {
                        "type": "DigitalVideoOutputDescriptor",
                        "outputType": "unknown",
                        "supportedHdcpVersions": ["2.2"],
                        "isHdcpEngaged": True,
                    }
                ],
            },
        }

        request_data = self.msl_request(payload)
        response = self.session.post(self.nf_endpoints["manifest"], data=request_data)
        manifest = self.decrypt_response(response.text)

        if manifest.get("result"):
            self.license_path = manifest["result"]["links"]["license"]["href"]
            return manifest

        if manifest.get("errormsg"):
            self.logger.info(manifest["errormsg"])
        else:
            self.logger.info(manifest)
        return None

    def decrypt_response(self, payload):
        try:
            p = json.loads(payload)
            if p.get("errordata"):
                return json.loads(base64.b64decode(p["errordata"]).decode())
        except ValueError:
            pass

        # Not a single JSON object: a MSL header followed by signed payload
        # chunks. Split on the chunk signatures and decrypt each chunk.
        payloads = re.split(r',"signature":"[0-9A-Za-z/+=]+"}', payload.split("}}")[1])
        payloads = [x + "}" for x in payloads]
        new_payload = payloads[:-1]

        chunks = []
        for chunk in new_payload:
            try:
                encryption_envelope = json.loads(chunk)["payload"]
                envelope = json.loads(
                    base64.b64decode(encryption_envelope).decode("utf8")
                )
                cipher = AES.new(
                    self.session_keys["session_keys"]["encryption_key"],
                    AES.MODE_CBC,
                    base64.b64decode(envelope["iv"]),
                )
                plaintext = cipher.decrypt(base64.b64decode(envelope["ciphertext"]))
                plaintext = json.loads(Padding.unpad(plaintext, 16).decode("utf8"))
                chunks.append(base64.b64decode(plaintext["data"]).decode("utf8"))
            except (ValueError, KeyError):
                continue

        decrypted_payload = "".join(chunks)
        try:
            return json.loads(decrypted_payload)
        except ValueError:
            traceback.print_exc()
            self.logger.info("Unable to decrypt payloads...exiting")
            exit(-1)
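The regex split used above is brittle if a signature ever contains characters outside the matched class. As a point of comparison (not what this client does), concatenated JSON objects can also be walked with the stdlib decoder, which tracks where each object ends:

```python
import json

def split_concatenated_json(blob):
    """Split back-to-back JSON objects ('{..}{..}') into a list of parsed values."""
    decoder = json.JSONDecoder()
    values, idx = [], 0
    while idx < len(blob):
        # raw_decode returns the parsed value and the index just past it
        value, idx = decoder.raw_decode(blob, idx)
        values.append(value)
    return values
```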

    def generate_handshake(self):
        self.logger.debug("generate_handshake")
        header = self.get_header_extra()

        request = {
            "entityauthdata": {
                "scheme": "NONE",
                "authdata": {"identity": self.esn},
            },
            "signature": "",
            "headerdata": base64.b64encode(json.dumps(header).encode("utf8")).decode("utf8"),
        }
        response = self.session.post(
            url=self.nf_endpoints["manifest"],
            json=request,
        )
        try:
            if response.json().get("errordata"):
                self.logger.info("ERROR")
                self.logger.info(
                    base64.b64decode(response.json()["errordata"]).decode()
                )
                exit(-1)
            return self.parse_handshake(response=response.json())
        except Exception:
            traceback.print_exc()
            self.logger.info(response.text)
            exit(-1)

    def load_tokens(self):
        with open(self.save_rsa_location, "r", encoding="utf-8") as f:
            tokens_data = json.loads(f.read())

        data = {
            "mastertoken": tokens_data["mastertoken"],
            "sequence_number": tokens_data["sequence_number"],
            "encryption_key": base64.standard_b64decode(tokens_data["encryption_key"]),
            "sign_key": base64.standard_b64decode(tokens_data["sign_key"]),
            "RSA_KEY": tokens_data["RSA_KEY"],
            "expiration": tokens_data["expiration"],
        }

        return data

    def save_tokens(self, tokens_data):
        data = {
            "mastertoken": tokens_data["mastertoken"],
            "sequence_number": tokens_data["sequence_number"],
            "encryption_key": base64.standard_b64encode(
                tokens_data["encryption_key"]
            ).decode("utf-8"),
            "sign_key": base64.standard_b64encode(tokens_data["sign_key"]).decode("utf-8"),
            "RSA_KEY": tokens_data["RSA_KEY"],
            "expiration": tokens_data["expiration"],
        }

        with open(self.save_rsa_location, "w", encoding="utf-8") as f:
            f.write(json.dumps(data, indent=2))

    def parse_handshake(self, response):
        headerdata = json.loads(base64.b64decode(response["headerdata"]).decode("utf8"))

        keyresponsedata = headerdata["keyresponsedata"]
        mastertoken = keyresponsedata["mastertoken"]
        tokendata = json.loads(base64.b64decode(mastertoken["tokendata"]).decode("utf8"))
        sequence_number = tokendata["sequencenumber"]

        if self.wv_keyexchange:
            expected_scheme = "WIDEVINE"
        else:
            expected_scheme = "ASYMMETRIC_WRAPPED"

        scheme = keyresponsedata["scheme"]

        if scheme != expected_scheme:
            self.logger.info("Key Exchange failed: unexpected scheme {}".format(scheme))
            return False

        keydata = keyresponsedata["keydata"]

        if self.wv_keyexchange:
            encryption_key, sign_key = self.__process_wv_keydata(keydata)
        else:
            encryption_key, sign_key = self.__parse_rsa_wrapped_crypto_keys(keydata)

        tokens_data = {
            "mastertoken": mastertoken,
            "sequence_number": sequence_number,
            "encryption_key": encryption_key,
            "sign_key": sign_key,
        }

        tokens_data_save = dict(tokens_data)
        tokens_data_save["RSA_KEY"] = self.generatePrivateKey.exportKey().decode()
        tokens_data_save["expiration"] = tokendata["expiration"]
        self.save_tokens(tokens_data_save)
        return tokens_data
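The mastertoken fields that parse_handshake pulls out live in a base64-encoded JSON blob under `tokendata`. A self-contained sketch of that decode, using a fabricated token just to show the shape (the helper name is illustrative):

```python
import base64
import json

def token_fields(mastertoken):
    """Decode the base64 JSON `tokendata` blob inside a MSL mastertoken."""
    tokendata = json.loads(base64.b64decode(mastertoken["tokendata"]).decode("utf8"))
    return tokendata["sequencenumber"], tokendata["expiration"]

# Fabricated mastertoken, only to demonstrate the nesting:
fake = {
    "tokendata": base64.b64encode(
        json.dumps({"sequencenumber": 1, "expiration": 1700000000}).encode()
    ).decode()
}
```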

    def __process_wv_keydata(self, keydata):
        wv_response_b64 = keydata["cdmkeyresponse"]  # passed along as base64
        encryptionkeyid = base64.standard_b64decode(keydata["encryptionkeyid"])
        hmackeyid = base64.standard_b64decode(keydata["hmackeyid"])
        self.cdm.provide_license(self.cdm_session, wv_response_b64)
        keys = self.cdm.get_keys(self.cdm_session)
        self.logger.debug("wv key exchange: obtained wv key exchange keys %s" % keys)
        return (
            self.__find_wv_key(encryptionkeyid, keys, ["AllowEncrypt", "AllowDecrypt"]),
            self.__find_wv_key(hmackeyid, keys, ["AllowSign", "AllowSignatureVerify"]),
        )

    def __find_wv_key(self, kid, keys, permissions):
        for key in keys:
            if key.kid != kid:
                continue
            if key.type != "OPERATOR_SESSION":
                self.logger.debug(
                    "wv key exchange: wrong key type (not operator session) key %s" % key
                )
                continue

            if not set(permissions) <= set(key.permissions):
                self.logger.debug(
                    "wv key exchange: incorrect permissions, key %s, needed perms %s"
                    % (key, permissions)
                )
                continue
            return key.key

        return None

    def __parse_rsa_wrapped_crypto_keys(self, keydata):
        encrypted_encryption_key = base64.b64decode(keydata["encryptionkey"])
        encrypted_sign_key = base64.b64decode(keydata["hmackey"])

        oaep_cipher = PKCS1_OAEP.new(self.generatePrivateKey)
        encryption_key_data = json.loads(
            oaep_cipher.decrypt(encrypted_encryption_key).decode("utf8")
        )
        encryption_key = self.base64_check(encryption_key_data["k"])

        sign_key_data = json.loads(
            oaep_cipher.decrypt(encrypted_sign_key).decode("utf8")
        )
        sign_key = self.base64_check(sign_key_data["k"])
        return encryption_key, sign_key

    def base64key_decode(self, payload):
        remainder = len(payload) % 4
        if remainder == 2:
            payload += "=="
        elif remainder == 3:
            payload += "="
        elif remainder != 0:
            raise ValueError("Invalid base64 string")
        return base64.urlsafe_b64decode(payload.encode("utf-8"))

    def base64_check(self, string):
        while len(string) % 4 != 0:
            string = string + "="
        return base64.urlsafe_b64decode(string.encode())
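base64key_decode and base64_check exist because JWK "k" values arrive without their "=" padding; both amount to padding the string back to a multiple of four before decoding. That rule fits in one expression (stdlib sketch, illustrative name):

```python
import base64

def pad_b64(s):
    """Restore stripped '=' padding so urlsafe base64 decodes cleanly."""
    return s + "=" * (-len(s) % 4)
```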

    def msl_request(self, data, is_handshake=False):
        header = self.header.copy()
        header["handshake"] = is_handshake
        header["userauthdata"] = {
            "scheme": "EMAIL_PASSWORD",
            "authdata": {"email": self.email, "password": self.password},
        }

        header_envelope = self.msl_encrypt(self.session_keys, json.dumps(header))
        header_signature = HMAC.new(
            self.session_keys["session_keys"]["sign_key"], header_envelope, SHA256
        ).digest()

        encrypted_header = {
            "headerdata": base64.b64encode(header_envelope).decode("utf8"),
            "signature": base64.b64encode(header_signature).decode("utf8"),
            "mastertoken": self.session_keys["session_keys"]["mastertoken"],
        }

        payload = {
            "messageid": self.messageid,
            "data": base64.b64encode(json.dumps(data).encode()).decode("utf8"),
            "sequencenumber": 1,
            "endofmsg": True,
        }

        payload_envelope = self.msl_encrypt(self.session_keys, json.dumps(payload))
        payload_signature = HMAC.new(
            self.session_keys["session_keys"]["sign_key"], payload_envelope, SHA256
        ).digest()

        payload_chunk = {
            "payload": base64.b64encode(payload_envelope).decode("utf8"),
            "signature": base64.b64encode(payload_signature).decode("utf8"),
        }
        return json.dumps(encrypted_header) + json.dumps(payload_chunk)
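Both the header and the payload chunk above are signed the same way: HMAC-SHA256 over the serialized envelope bytes, then base64. The same signature can be produced with the standard library alone, which makes the scheme easy to verify in isolation (a sketch equivalent to the Cryptodome calls, not a drop-in from the repo):

```python
import base64
import hashlib
import hmac

def sign_envelope(sign_key, envelope):
    """HMAC-SHA256 over the envelope bytes, base64-encoded as MSL expects."""
    digest = hmac.new(sign_key, envelope, hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf8")
```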

    def msl_encrypt(self, msl_session, plaintext):
        cbc_iv = os.urandom(16)
        encryption_envelope = {
            "keyid": "%s_%s" % (self.esn, msl_session["session_keys"]["sequence_number"]),
            "sha256": "AA==",
            "iv": base64.b64encode(cbc_iv).decode("utf8"),
        }

        padded = Padding.pad(plaintext.encode("utf8"), 16)
        cipher = AES.new(
            msl_session["session_keys"]["encryption_key"], AES.MODE_CBC, cbc_iv
        )
        encryption_envelope["ciphertext"] = base64.b64encode(
            cipher.encrypt(padded)
        ).decode("utf8")

        return json.dumps(encryption_envelope).encode("utf8")

    def get_license(self, challenge, session_id):
        if not isinstance(challenge, bytes):
            raise TypeError("challenge must be of type bytes")
        if not isinstance(session_id, str):
            raise TypeError("session_id must be of type string")

        timestamp = int(time.time() * 10000)

        license_request_data = {
            "version": 2,
            "url": self.license_path,
            "id": timestamp,
            "languages": "en_US",
            "echo": "drmsessionId",
            "params": [
                {
                    "drmSessionId": session_id,
                    "clientTime": int(timestamp / 10000),
                    "challengeBase64": base64.b64encode(challenge).decode("utf8"),
                    "xid": str(timestamp + 1610),
                }
            ],
        }

        request_data = self.msl_request(license_request_data)
        resp = self.session.post(url=self.nf_endpoints["license"], data=request_data)

        try:
            resp.json()
        except ValueError:
            msl_license_data = self.decrypt_response(resp.text)
            if msl_license_data.get("result"):
                return msl_license_data
            if msl_license_data.get("errormsg"):
                raise ValueError(msl_license_data["errormsg"])
            raise ValueError(msl_license_data)
helpers/Parsers/Netflix/get_keys.py (new file, 162 lines)
@@ -0,0 +1,162 @@
import base64
import logging
import time

from helpers.Parsers.Netflix.MSLClient import MSLClient
from configs.config import tool
from pywidevine.decrypt.wvdecryptcustom import WvDecrypt

logger = logging.getLogger(__name__)

# AV1 profiles, currently unused:
#   "av1-main-L20-dash-cbcs-prk", "av1-main-L21-dash-cbcs-prk",
#   "av1-main-L30-dash-cbcs-prk", "av1-main-L31-dash-cbcs-prk",
#   "av1-main-L40-dash-cbcs-prk", "av1-main-L41-dash-cbcs-prk",
#   "av1-main-L50-dash-cbcs-prk", "av1-main-L51-dash-cbcs-prk"

# VP9 profiles, currently unused:
#   "vp9-profile0-L21-dash-cenc", "vp9-profile0-L30-dash-cenc",
#   "vp9-profile0-L31-dash-cenc", "vp9-profile0-L40-dash-cenc",
#   "vp9-profile2-L30-dash-cenc-prk", "vp9-profile2-L31-dash-cenc-prk",
#   "vp9-profile2-L40-dash-cenc-prk", "vp9-profile2-L50-dash-cenc-prk",
#   "vp9-profile2-L51-dash-cenc-prk"


def from_kid(kid):
    """Build a base64 Widevine PSSH box that carries only the given KID."""
    array_of_bytes = bytearray(b"\x00\x00\x002pssh\x00\x00\x00\x00")
    array_of_bytes.extend(bytes.fromhex("edef8ba979d64acea3c827dcd51d21ed"))
    array_of_bytes.extend(b"\x00\x00\x00\x12\x12\x10")
    array_of_bytes.extend(bytes.fromhex(kid.replace("-", "")))
    pssh = base64.b64encode(bytes(array_of_bytes))
    return pssh.decode()
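For reference, the box from_kid builds is a fixed 50-byte Widevine PSSH v0 container; the leading length field (0x32) already accounts for the 16-byte KID appended at the end. An annotated stand-alone sketch of the same layout (illustrative function name):

```python
import base64

def pssh_box_for_kid(kid_hex):
    """Widevine PSSH v0 box carrying a single 16-byte KID."""
    box = bytearray(b"\x00\x00\x002pssh\x00\x00\x00\x00")      # size (0x32 = 50) + 'pssh' + version/flags
    box += bytes.fromhex("edef8ba979d64acea3c827dcd51d21ed")   # Widevine system ID
    box += b"\x00\x00\x00\x12\x12\x10"                         # data size (0x12) + protobuf field 2, 16 bytes
    box += bytes.fromhex(kid_hex.replace("-", ""))             # the KID itself
    return base64.b64encode(bytes(box)).decode()
```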


def __profiles(profile, addHEVCDO=False):
    profiles = [
        "heaac-2-dash",
        "dfxp-ls-sdh",
        "webvtt-lssdh-ios8",
        "BIF240",
        "BIF320",
    ]

    if profile == "High KEYS":
        profiles += [
            "playready-h264hpl22-dash",
            "playready-h264hpl30-dash",
            "playready-h264hpl31-dash",
            # "playready-h264hpl40-dash",
        ]

    elif profile == "Main KEYS":
        profiles += [
            "playready-h264bpl30-dash",
            "playready-h264mpl30-dash",
            "playready-h264mpl31-dash",
            "playready-h264mpl40-dash",
        ]

    elif profile == "HEVC KEYS":
        profiles += [
            "hevc-main-L30-dash-cenc",
            "hevc-main10-L30-dash-cenc",
            "hevc-main10-L30-dash-cenc-prk",
            "hevc-main-L31-dash-cenc",
            "hevc-main10-L31-dash-cenc",
            "hevc-main10-L31-dash-cenc-prk",
            "hevc-main-L40-dash-cenc",
            "hevc-main10-L40-dash-cenc",
            "hevc-main10-L40-dash-cenc-prk",
            "hevc-main-L41-dash-cenc",
            "hevc-main10-L41-dash-cenc",
            "hevc-main10-L41-dash-cenc-prk",
        ]
        if addHEVCDO:
            profiles += [
                "hevc-main10-L31-dash-cenc-prk-do",
                "hevc-main10-L40-dash-cenc-prk-do",
                "hevc-main10-L41-dash-cenc-prk-do",
            ]

    elif profile == "HDR-10 KEYS":
        profiles += [
            "hevc-hdr-main10-L30-dash-cenc",
            "hevc-hdr-main10-L30-dash-cenc-prk",
            "hevc-hdr-main10-L31-dash-cenc",
            "hevc-hdr-main10-L31-dash-cenc-prk",
            "hevc-hdr-main10-L40-dash-cenc",
            "hevc-hdr-main10-L41-dash-cenc",
            "hevc-hdr-main10-L40-dash-cenc-prk",
            "hevc-hdr-main10-L41-dash-cenc-prk",
        ]
    else:
        profiles += [
            "playready-h264mpl30-dash",
        ]

    return profiles


def GettingKEYS_Netflixv2(nfID, profile):
    KEYS = []

    available_profiles = [
        "High KEYS",
        "HEVC KEYS",
        "HDR-10 KEYS",
        "Main KEYS",
    ]

    if profile not in available_profiles:
        logger.info("Error: Unknown profile: {}".format(profile))
        exit(1)

    logger.info(f"\nGetting {profile}...")

    profiles = __profiles(profile)

    try:
        client = MSLClient(profiles=profiles)
        resp = client.load_playlist(int(nfID))
        if resp is None and profile == "HEVC KEYS":
            # retry once with the download-only (-do) HEVC profiles added
            profiles = __profiles(profile, addHEVCDO=True)
            client = MSLClient(profiles=profiles)
            resp = client.load_playlist(int(nfID))

    except Exception as e:
        logger.error("Manifest Error: {}".format(e))
        return KEYS

    try:
        # init_data_b64 = from_kid('0000000005edabd50000000000000000')
        init_data_b64 = resp["result"]["video_tracks"][0]["drmHeader"]["bytes"]
    except (KeyError, TypeError):
        logger.error("cannot get pssh, {}".format(resp))
        return KEYS

    cert_data_b64 = "CAUSwwUKvQIIAxIQ5US6QAvBDzfTtjb4tU/7QxiH8c+TBSKOAjCCAQoCggEBAObzvlu2hZRsapAPx4Aa4GUZj4/GjxgXUtBH4THSkM40x63wQeyVxlEEo1D/T1FkVM/S+tiKbJiIGaT0Yb5LTAHcJEhODB40TXlwPfcxBjJLfOkF3jP6wIlqbb6OPVkDi6KMTZ3EYL6BEFGfD1ag/LDsPxG6EZIn3k4S3ODcej6YSzG4TnGD0szj5m6uj/2azPZsWAlSNBRUejmP6Tiota7g5u6AWZz0MsgCiEvnxRHmTRee+LO6U4dswzF3Odr2XBPD/hIAtp0RX8JlcGazBS0GABMMo2qNfCiSiGdyl2xZJq4fq99LoVfCLNChkn1N2NIYLrStQHa35pgObvhwi7ECAwEAAToQdGVzdC5uZXRmbGl4LmNvbRKAA4TTLzJbDZaKfozb9vDv5qpW5A/DNL9gbnJJi/AIZB3QOW2veGmKT3xaKNQ4NSvo/EyfVlhc4ujd4QPrFgYztGLNrxeyRF0J8XzGOPsvv9Mc9uLHKfiZQuy21KZYWF7HNedJ4qpAe6gqZ6uq7Se7f2JbelzENX8rsTpppKvkgPRIKLspFwv0EJQLPWD1zjew2PjoGEwJYlKbSbHVcUNygplaGmPkUCBThDh7p/5Lx5ff2d/oPpIlFvhqntmfOfumt4i+ZL3fFaObvkjpQFVAajqmfipY0KAtiUYYJAJSbm2DnrqP7+DmO9hmRMm9uJkXC2MxbmeNtJHAHdbgKsqjLHDiqwk1JplFMoC9KNMp2pUNdX9TkcrtJoEDqIn3zX9p+itdt3a9mVFc7/ZL4xpraYdQvOwP5LmXj9galK3s+eQJ7bkX6cCi+2X+iBmCMx4R0XJ3/1gxiM5LiStibCnfInub1nNgJDojxFA3jH/IuUcblEf/5Y0s1SzokBnR8V0KbA=="

    device = tool().devices()["NETFLIX-LICENSE"]

    wvdecrypt = WvDecrypt(
        init_data_b64=init_data_b64, cert_data_b64=cert_data_b64, device=device
    )
    challenge = wvdecrypt.get_challenge()
    current_sessionId = str(time.time()).replace(".", "")[0:-2]
    data = client.get_license(challenge, current_sessionId)

    try:
        license_b64 = data["result"][0]["licenseResponseBase64"]
    except Exception:
        logger.error("MSL LICENSE Error Message: {}".format(data))
        return KEYS

    wvdecrypt.update_license(license_b64)
    Correct, keyswvdecrypt = wvdecrypt.start_process()
    KEYS = keyswvdecrypt

    return KEYS
helpers/Parsers/Netflix/get_manifest.py (new file, 736 lines)
@@ -0,0 +1,736 @@
from helpers.ripprocess import ripprocess
from helpers.Parsers.Netflix.MSLClient import MSLClient
from configs.config import tool
import re, os, json, logging


def MSLprofiles():
    PROFILES = {
        "BASICS": ["BIF240", "BIF320", "webvtt-lssdh-ios8", "dfxp-ls-sdh"],
        "MAIN": {
            "SD": [
                "playready-h264bpl30-dash",
                "playready-h264mpl22-dash",
                "playready-h264mpl30-dash",
            ],
            "HD": [
                "playready-h264bpl30-dash",
                "playready-h264mpl22-dash",
                "playready-h264mpl30-dash",
                "playready-h264mpl31-dash",
            ],
            "FHD": [
                "playready-h264bpl30-dash",
                "playready-h264mpl22-dash",
                "playready-h264mpl30-dash",
                "playready-h264mpl31-dash",
                "playready-h264mpl40-dash",
            ],
            "ALL": [
                "playready-h264bpl30-dash",
                "playready-h264mpl22-dash",
                "playready-h264mpl30-dash",
                "playready-h264mpl31-dash",
                "playready-h264mpl40-dash",
            ],
        },
        "HIGH": {
            "SD": [
                "playready-h264hpl22-dash",
                "playready-h264hpl30-dash",
            ],
            "HD": [
                "playready-h264hpl22-dash",
                "playready-h264hpl30-dash",
                "playready-h264hpl31-dash",
            ],
            "FHD": [
                "playready-h264hpl22-dash",
                "playready-h264hpl30-dash",
                "playready-h264hpl31-dash",
                "playready-h264hpl40-dash",
            ],
            "ALL": [
                "playready-h264hpl22-dash",
                "playready-h264hpl30-dash",
                "playready-h264hpl31-dash",
                "playready-h264hpl40-dash",
            ],
        },
        "HEVC": {
            "SD": [
                "hevc-main-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc-prk",
            ],
            "HD": [
                "hevc-main-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc-prk",
                "hevc-main-L31-dash-cenc",
                "hevc-main10-L31-dash-cenc",
                "hevc-main10-L31-dash-cenc-prk",
            ],
            "FHD": [
                "hevc-main-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc-prk",
                "hevc-main-L31-dash-cenc",
                "hevc-main10-L31-dash-cenc",
                "hevc-main10-L31-dash-cenc-prk",
                "hevc-main-L40-dash-cenc",
                "hevc-main10-L40-dash-cenc",
                "hevc-main10-L40-dash-cenc-prk",
                "hevc-main-L41-dash-cenc",
                "hevc-main10-L41-dash-cenc",
                "hevc-main10-L41-dash-cenc-prk",
            ],
            "ALL": [
                "hevc-main-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc",
                "hevc-main10-L30-dash-cenc-prk",
                "hevc-main-L31-dash-cenc",
                "hevc-main10-L31-dash-cenc",
                "hevc-main10-L31-dash-cenc-prk",
                "hevc-main-L40-dash-cenc",
                "hevc-main10-L40-dash-cenc",
                "hevc-main10-L40-dash-cenc-prk",
                "hevc-main-L41-dash-cenc",
                "hevc-main10-L41-dash-cenc",
                "hevc-main10-L41-dash-cenc-prk",
            ],
        },
        "HEVCDO": {
            "SD": [
                "hevc-main10-L30-dash-cenc-prk-do",
            ],
            "HD": [
                "hevc-main10-L30-dash-cenc-prk-do",
                "hevc-main10-L31-dash-cenc-prk-do",
            ],
            "FHD": [
                "hevc-main10-L31-dash-cenc-prk-do",
                "hevc-main10-L40-dash-cenc-prk-do",
                "hevc-main10-L41-dash-cenc-prk-do",
            ],
            "ALL": [
                "hevc-main10-L31-dash-cenc-prk-do",
                "hevc-main10-L40-dash-cenc-prk-do",
                "hevc-main10-L41-dash-cenc-prk-do",
            ],
        },
        "HDR": {
            "SD": [
                "hevc-hdr-main10-L30-dash-cenc",
                "hevc-hdr-main10-L30-dash-cenc-prk",
            ],
            "HD": [
                "hevc-hdr-main10-L30-dash-cenc",
                "hevc-hdr-main10-L30-dash-cenc-prk",
                "hevc-hdr-main10-L31-dash-cenc",
                "hevc-hdr-main10-L31-dash-cenc-prk",
            ],
            "FHD": [
                "hevc-hdr-main10-L30-dash-cenc",
                "hevc-hdr-main10-L30-dash-cenc-prk",
                "hevc-hdr-main10-L31-dash-cenc",
                "hevc-hdr-main10-L31-dash-cenc-prk",
                "hevc-hdr-main10-L40-dash-cenc",
                "hevc-hdr-main10-L41-dash-cenc",
                "hevc-hdr-main10-L40-dash-cenc-prk",
                "hevc-hdr-main10-L41-dash-cenc-prk",
            ],
            "ALL": [
                "hevc-hdr-main10-L30-dash-cenc",
                "hevc-hdr-main10-L30-dash-cenc-prk",
                "hevc-hdr-main10-L31-dash-cenc",
                "hevc-hdr-main10-L31-dash-cenc-prk",
                "hevc-hdr-main10-L40-dash-cenc",
                "hevc-hdr-main10-L41-dash-cenc",
                "hevc-hdr-main10-L40-dash-cenc-prk",
                "hevc-hdr-main10-L41-dash-cenc-prk",
            ],
        },
    }

    return PROFILES
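Adjacent string literals in a Python list are silently concatenated, so a dropped comma in a table like the one above quietly merges two profile names into one bogus entry. A minimal sanity check for that class of typo, run against a trimmed-down copy of the structure (the helper name `check_profiles` is hypothetical, not part of the module):

```python
def check_profiles(profiles_dict):
    """Flag fused entries: no real profile name contains two '-dash' markers."""
    bad = []
    for tier, groups in profiles_dict.items():
        for quality, names in groups.items():
            for name in names:
                if name.count("-dash") > 1:
                    bad.append((tier, quality, name))
    return bad

# trimmed-down copy of the shape MSLprofiles() returns (minus "BASICS")
sample = {
    "HEVC": {
        "FHD": [
            "hevc-main-L31-dash-cenc" "hevc-main10-L31-dash-cenc",  # fused: missing comma
            "hevc-main10-L31-dash-cenc-prk",
        ],
    },
}
print(check_profiles(sample))  # reports the fused HEVC/FHD entry
```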

class get_manifest:
    def __init__(self, args, nfid):
        self.logger = logging.getLogger(__name__)
        self.args = args
        self.nfid = nfid
        self.ripprocess = ripprocess()
        self.profiles = MSLprofiles()
        self.config = tool().config("NETFLIX")

    def LoadProfies(self, addHEVCDO=False):
        getHigh = False
        # copy the list: "+=" below would otherwise mutate self.profiles["BASICS"] in place
        profiles = list(self.profiles["BASICS"])

        if self.args.video_main:
            if self.args.customquality:
                if int(self.args.customquality[0]) == 1080:
                    profiles += self.profiles["MAIN"]["FHD"]
                elif 720 <= int(self.args.customquality[0]) < 1080:
                    profiles += self.profiles["MAIN"]["HD"]
                elif int(self.args.customquality[0]) < 720:
                    profiles += self.profiles["MAIN"]["SD"]
            else:
                profiles += self.profiles["MAIN"]["ALL"]

        elif self.args.video_high:
            if self.args.customquality:
                if int(self.args.customquality[0]) == 1080:
                    profiles += self.profiles["HIGH"]["FHD"]
                elif 720 <= int(self.args.customquality[0]) < 1080:
                    profiles += self.profiles["HIGH"]["HD"]
                elif int(self.args.customquality[0]) < 720:
                    profiles += self.profiles["HIGH"]["SD"]
            else:
                profiles += self.profiles["HIGH"]["ALL"]

        elif self.args.hdr:
            if self.args.customquality:
                if int(self.args.customquality[0]) == 1080:
                    profiles += self.profiles["HDR"]["FHD"]
                elif 720 <= int(self.args.customquality[0]) < 1080:
                    profiles += self.profiles["HDR"]["HD"]
                elif int(self.args.customquality[0]) < 720:
                    profiles += self.profiles["HDR"]["SD"]
            else:
                profiles += self.profiles["HDR"]["ALL"]

        elif self.args.hevc:
            if self.args.customquality:
                if int(self.args.customquality[0]) == 1080:
                    profiles += self.profiles["HEVC"]["FHD"]
                    if addHEVCDO:
                        profiles += self.profiles["HEVCDO"]["FHD"]
                elif 720 <= int(self.args.customquality[0]) < 1080:
                    profiles += self.profiles["HEVC"]["HD"]
                    if addHEVCDO:
                        profiles += self.profiles["HEVCDO"]["HD"]
                elif int(self.args.customquality[0]) < 720:
                    profiles += self.profiles["HEVC"]["SD"]
                    if addHEVCDO:
                        profiles += self.profiles["HEVCDO"]["SD"]
            else:
                profiles += self.profiles["HEVC"]["ALL"]
                if addHEVCDO:
                    profiles += self.profiles["HEVCDO"]["ALL"]

        else:
            # no codec option given: use MAIN and also fetch HIGH for comparison
            getHigh = True
            if self.args.customquality:
                if int(self.args.customquality[0]) == 1080:
                    profiles += self.profiles["MAIN"]["FHD"]
                elif 720 <= int(self.args.customquality[0]) < 1080:
                    profiles += self.profiles["MAIN"]["HD"]
                elif int(self.args.customquality[0]) < 720:
                    profiles += self.profiles["MAIN"]["SD"]
            else:
                profiles += self.profiles["MAIN"]["ALL"]

        if self.args.aformat_2ch:
            if str(self.args.aformat_2ch[0]) == "aac":
                profiles.append("heaac-2-dash")
                profiles.append("heaac-2hq-dash")
            elif str(self.args.aformat_2ch[0]) == "eac3":
                profiles.append("ddplus-2.0-dash")
            elif str(self.args.aformat_2ch[0]) == "ogg":
                profiles.append("playready-oggvorbis-2-dash")
        elif self.args.only_2ch_audio:
            profiles.append("ddplus-2.0-dash")
        elif self.args.aformat_51ch:
            if str(self.args.aformat_51ch[0]) == "aac":
                profiles.append("heaac-5.1-dash")
                profiles.append("heaac-5.1hq-dash")
            elif str(self.args.aformat_51ch[0]) == "eac3":
                profiles.append("ddplus-5.1-dash")
                profiles.append("ddplus-5.1hq-dash")
            elif str(self.args.aformat_51ch[0]) == "ac3":
                profiles.append("dd-5.1-dash")
            elif str(self.args.aformat_51ch[0]) == "atmos":
                profiles.append("dd-5.1-dash")
                profiles.append("ddplus-atmos-dash")
            else:
                profiles.append("dd-5.1-dash")
                profiles.append("ddplus-5.1-dash")
                profiles.append("ddplus-5.1hq-dash")
        else:
            profiles.append("ddplus-2.0-dash")
            profiles.append("dd-5.1-dash")
            profiles.append("ddplus-5.1-dash")
            profiles.append("ddplus-5.1hq-dash")
            profiles.append("ddplus-atmos-dash")

        return list(set(profiles)), getHigh
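Every codec family in LoadProfies repeats the same height-to-tier ladder; the rule can be sketched as one helper (the name `tier_for_height` is hypothetical, not part of the module):

```python
def tier_for_height(height=None):
    """Map a requested height to a profile-tier key (hypothetical helper)."""
    if height is None:
        return "ALL"
    if height == 1080:
        return "FHD"
    if 720 <= height < 1080:
        return "HD"
    return "SD"

# Each ladder in LoadProfies could then collapse to e.g.:
# profiles += self.profiles["MAIN"][tier_for_height(1080)]
print(tier_for_height(1080), tier_for_height(720), tier_for_height(480), tier_for_height())
```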
    def PyMSL(self, profiles):
        client = MSLClient(profiles=profiles)

        try:
            resp = client.load_playlist(int(self.nfid))
            return resp
        except Exception as e:
            self.logger.error("Manifest Error: {}".format(e))
            return None

    def HighVideoMSL(self):
        # fetch the HIGH-profile manifest so its bitrates can be compared with MAIN
        self.logger.info("Getting High Profile Manifest...")

        profiles = list(self.profiles["BASICS"])  # copy, so BASICS is not mutated

        if self.args.customquality:
            if int(self.args.customquality[0]) == 1080:
                profiles += self.profiles["HIGH"]["FHD"]
            elif 720 <= int(self.args.customquality[0]) < 1080:
                profiles += self.profiles["HIGH"]["HD"]
            elif int(self.args.customquality[0]) < 720:
                profiles += self.profiles["HIGH"]["SD"]
        else:
            profiles += self.profiles["HIGH"]["ALL"]

        resp = self.PyMSL(profiles=profiles)
        if not resp:
            # bail out with an empty list if the HIGH manifest request failed
            return []

        VideoList = list()

        manifest = resp["result"]

        for video_track in manifest["video_tracks"]:
            for downloadable in video_track["streams"]:
                size_in_bytes = int(float(downloadable["size"]))
                vid_size = (
                    f"{size_in_bytes/1048576:0.2f} MiB"
                    if size_in_bytes < 1073741824
                    else f"{size_in_bytes/1073741824:0.2f} GiB"
                )
                vid_url = downloadable["urls"][0]["url"]
                # streams tagged SEGMENT_MAP_2KEY are flagged as L3
                L3 = "L3" if "SEGMENT_MAP_2KEY" in str(downloadable["tags"]) else ""

                VideoList.append(
                    {
                        "Type": "video",
                        "Drm": downloadable["isDrm"],
                        "vmaf": downloadable["vmaf"],
                        "FrameRate": downloadable["framerate_value"],
                        "Height": downloadable["res_h"],
                        "Width": downloadable["res_w"],
                        "Size": vid_size,
                        "Url": vid_url,
                        "Bitrate": str(downloadable["bitrate"]),
                        "Profile": downloadable["content_profile"],
                        "L3": L3,
                    }
                )

        VideoList = sorted(VideoList, key=lambda k: int(k["Bitrate"]))

        if self.args.customquality:
            inp_height = int(self.args.customquality[0])
            top_height = sorted(VideoList, key=lambda k: int(k["Height"]))[-1]["Height"]

            if top_height >= inp_height:
                height = [x for x in VideoList if int(x["Height"]) >= inp_height]
                if height:
                    VideoList = height

        return VideoList
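HighVideoMSL and ParseVideo both format stream sizes with the same MiB/GiB expression; it can be lifted into a standalone helper (the name `human_size` is hypothetical):

```python
def human_size(size_in_bytes):
    # same thresholds as the manifest parsers: below 1 GiB renders as MiB
    return (
        f"{size_in_bytes/1048576:0.2f} MiB"
        if size_in_bytes < 1073741824
        else f"{size_in_bytes/1073741824:0.2f} GiB"
    )

print(human_size(734003200))   # 700.00 MiB
print(human_size(2147483648))  # 2.00 GiB
```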
    def ParseVideo(self, resp, getHigh):
        manifest = resp["result"]
        VideoList = []
        checkerinfo = ""

        for video_track in manifest["video_tracks"]:
            for downloadable in video_track["streams"]:
                size_in_bytes = int(float(downloadable["size"]))
                vid_size = (
                    f"{size_in_bytes/1048576:0.2f} MiB"
                    if size_in_bytes < 1073741824
                    else f"{size_in_bytes/1073741824:0.2f} GiB"
                )
                vid_url = downloadable["urls"][0]["url"]

                VideoList.append(
                    {
                        "Type": "video",
                        "Drm": downloadable["isDrm"],
                        "vmaf": downloadable["vmaf"],
                        "FrameRate": downloadable["framerate_value"],
                        "Height": downloadable["res_h"],
                        "Width": downloadable["res_w"],
                        "Size": vid_size,
                        "Url": vid_url,
                        "Bitrate": str(downloadable["bitrate"]),
                        "Profile": downloadable["content_profile"],
                    }
                )

        VideoList = sorted(VideoList, key=lambda k: int(k["Bitrate"]))
        self.logger.debug("VideoList: {}".format(VideoList))

        if self.args.customquality:
            inp_height = int(self.args.customquality[0])
            top_height = sorted(VideoList, key=lambda k: int(k["Height"]))[-1]["Height"]

            if top_height >= inp_height:
                height = [x for x in VideoList if int(x["Height"]) >= inp_height]
                if height:
                    VideoList = height

        if getHigh:
            HighVideoList = self.HighVideoMSL()
            if HighVideoList:
                checkerinfo = "\nNetflix Profile Checker v1.0\nMAIN: {}kbps | {}\nHIGH: {}kbps | {}\n\n{}\n"
                checkerinfo = checkerinfo.format(
                    str(VideoList[-1]["Bitrate"]),
                    str(VideoList[-1]["Profile"]),
                    str(HighVideoList[-1]["Bitrate"]),
                    str(HighVideoList[-1]["Profile"]),
                    "result: MAIN is Better"
                    if int(VideoList[-1]["Bitrate"])
                    >= int(HighVideoList[-1]["Bitrate"])
                    else "result: HIGH is Better",
                )

                VideoList += HighVideoList
                self.logger.debug("HighVideoList: {}".format(HighVideoList))

        VideoList = sorted(VideoList, key=lambda k: int(k["Bitrate"]))

        return VideoList, checkerinfo
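The MAIN-vs-HIGH verdict in ParseVideo compares only the top (last, since the lists are sorted by bitrate ascending) stream of each manifest. A minimal sketch with hypothetical track dicts in the same shape:

```python
# hypothetical entries shaped like the dicts ParseVideo builds, already sorted by bitrate
main = [{"Bitrate": "3000", "Profile": "playready-h264mpl40-dash"}]
high = [{"Bitrate": "3400", "Profile": "playready-h264hpl40-dash"}]

verdict = (
    "result: MAIN is Better"
    if int(main[-1]["Bitrate"]) >= int(high[-1]["Bitrate"])
    else "result: HIGH is Better"
)
print(verdict)  # result: HIGH is Better
```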
    def ParseAudioSubs(self, resp):
        def remove_dups(List, keyword=""):
            # drop duplicate entries, keyed on one dict field; first occurrence wins
            Added_ = set()
            Proper_ = []
            for L in List:
                if L[keyword] not in Added_:
                    Proper_.append(L)
                    Added_.add(L[keyword])

            return Proper_

        def isOriginal(language_text):
            # detect the original-audio track
            if "Original" in language_text:
                return True

            brackets = re.search(r"\[(.*)\]", language_text)
            if brackets:
                return True

            return False

        def noOriginal(language_text):
            # strip the "[Original]" marker so the language can be matched by --alang
            brackets = re.search(r"\[(.*)\]", language_text)
            if brackets:
                return language_text.replace(brackets[0], "").strip()

            return language_text

        # start audio/subtitle parsing

        manifest = resp["result"]

        AudioList, SubtitleList, ForcedList = list(), list(), list()

        # parse all audio tracks (AD and non-AD) into one list
        for audio_track in manifest["audio_tracks"]:
            AudioDescription = (
                "Audio Description"
                if "audio description" in audio_track["languageDescription"].lower()
                else "Audio"
            )
            Original = isOriginal(audio_track["languageDescription"])
            LanguageName, LanguageCode = self.ripprocess.countrycode(
                audio_track["language"]
            )
            LanguageName = noOriginal(audio_track["languageDescription"])

            for downloadable in audio_track["streams"]:
                aud_url = downloadable["urls"][0]["url"]
                # bytes -> MiB
                size = (
                    str(format(float(int(downloadable["size"])) / 1048576, ".2f"))
                    + " MiB"
                )

                audioDict = {
                    "Type": AudioDescription,
                    "Drm": downloadable["isDrm"],
                    "Original": Original,
                    "Language": LanguageName,
                    "langAbbrev": LanguageCode,
                    "Size": size,
                    "Url": aud_url,
                    "channels": str(downloadable["channels"]),
                    "Bitrate": str(downloadable["bitrate"]),
                    "Profile": downloadable["content_profile"],
                }

                if self.args.custom_audio_bitrate:
                    # only keep audio streams at or below the requested bitrate
                    if int(downloadable["bitrate"]) <= int(
                        self.args.custom_audio_bitrate[0]
                    ):
                        AudioList.append(audioDict)
                else:
                    AudioList.append(audioDict)

        AudioList = sorted(AudioList, key=lambda k: int(k["Bitrate"]), reverse=True)

        self.logger.debug("AudioList: {}".format(AudioList))

        #################################################################################

        AudioList = sorted(  # keep only the highest bitrate for every language
            remove_dups(AudioList, keyword="Language"),
            key=lambda k: int(k["Bitrate"]),
            reverse=True,
        )

        OriginalAudioList = (  # used later to auto-detect forced subs
            AudioList
            if len(AudioList) == 1
            else [x for x in AudioList if x["Original"]]
        )

        #################################################################################
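Because AudioList is sorted by bitrate descending before remove_dups runs, "first occurrence wins" means the highest-bitrate stream survives for each language. A standalone sketch of that interaction:

```python
def remove_dups(List, keyword=""):
    # keep the first entry seen for each value of entry[keyword]
    Added_, Proper_ = set(), []
    for L in List:
        if L[keyword] not in Added_:
            Proper_.append(L)
            Added_.add(L[keyword])
    return Proper_

# hypothetical tracks, pre-sorted by bitrate descending as in ParseAudioSubs
tracks = [
    {"Language": "English", "Bitrate": "448"},
    {"Language": "English", "Bitrate": "192"},
    {"Language": "French", "Bitrate": "192"},
]
kept = remove_dups(tracks, keyword="Language")
print(kept)  # the 448 kbps English entry plus the French entry
```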
        # now filter AudioList based on user input:
        # --alang X X, --AD X X, or the original audio if neither is given

        if self.args.AD:
            ADlist = list()
            UserLanguagesLower = list(map(lambda x: x.lower(), self.args.AD))
            for aud in AudioList:
                if aud["Type"] == "Audio":
                    if self.args.allaudios:
                        ADlist.append(aud)
                    else:
                        if aud["Original"]:
                            ADlist.append(aud)

                if aud["Type"] == "Audio Description":
                    if (
                        aud["Language"].lower() in UserLanguagesLower
                        or aud["langAbbrev"].lower() in UserLanguagesLower
                    ):
                        ADlist.append(aud)

            AudioList = ADlist

        if self.args.audiolang:
            NewAudioList = list()
            UserLanguagesLower = list(map(lambda x: x.lower(), self.args.audiolang))
            for aud in AudioList:
                if self.args.AD:
                    # AD languages were already filtered above
                    if aud["Type"] == "Audio Description":
                        NewAudioList.append(aud)
                if aud["Type"] == "Audio":
                    if (
                        aud["Language"].lower() in UserLanguagesLower
                        or aud["langAbbrev"].lower() in UserLanguagesLower
                    ):
                        NewAudioList.append(aud)

            AudioList = NewAudioList

        else:
            # AudioList is still complete at this point
            if self.args.allaudios:  # drop AD tracks unless --AD X X was also given
                AllaudiosList = list()
                if self.args.AD:
                    for aud in AudioList:
                        AllaudiosList.append(aud)
                    AudioList = AllaudiosList
                else:
                    for aud in AudioList:
                        if aud["Type"] == "Audio":
                            AllaudiosList.append(aud)
                    AudioList.clear()
                    AudioList = AllaudiosList

            else:
                if self.args.AD:
                    AudioList = AudioList  # already the filtered ADlist
                else:
                    # no audio options given, so fall back to the original audio
                    AudioList = [
                        x for x in AudioList if x["Original"] or len(AudioList) == 1
                    ]

        #####################################(Subtitles)#####################################

        for text_track in manifest["timedtexttracks"]:
            if (
                not text_track["languageDescription"] == "Off"
                and text_track["language"] is not None
            ):
                Language, langAbbrev = self.ripprocess.countrycode(
                    text_track["language"]
                )
                Language = text_track["languageDescription"]
                Type = text_track["trackType"]
                rawTrackType = (
                    text_track["rawTrackType"]
                    .replace("closedcaptions", "CC")
                    .replace("subtitles", "SUB")
                )
                isForced = "NO"

                if (
                    "CC" in rawTrackType
                    and langAbbrev != "ara"
                    and "dfxp-ls-sdh" in str(text_track["ttDownloadables"])
                ):
                    Profile = "dfxp-ls-sdh"
                    Url = next(
                        iter(
                            text_track["ttDownloadables"]["dfxp-ls-sdh"][
                                "downloadUrls"
                            ].values()
                        )
                    )
                else:
                    Profile = "webvtt-lssdh-ios8"
                    Url = next(
                        iter(
                            text_track["ttDownloadables"]["webvtt-lssdh-ios8"][
                                "downloadUrls"
                            ].values()
                        )
                    )

                SubtitleList.append(
                    {
                        "Type": Type,
                        "rawTrackType": rawTrackType,
                        "Language": Language,
                        "isForced": isForced,
                        "langAbbrev": langAbbrev,
                        "Url": Url,
                        "Profile": Profile,
                    }
                )

        self.logger.debug("SubtitleList: {}".format(SubtitleList))
        SubtitleList = remove_dups(SubtitleList, keyword="Language")

        if self.args.sublang:
            NewSubtitleList = list()
            UserLanguagesLower = list(map(lambda x: x.lower(), self.args.sublang))
            for sub in SubtitleList:
                if (
                    sub["Language"].lower() in UserLanguagesLower
                    or sub["langAbbrev"].lower() in UserLanguagesLower
                ):
                    NewSubtitleList.append(sub)
            SubtitleList = remove_dups(NewSubtitleList, keyword="Language")

        #####################################(Forced Subtitles)###############################

        for text_track in manifest["timedtexttracks"]:
            if text_track["isForcedNarrative"] and text_track["language"] is not None:
                LanguageName, LanguageCode = self.ripprocess.countrycode(
                    text_track["language"]
                )
                # languageDescription is unreliable here, so use the pycountry-based name
                ForcedList.append(
                    {
                        "Type": text_track["trackType"],
                        "rawTrackType": text_track["rawTrackType"]
                        .replace("closedcaptions", "CC ")
                        .replace("subtitles", "SUB"),
                        "Language": LanguageName,
                        "isForced": "YES",
                        "langAbbrev": LanguageCode,
                        "Url": next(
                            iter(
                                text_track["ttDownloadables"]["webvtt-lssdh-ios8"][
                                    "downloadUrls"
                                ].values()
                            )
                        ),
                        "Profile": "webvtt-lssdh-ios8",
                    }
                )

        ForcedList = remove_dups(ForcedList, keyword="Language")

        if self.args.forcedlang:
            NewForcedList = []
            UserLanguagesLower = list(map(lambda x: x.lower(), self.args.forcedlang))
            for sub in ForcedList:
                if (
                    sub["Language"].lower() in UserLanguagesLower
                    or sub["langAbbrev"].lower() in UserLanguagesLower
                ):
                    NewForcedList.append(sub)
            ForcedList = remove_dups(NewForcedList, keyword="Language")
        else:
            if not self.args.allforcedlang:
                if len(OriginalAudioList) != 0:
                    OriginalLanguage = OriginalAudioList[0]["langAbbrev"]
                    ForcedList = [
                        x for x in ForcedList if x["langAbbrev"] == OriginalLanguage
                    ]

        return AudioList, SubtitleList, ForcedList
    def LoadManifest(self):
        profiles, getHigh = self.LoadProfies()

        if self.args.hevc:
            self.logger.info("Getting HEVC Manifest...")
        elif self.args.hdr:
            self.logger.info("Getting HDR-10 Manifest...")
        elif self.args.video_high:
            self.logger.info("Getting High Profile Manifest...")
        else:
            self.logger.info("Getting Main Profile Manifest...")

        resp = self.PyMSL(profiles=profiles)

        if not resp:
            if self.args.hevc:
                # retry once with the HEVCDO profiles added
                profiles, getHigh = self.LoadProfies(addHEVCDO=True)
                self.logger.info("\nGetting HEVC DO Manifest...")
                resp = self.PyMSL(profiles=profiles)

        if not resp:
            self.logger.info("Failed getting Manifest")
            exit(-1)

        VideoList, checkerinfo = self.ParseVideo(resp, getHigh)
        AudioList, SubtitleList, ForcedList = self.ParseAudioSubs(resp)

        return VideoList, AudioList, SubtitleList, ForcedList, checkerinfo
helpers/__init__.py (new file, 0 lines)

helpers/aria2.py (new file, 369 lines)
@@ -0,0 +1,369 @@
import os
import shutil
import subprocess
import sys
import re
import logging
from configs.config import tool
from helpers.ripprocess import ripprocess


class aria2Error(Exception):
    pass


class aria2_moded:
    def __init__(self, aria2_download_command):
        self.logger = logging.getLogger(__name__)
        self.aria2_download_command = aria2_download_command
        self.env = self.aria2DisableProxies()
        self.ripprocess = ripprocess()
        self.tool = tool()
        self.LOGA_PATH = self.tool.paths()["LOGA_PATH"]
        self.bin = self.tool.bin()
        self.aria2c_exe = self.bin["aria2c"]
        self.last_message_printed = 0
        self.speed_radar = "0kbps"

    def aria2DisableProxies(self):
        # aria2c should connect directly, so strip any proxy variables
        env = os.environ.copy()

        for key in ("http_proxy", "HTTP_PROXY", "https_proxy", "HTTPS_PROXY"):
            env.pop(key, None)

        return env

    def read_stdout(self, line):
        # parse an aria2c console summary line, roughly:
        # [#1 10MiB/100MiB(10%) CN:16 DL:5.0MiB ETA:18s]
        speed = re.search(r"DL:(.+?)ETA", line)
        eta = re.search(r"ETA:(.+?)]", line)
        connection = re.search(r"CN:(.+?)DL", line)
        percent = re.search(r"\((.*?)\)", line)
        size = re.search(r" (.*?)/(.*?)\(", line)

        if speed and eta and connection and percent and size:
            percent = percent.group().strip().replace(")", "").replace("(", "")
            size = size.group().strip().replace(")", "").replace("(", "")
            complete, total = size.split("/")
            connection = connection.group(1).strip()
            eta = eta.group(1).strip()
            speed = speed.group(1).strip()
            self.speed_radar = speed

            stdout_data = {
                "percent": str(percent),
                "size": str(total),
                "complete": str(complete),
                "total": str(total),
                "connection": str(connection),
                "eta": str(eta),
                "speed": str(speed),
            }

            return stdout_data

        return None

    def if_errors(self, line):
        if "exception" in str(line).lower() or "errorcode" in str(line).lower():
            return line
        return None

    def delete_last_message_printed(self):
        # overwrite the previous status line with spaces
        print(" " * len(str(self.last_message_printed)), end="\r")

    def get_status(self, stdout_data: dict):
        return "Aria2c_Status; Size: {Size} | Speed: {Speed} | ETA: {ETA} | Progress: {Complete} -> {Total} ({Percent})".format(
            Size=stdout_data.get("size"),
            Speed=stdout_data.get("speed"),
            ETA=stdout_data.get("eta"),
            Complete=stdout_data.get("complete"),
            Total=stdout_data.get("total"),
            Percent=stdout_data.get("percent"),
        )

    def is_download_completed(self, line):
        if "(ok):download completed." in str(line).lower():
            return "Download completed: (OK) ({}/s)".format(self.speed_radar)
        return None

    def start_download(self):
        proc = subprocess.Popen(
            self.aria2_download_command,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            bufsize=1,
            universal_newlines=True,
            env=self.env,
        )

        check_errors = True
        for line in proc.stdout:
            if check_errors:
                if self.if_errors(line):
                    raise aria2Error("Aria2c Error {}".format(self.if_errors(line)))
                check_errors = False
            stdout_data = self.read_stdout(line)
            if stdout_data:
                status_text = self.get_status(stdout_data)
                self.delete_last_message_printed()
                print(status_text, end="\r", flush=True)
                self.last_message_printed = status_text
            else:
                download_finished = self.is_download_completed(line)
                if download_finished:
                    self.delete_last_message_printed()
                    print(download_finished, end="\r", flush=True)
                    self.last_message_printed = download_finished
                    self.logger.info("")
                    return
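The read_stdout regexes can be exercised against a representative aria2c console line (the sample line is illustrative, not captured output):

```python
import re

line = "[#1 10MiB/100MiB(10%) CN:16 DL:5.0MiB ETA:18s]"

# same patterns read_stdout uses
speed = re.search(r"DL:(.+?)ETA", line)
eta = re.search(r"ETA:(.+?)]", line)
connection = re.search(r"CN:(.+?)DL", line)
percent = re.search(r"\((.*?)\)", line)

print(speed.group(1).strip())       # 5.0MiB
print(eta.group(1).strip())         # 18s
print(connection.group(1).strip())  # 16
print(percent.group().strip("()"))  # 10%
```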
class aria2:
    def __init__(self):
        self.env = self.aria2DisableProxies()
        self.ripprocess = ripprocess()
        self.tool = tool()
        self.bin = self.tool.bin()
        self.LOGA_PATH = self.tool.paths()["LOGA_PATH"]
        self.config = self.tool.aria2c()
        self.aria2c_exe = self.bin["aria2c"]
        self.logger = logging.getLogger(__name__)

    def convert_args(self, arg):
        # render Python values the way aria2c expects them on the command line
        if arg is True:
            return "true"
        elif arg is False:
            return "false"
        elif arg is None:
            return "none"
        else:
            return str(arg)

    def append_commands(self, command, option_define, option):
        # "skip" means: leave this flag off the command line entirely
        if option == "skip":
            return []

        return ["{}{}".format(option_define, option)]

    def append_two_commands(self, command, cmd1, cmd2):
        if cmd2 == "skip":
            return []

        return [cmd1, cmd2]

    def aria2Options(
        self,
        allow_overwrite=True,
        file_allocation=None,
        auto_file_renaming=False,
        async_dns=False,
        retry_wait=5,
        summary_interval=0,
        enable_color=False,
        connection=16,
        concurrent_downloads=16,
        split=16,
        header="skip",
        user_agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36",
        uri_selector="inorder",
        console_log_level="skip",
        download_result="hide",
        quiet="false",
        extra_commands=[],
    ):
        # note: summary_interval and user_agent are accepted but not appended below
        options = [] + extra_commands
        allow_overwrite = self.convert_args(allow_overwrite)
        quiet = self.convert_args(quiet)
        file_allocation = self.convert_args(file_allocation)
        auto_file_renaming = self.convert_args(auto_file_renaming)
        async_dns = self.convert_args(async_dns)
        retry_wait = self.convert_args(retry_wait)
        enable_color = self.convert_args(enable_color)
        connection = self.convert_args(connection)
        concurrent_downloads = self.convert_args(concurrent_downloads)
        split = self.convert_args(split)
        header = self.convert_args(header)
        uri_selector = self.convert_args(uri_selector)
        console_log_level = self.convert_args(console_log_level)
        download_result = self.convert_args(download_result)

        ##############################################################################

        options += self.append_commands(options, "--allow-overwrite=", allow_overwrite)
        options += self.append_commands(options, "--quiet=", quiet)
        options += self.append_commands(options, "--file-allocation=", file_allocation)
        options += self.append_commands(
            options, "--auto-file-renaming=", auto_file_renaming
        )
        options += self.append_commands(options, "--async-dns=", async_dns)
        options += self.append_commands(options, "--retry-wait=", retry_wait)
        options += self.append_commands(options, "--enable-color=", enable_color)

        options += self.append_commands(
            options, "--max-connection-per-server=", connection
        )

        options += self.append_commands(
            options, "--max-concurrent-downloads=", concurrent_downloads
        )
        options += self.append_commands(options, "--split=", split)
        options += self.append_commands(options, "--header=", header)
        options += self.append_commands(options, "--uri-selector=", uri_selector)
        options += self.append_commands(
            options, "--console-log-level=", console_log_level
        )
        options += self.append_commands(options, "--download-result=", download_result)

        return options
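The convert_args/append_commands pair turns Python defaults into aria2c flags, with the sentinel "skip" dropping a flag entirely. A standalone mirror of that behavior:

```python
# minimal standalone mirror of convert_args / append_commands
def convert_args(arg):
    if arg is True:
        return "true"
    if arg is False:
        return "false"
    if arg is None:
        return "none"
    return str(arg)

def append_commands(option_define, option):
    return [] if option == "skip" else ["{}{}".format(option_define, option)]

opts = []
opts += append_commands("--allow-overwrite=", convert_args(True))
opts += append_commands("--retry-wait=", convert_args(5))
opts += append_commands("--header=", convert_args("skip"))  # "skip" drops the flag
print(opts)  # ['--allow-overwrite=true', '--retry-wait=5']
```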
    def aria2DisableProxies(self):
        env = os.environ.copy()

        for key in ("http_proxy", "HTTP_PROXY", "https_proxy", "HTTPS_PROXY"):
            env.pop(key, None)

        return env

    def aria2DownloadUrl(self, url, output, options, debug=False, moded=False):
        self.debug = debug
        aria2_download_command = [self.aria2c_exe] + options

        if self.config["enable_logging"]:
            LogFile = os.path.join(self.LOGA_PATH, output.replace(".mp4", ".log"))
            if os.path.isfile(LogFile):
                os.remove(LogFile)
            aria2_download_command.append("--log={}".format(LogFile))

        if not url.startswith("http"):
            raise aria2Error("Url does not start with http/https: {}".format(url))

        aria2_download_command.append(url)
        aria2_download_command += self.append_two_commands(
            aria2_download_command, "-o", output
        )

        self.aria2Debug("Sending Commands to aria2c...")
        self.aria2Debug(aria2_download_command)
        self.logger.debug("aria2_download_command: {}".format(aria2_download_command))

        if moded:
            aria2_moded_download = aria2_moded(aria2_download_command)
            aria2_moded_download.start_download()
        else:
            try:
                aria = subprocess.call(aria2_download_command, env=self.env)
            except FileNotFoundError:
                self.logger.info("UNABLE TO FIND {}".format(self.aria2c_exe))
                exit(-1)
            if aria != 0:
                raise aria2Error("Aria2c exited with code {}".format(aria))

        return

    def aria2DownloadDash(
        self, segments, output, options, debug=False, moded=False, fixbytes=False
    ):
        self.debug = debug
        aria2_download_command = [self.aria2c_exe] + options

        if self.config["enable_logging"]:
            LogFile = os.path.join(self.LOGA_PATH, output.replace(".mp4", ".log"))
            if os.path.isfile(LogFile):
                os.remove(LogFile)
            aria2_download_command.append("--log={}".format(LogFile))

        if not isinstance(segments, list) or segments == []:
            raise aria2Error("invalid list of urls: {}".format(segments))

        if moded:
            raise aria2Error("moded version not supported for dash downloads atm...")

        txt = output.replace(".mp4", ".txt")
        folder = output.replace(".mp4", "")
        segments = list(dict.fromkeys(segments))  # de-dupe while preserving order

        if os.path.exists(folder):
            shutil.rmtree(folder)
        if not os.path.exists(folder):
            os.makedirs(folder)

        segments_location = []

        # write an aria2c input file: each URI followed by indented out=/dir= options
        opened_txt = open(txt, "w+")
        for num, url in enumerate(segments, start=1):
            segment_name = str(num).zfill(5) + ".mp4"
            segments_location.append(os.path.join(*[os.getcwd(), folder, segment_name]))
            opened_txt.write(url + f"\n out={segment_name}" + f"\n dir={folder}" + "\n")
        opened_txt.close()
|
||||
aria2_download_command += self.append_commands(
|
||||
aria2_download_command, "--input-file=", txt
|
||||
)
|
||||
|
||||
try:
|
||||
aria = subprocess.call(aria2_download_command, env=self.env)
|
||||
except FileNotFoundError:
|
||||
self.logger.info("UNABLE TO FIND {}".format(self.aria2c_exe))
|
||||
exit(-1)
|
||||
if aria != 0:
|
||||
raise aria2Error("Aria2c exited with code {}".format(aria))
|
||||
|
||||
self.logger.info("\nJoining files...")
|
||||
openfile = open(output, "wb")
|
||||
total = int(len(segments_location))
|
||||
for current, fragment in enumerate(segments_location):
|
||||
if os.path.isfile(fragment):
|
||||
if fixbytes:
|
||||
with open(fragment, "rb") as f:
|
||||
wvdll = f.read()
|
||||
if (
|
||||
re.search(
|
||||
b"tfhd\x00\x02\x00\x1a\x00\x00\x00\x01\x00\x00\x00\x02",
|
||||
wvdll,
|
||||
re.MULTILINE | re.DOTALL,
|
||||
)
|
||||
is not None
|
||||
):
|
||||
fw = open(fragment, "wb")
|
||||
m = re.search(
|
||||
b"tfhd\x00\x02\x00\x1a\x00\x00\x00\x01\x00\x00\x00",
|
||||
wvdll,
|
||||
re.MULTILINE | re.DOTALL,
|
||||
)
|
||||
segment_fixed = (
|
||||
wvdll[: m.end()] + b"\x01" + wvdll[m.end() + 1 :]
|
||||
)
|
||||
fw.write(segment_fixed)
|
||||
fw.close()
|
||||
shutil.copyfileobj(open(fragment, "rb"), openfile)
|
||||
os.remove(fragment)
|
||||
self.ripprocess.updt(total, current + 1)
|
||||
openfile.close()
|
||||
|
||||
if os.path.isfile(txt):
|
||||
os.remove(txt)
|
||||
if os.path.exists(folder):
|
||||
shutil.rmtree(folder)
|
||||
|
||||
def aria2Debug(self, txt):
|
||||
if self.debug:
|
||||
self.logger.info(txt)
|
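`aria2DownloadDash` hands aria2c a text input file in which each URL line is followed by indented `out=` / `dir=` per-file option lines. A small sketch of just that file format (the segment URLs and folder name are placeholders):

```python
def build_input_file(segments, folder):
    """Render aria2c's --input-file format: URL, then indented per-file options."""
    lines = []
    for num, url in enumerate(segments, start=1):
        segment_name = str(num).zfill(5) + ".mp4"  # 00001.mp4, 00002.mp4, ...
        lines.append(url + "\n out={}\n dir={}\n".format(segment_name, folder))
    return "".join(lines)

text = build_input_file(
    ["https://cdn.example.com/seg1.m4s", "https://cdn.example.com/seg2.m4s"],
    "output",
)
```

The zero-padded names matter: the joining loop later concatenates the fragments in lexicographic order, so `00001.mp4` must sort before `00010.mp4`.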
BIN  helpers/bin/BeHappy/plugins32/LSMASHSource.dll  Normal file
Binary file not shown.
BIN  helpers/bin/BeHappy/plugins32/TimeStretch.dll  Normal file
Binary file not shown.
BIN  helpers/bin/tools/MediaInfo.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/aria2c.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/ffmpeg.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/ffplay.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/ffprobe.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/mkvmerge.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/mp4decrypt.exe  Normal file
Binary file not shown.
BIN  helpers/bin/tools/mp4dump.exe  Normal file
Binary file not shown.
116  helpers/dfxp_to_srt.py  Normal file
@@ -0,0 +1,116 @@
import codecs
import math
import os
import re


class dfxp_to_srt:
    def __init__(self):
        self.__replace__ = "empty_line"

    def leading_zeros(self, value, digits=2):
        value = "000000" + str(value)
        return value[-digits:]

    def convert_time(self, raw_time):
        if int(raw_time) == 0:
            return "{}:{}:{},{}".format(0, 0, 0, 0)

        ms = "000"
        if len(raw_time) > 4:
            ms = self.leading_zeros(int(raw_time[:-4]) % 1000, 3)
        time_in_seconds = int(raw_time[:-7]) if len(raw_time) > 7 else 0
        second = self.leading_zeros(time_in_seconds % 60)
        minute = self.leading_zeros(int(math.floor(time_in_seconds / 60)) % 60)
        hour = self.leading_zeros(int(math.floor(time_in_seconds / 3600)))
        return "{}:{}:{},{}".format(hour, minute, second, ms)

    def xml_id_display_align_before(self, text):
        align_before_re = re.compile(
            r'<region.*tts:displayAlign="before".*xml:id="(.*)"/>'
        )
        has_align_before = re.search(align_before_re, text)
        if has_align_before:
            return has_align_before.group(1)
        return ""

    def xml_to_srt(self, text):
        def append_subs(start, end, prev_content, format_time):
            subs.append(
                {
                    "start_time": self.convert_time(start) if format_time else start,
                    "end_time": self.convert_time(end) if format_time else end,
                    "content": "\n".join(prev_content),
                }
            )

        display_align_before = self.xml_id_display_align_before(text)
        begin_re = re.compile(r"\s*<p begin=")
        sub_lines = (l for l in text.split("\n") if re.search(begin_re, l))
        subs = []
        prev_time = {"start": 0, "end": 0}
        prev_content = []
        start = end = ""
        start_re = re.compile(r'begin="([0-9:\.]*)')
        end_re = re.compile(r'end="([0-9:\.]*)')
        content_re = re.compile(r'">(.*)</p>')

        # span tags are only used for italics, so we'll get rid of them
        # and replace them by <i> and </i>, which is the standard for .srt files
        span_start_re = re.compile(r'(<span style="[a-zA-Z0-9_.]+">)+')
        span_end_re = re.compile(r"(</span>)+")
        br_re = re.compile(r"(<br\s*\/?>)+")
        fmt_t = True
        for s in sub_lines:
            span_start_tags = re.search(span_start_re, s)
            if span_start_tags:
                s = "<i>".join(s.split(span_start_tags.group()))
            string_region_re = (
                r'<p(.*region="' + display_align_before + r'".*")>(.*)</p>'
            )
            s = re.sub(string_region_re, r"<p\1>{\\an8}\2</p>", s)
            content = re.search(content_re, s).group(1)

            br_tags = re.search(br_re, content)
            if br_tags:
                content = "\n".join(content.split(br_tags.group()))

            span_end_tags = re.search(span_end_re, content)
            if span_end_tags:
                content = "</i>".join(content.split(span_end_tags.group()))

            prev_start = prev_time["start"]
            start = re.search(start_re, s).group(1)
            end = re.search(end_re, s).group(1)
            if len(start.split(":")) > 1:
                fmt_t = False
                start = start.replace(".", ",")
                end = end.replace(".", ",")
            if (prev_start == start and prev_time["end"] == end) or not prev_start:
                # Fix for multiple lines starting at the same time
                prev_time = {"start": start, "end": end}
                prev_content.append(content)
                continue
            append_subs(prev_time["start"], prev_time["end"], prev_content, fmt_t)
            prev_time = {"start": start, "end": end}
            prev_content = [content]
        append_subs(start, end, prev_content, fmt_t)

        lines = (
            "{}\n{} --> {}\n{}\n".format(
                s + 1, subs[s]["start_time"], subs[s]["end_time"], subs[s]["content"]
            )
            for s in range(len(subs))
        )
        return "\n".join(lines)

    def convert(self, Input, Output):
        with codecs.open(Input, "rb", "utf-8") as f:
            text = f.read()

        with codecs.open(Output, "wb", "utf-8") as f:
            f.write(self.xml_to_srt(text))

        return
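`convert_time` treats its input as a count of 100 ns ticks (10^7 per second): the last seven digits are the fractional second, of which the top three are milliseconds. The same arithmetic, written with integer division on an int rather than string slicing:

```python
def ticks_to_srt(raw_ticks: int) -> str:
    """Convert a 100 ns tick count to an SRT 'HH:MM:SS,mmm' timestamp."""
    seconds = raw_ticks // 10**7          # whole seconds
    ms = (raw_ticks // 10**4) % 1000      # milliseconds within the second
    return "{:02d}:{:02d}:{:02d},{:03d}".format(
        seconds // 3600, (seconds // 60) % 60, seconds % 60, ms
    )
```

This is a hedged restatement of the slicing logic above, not a drop-in replacement: the class version also special-cases a zero input and already-formatted `HH:MM:SS.mmm` strings.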
76  helpers/keyloader.py  Normal file
@@ -0,0 +1,76 @@
import os, json, sys
from helpers.ripprocess import ripprocess


class keysaver:
    def __init__(self, **kwargs):
        self.keys_file = kwargs.get("keys_file", None)
        self.stored = self.get_stored()

    def read_(self):
        with open(self.keys_file, "r") as fr:
            return json.load(fr)

    def write_(self, data):
        with open(self.keys_file, "w") as fr:
            fr.write(json.dumps(data, indent=4))

    def get_stored(self):
        stored = []
        if self.keys_file and os.path.isfile(self.keys_file):
            return self.read_()
        return stored

    def formatting(self, keys_list, pssh, name):
        return [
            {
                "NAME": name,
                "PSSH": pssh,
                "ID": idx,
                "KID": key.split(":")[0],
                "KEY": key.split(":")[1],
            }
            for idx, key in enumerate(keys_list, start=1)
        ]

    def dump_keys(self, keys, pssh=None, name=None):
        old_keys = list(self.stored)
        new_keys = list(self.formatting(keys, pssh, name))
        self.write_(old_keys + new_keys)
        self.stored = self.get_stored()  # to update stored keys

        return new_keys

    def get_key_by_pssh(self, pssh):
        keys = []
        added = set()
        for key in self.get_stored():  # read file again...
            if key["PSSH"]:
                if key["KEY"] not in added and pssh in key["PSSH"]:
                    keys.append(key)
                    added.add(key["KEY"])

        return keys

    def get_key_by_kid(self, kid):
        keys = []
        added = set()
        for key in self.get_stored():  # read file again...
            if key["KEY"] not in added and key["KID"] == kid:
                keys.append(key)
                added.add(key["KEY"])

        return keys

    def generate_kid(self, encrypted_file):
        return ripprocess().getKeyId(encrypted_file)

    def set_keys(self, keys, no_kid=False):
        command_keys = []
        for key in keys:
            command_keys.append("--key")
            command_keys.append(
                "{}:{}".format(key["ID"] if no_kid else key["KID"], key["KEY"])
            )

        return command_keys
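`keysaver` stores each `KID:KEY` string as a record and later re-emits it as repeated `--key` arguments, the shape a decrypter CLI such as mp4decrypt expects. A self-contained sketch of both directions (the sample hex values are made up):

```python
def parse_keys(keys_list, pssh=None, name=None):
    """Split 'KID:KEY' strings into records, mirroring keysaver.formatting."""
    return [
        {"NAME": name, "PSSH": pssh, "ID": idx,
         "KID": key.split(":")[0], "KEY": key.split(":")[1]}
        for idx, key in enumerate(keys_list, start=1)
    ]

def to_decrypt_args(keys):
    """Flatten records back into ['--key', 'KID:KEY', ...] argv fragments."""
    args = []
    for key in keys:
        args += ["--key", "{}:{}".format(key["KID"], key["KEY"])]
    return args

records = parse_keys(["abcd1234:ffff0000"], pssh="AAAA", name="demo")
args = to_decrypt_args(records)
```

Keeping the PSSH and a human-readable name alongside each key is what makes the later `get_key_by_pssh` / `get_key_by_kid` lookups possible without re-fetching licenses.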
112  helpers/proxy_environ.py  Normal file
@@ -0,0 +1,112 @@
import os
import requests
import sys, json
import random
from configs.config import tool
from helpers.vpn import connect
import logging


class hold_proxy(object):
    def __init__(self):
        self.proxy = os.environ.get("http_proxy")
        self.logger = logging.getLogger(__name__)

    def disable(self):
        os.environ["http_proxy"] = ""
        os.environ["HTTP_PROXY"] = ""
        os.environ["https_proxy"] = ""
        os.environ["HTTPS_PROXY"] = ""

    def enable(self):
        if self.proxy:
            os.environ["http_proxy"] = self.proxy
            os.environ["HTTP_PROXY"] = self.proxy
            os.environ["https_proxy"] = self.proxy
            os.environ["HTTPS_PROXY"] = self.proxy


class proxy_env(object):
    def __init__(self, args):
        self.logger = logging.getLogger(__name__)
        self.args = args
        self.vpn = tool().vpn()

    def Load(self):
        proxies = None
        proxy = {}
        aria2c_proxy = []

        if self.vpn["proxies"]:
            proxies = self.vpn["proxies"]

        if not self.vpn["proxies"]:
            if self.args.privtvpn:
                self.logger.info("Proxy Status: Activated-PrivateVpn")
                proxy.update({"port": self.vpn["private"]["port"]})
                proxy.update({"user": self.vpn["private"]["email"]})
                proxy.update({"pass": self.vpn["private"]["passwd"]})

                if "pvdata.host" in self.args.privtvpn:
                    proxy.update({"host": self.args.privtvpn})
                else:
                    proxy.update(
                        {"host": connect(code=self.args.privtvpn).privateVPN()}
                    )

                proxies = self.vpn["private"]["http"].format(
                    email=proxy["user"],
                    passwd=proxy["pass"],
                    ip=proxy["host"],
                    port=proxy["port"],
                )
            else:
                if self.args.nordvpn:
                    self.logger.info("Proxy Status: Activated-NordVpn")
                    proxy.update({"port": self.vpn["nordvpn"]["port"]})
                    proxy.update({"user": self.vpn["nordvpn"]["email"]})
                    proxy.update({"pass": self.vpn["nordvpn"]["passwd"]})

                    if "nordvpn.com" in self.args.nordvpn:
                        proxy.update({"host": self.args.nordvpn})
                    else:
                        proxy.update(
                            {"host": connect(code=self.args.nordvpn).nordVPN()}
                        )

                    proxies = self.vpn["nordvpn"]["http"].format(
                        email=proxy["user"],
                        passwd=proxy["pass"],
                        ip=proxy["host"],
                        port=proxy["port"],
                    )
                else:
                    self.logger.info("Proxy Status: Off")

        if proxy.get("host"):
            aria2c_proxy.append(
                "--https-proxy={}:{}".format(proxy.get("host"), proxy.get("port"))
            )
        if proxy.get("user"):
            aria2c_proxy.append("--https-proxy-user={}".format(proxy.get("user")))
        if proxy.get("pass"):
            aria2c_proxy.append("--https-proxy-passwd={}".format(proxy.get("pass")))

        if proxies:
            os.environ["http_proxy"] = proxies
            os.environ["HTTP_PROXY"] = proxies
            os.environ["https_proxy"] = proxies
            os.environ["HTTPS_PROXY"] = proxies

        ip = None

        try:
            self.logger.info("Getting IP...")
            r = requests.get("https://ipinfo.io/json", timeout=5)
            data = r.json()
            ip = f'{data["ip"]} ({data["country"]})'
        except Exception as e:
            self.logger.info(f"({e.__class__.__name__}: {e})")
            sys.exit(1)

        return aria2c_proxy, ip
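`hold_proxy` works because child processes (and `requests`) read the proxy from these four environment variables: snapshot the value on construction, blank the variables to bypass the proxy, and restore the snapshot afterwards. A self-contained sketch of that round trip (the proxy URL is a placeholder):

```python
import os

_PROXY_VARS = ("http_proxy", "HTTP_PROXY", "https_proxy", "HTTPS_PROXY")

class ProxyHolder:
    """Snapshot http_proxy at creation; disable() blanks all four vars, enable() restores."""

    def __init__(self):
        self.proxy = os.environ.get("http_proxy")

    def disable(self):
        for var in _PROXY_VARS:
            os.environ[var] = ""

    def enable(self):
        if self.proxy:
            for var in _PROXY_VARS:
                os.environ[var] = self.proxy

os.environ["http_proxy"] = "http://127.0.0.1:8080"
holder = ProxyHolder()   # snapshot taken here
holder.disable()
cleared = os.environ["http_proxy"]
holder.enable()
restored = os.environ["http_proxy"]
```

Note the snapshot is taken in `__init__`, so a holder must be created while the proxy is still set; creating it after `disable()` would snapshot the blanked value.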
132  helpers/pssh_generator.py  Normal file
@@ -0,0 +1,132 @@
from utils.modules.pymp4.parser import Box
from io import BytesIO
import base64
import requests
import uuid
import binascii
import subprocess
import logging
import json


class pssh_generator(object):
    def __init__(self, init, **kwargs):
        self.init = init
        self.logger = logging.getLogger(__name__)
        self.proxies = kwargs.get("proxies", None)
        self.mp4dumpexe = kwargs.get("mp4dumpexe", None)

    def from_kid(self):
        array_of_bytes = bytearray(b"\x00\x00\x002pssh\x00\x00\x00\x00")
        array_of_bytes.extend(bytes.fromhex("edef8ba979d64acea3c827dcd51d21ed"))
        array_of_bytes.extend(b"\x00\x00\x00\x12\x12\x10")
        array_of_bytes.extend(bytes.fromhex(self.init.replace("-", "")))
        pssh = base64.b64encode(bytes.fromhex(array_of_bytes.hex()))
        return pssh.decode()

    def Get_PSSH(self):
        WV_SYSTEM_ID = "[ed ef 8b a9 79 d6 4a ce a3 c8 27 dc d5 1d 21 ed]"
        pssh = None
        data = subprocess.check_output(
            [self.mp4dumpexe, "--format", "json", "--verbosity", "1", self.init]
        )
        data = json.loads(data)
        for atom in data:
            if atom["name"] == "moov":
                for child in atom["children"]:
                    if child["name"] == "pssh":
                        if child["system_id"] == WV_SYSTEM_ID:
                            pssh = child["data"][1:-1].replace(" ", "")
                            pssh = binascii.unhexlify(pssh)
                            if pssh.startswith(b"\x08\x01"):
                                pssh = pssh[0:]
                            pssh = base64.b64encode(pssh).decode("utf-8")
                            return pssh

        if not pssh:
            self.logger.error("Error while generating pssh from file.")
        return pssh

    def get_moov_pssh(self, moov):
        while True:
            x = Box.parse_stream(moov)
            if x.type == b"moov":
                for y in x.children:
                    if y.type == b"pssh" and y.system_ID == uuid.UUID(
                        "edef8ba9-79d6-4ace-a3c8-27dcd51d21ed"
                    ):
                        data = base64.b64encode(y.init_data)
                        return data

    def build_init_segment_mp4(self, bytes_):
        moov = BytesIO(bytes_)
        data = self.get_moov_pssh(moov)
        pssh = data.decode("utf-8")

        return pssh

    def getInitWithRange2(self, headers):
        initbytes = requests.get(url=self.init, proxies=self.proxies, headers=headers)

        try:
            pssh = self.build_init_segment_mp4(initbytes.content)
            return pssh
        except Exception as e:
            self.logger.info("Error: " + str(e))

        return None

    def getInitWithRange(self, start: int, end: int):
        initbytes = requests.get(
            url=self.init,
            proxies=self.proxies,
            headers={"Range": "bytes={}-{}".format(start, end)},
        )

        try:
            pssh = self.build_init_segment_mp4(initbytes.content)
            return pssh
        except Exception as e:
            self.logger.info("Error: " + str(e))

        return None

    def loads(self):
        req = requests.get(url=self.init, proxies=self.proxies)

        initbytes = req.content

        try:
            pssh = self.build_init_segment_mp4(initbytes)
            return pssh
        except Exception as e:
            self.logger.error("Error: " + str(e))

        return None

    def load(self):
        with open(self.init, "rb") as f:
            initbytes = f.read()

        try:
            pssh = self.build_init_segment_mp4(initbytes)
            return pssh
        except Exception as e:
            self.logger.error("Error: " + str(e))

        return None

    def from_str(self):
        initbytes = self.init

        try:
            pssh = self.build_init_segment_mp4(initbytes)
            return pssh
        except Exception as e:
            self.logger.info("Error: " + str(e))

        return None
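`from_kid` concatenates a fixed 50-byte Widevine PSSH v0 box: a 12-byte box header, the 16-byte Widevine system ID, a 4-byte data-size field, a 2-byte protobuf prefix (field 2, length 16), and the 16-byte KID. The same layout spelled out field by field (the sample KID is arbitrary):

```python
import base64

WIDEVINE_SYSTEM_ID = bytes.fromhex("edef8ba979d64acea3c827dcd51d21ed")

def pssh_from_kid(kid_hex: str) -> str:
    """Build a base64 Widevine PSSH v0 box around one 16-byte KID."""
    kid = bytes.fromhex(kid_hex.replace("-", ""))
    box = (
        b"\x00\x00\x002"          # box size: 0x32 = 50 bytes
        + b"pssh"                  # box type
        + b"\x00\x00\x00\x00"      # version 0, flags 0
        + WIDEVINE_SYSTEM_ID       # 16-byte DRM system ID
        + b"\x00\x00\x00\x12"      # init-data size: 18 bytes
        + b"\x12\x10"              # protobuf tag: field 2 (key_id), length 16
        + kid                      # the KID itself
    )
    return base64.b64encode(box).decode()

raw = base64.b64decode(pssh_from_kid("00112233-4455-6677-8899-aabbccddeeff"))
```

This matches the opaque byte string in `from_kid` above; writing the fields out makes clear why only 16-byte KIDs fit, since both the box size and the protobuf length are hard-coded.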
820  helpers/ripprocess.py  Normal file
@@ -0,0 +1,820 @@
import ffmpy, json, os, sys, unidecode, requests, subprocess, time, pycountry, html, tqdm, re, glob, base64, binascii
from titlecase import titlecase
from configs.config import tool
from helpers.proxy_environ import hold_proxy
import tldextract
from collections import namedtuple
from collections.abc import Sequence
from natsort import natsorted
import logging
import unicodedata, string


class EpisodesNumbersHandler:
    def __init__(self):
        return

    def numberRange(self, start: int, end: int):
        if list(range(start, end + 1)) != []:
            return list(range(start, end + 1))

        if list(range(end, start + 1)) != []:
            return list(range(end, start + 1))

        return [start]

    def ListNumber(self, Number: str):
        if Number.isdigit():
            return [int(Number)]

        if Number.strip() == "~" or Number.strip() == "":
            return self.numberRange(1, 999)

        if "-" in Number:
            start, end = Number.split("-")
            if start.strip() == "" or end.strip() == "":
                raise ValueError("wrong Number: {}".format(Number))
            return self.numberRange(int(start), int(end))

        if "~" in Number:
            start, _ = Number.split("~")
            if start.strip() == "":
                raise ValueError("wrong Number: {}".format(Number))
            return self.numberRange(int(start), 999)

        return

    def sortNumbers(self, Numbers):
        SortedNumbers = []
        for Number in Numbers.split(","):
            SortedNumbers += self.ListNumber(Number.strip())

        return natsorted(list(set(SortedNumbers)))

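`EpisodesNumbersHandler` turns a spec like `"1-3, 7"` into a sorted, de-duplicated list of episode numbers. A compact sketch of the same grammar (plain `sorted` stands in for `natsorted`, which behaves identically on integers; the `upper` bound replaces the hard-coded 999):

```python
def parse_episodes(spec: str, upper: int = 999):
    """Parse '3', '1-4', '5~' (5 through upper) and '~' (everything), comma-separated."""
    numbers = []
    for part in (p.strip() for p in spec.split(",")):
        if part.isdigit():
            numbers.append(int(part))                     # single episode
        elif part in ("", "~"):
            numbers.extend(range(1, upper + 1))           # open range: everything
        elif "-" in part:
            start, end = (int(x) for x in part.split("-"))
            numbers.extend(range(start, end + 1))         # closed range
        elif "~" in part:
            numbers.extend(range(int(part.split("~")[0]), upper + 1))  # open tail
    return sorted(set(numbers))

episodes = parse_episodes("1-3, 7")
```

The `set` pass is what makes overlapping specs like `"1-3, 2"` harmless, exactly as `sortNumbers` does above.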
class ripprocess(object):
    def __init__(self):
        self.tool = tool()
        self.logger = logging.getLogger(__name__)
        self.bin = self.tool.bin()

    def sort_list(self, media_list, keyword1=None, keyword2=None):
        if keyword1:
            if keyword2:
                return sorted(
                    media_list, key=lambda k: (int(k[keyword1]), int(k[keyword2]))
                )
            else:
                return sorted(media_list, key=lambda k: int(k[keyword1]))

        return media_list

    def yt2json(self, url, proxies=None):
        jsonfile = "info.info.json"

        yt_cmd = [
            self.bin["youtube"],
            "--skip-download",
            "--write-info-json",
            "--quiet",
            "--no-warnings",
            "-o",
            "info",
            url,
        ]

        if proxies:
            yt_cmd += ["--proxy", proxies.get("https")]

        subprocess.call(yt_cmd)

        while not os.path.isfile(jsonfile):
            time.sleep(0.2)
        with open(jsonfile) as js:
            data = json.load(js)
        if os.path.isfile(jsonfile):
            os.remove(jsonfile)

        return data

    def getKeyId(self, mp4_file):
        data = subprocess.check_output(
            [self.bin["mp4dump"], "--format", "json", "--verbosity", "1", mp4_file]
        )
        try:
            return re.sub(
                " ",
                "",
                re.compile(r"default_KID.*\[(.*)\]").search(data.decode()).group(1),
            )
        except AttributeError:
            return None

    def flatten(self, l):
        return list(self.flatten_g(l))

    def flatten_g(self, l):
        basestring = (str, bytes)
        for el in l:
            if isinstance(el, Sequence) and not isinstance(el, basestring):
                for sub in self.flatten_g(el):
                    yield sub
            else:
                yield el

    def removeExtentsion(self, string: str):
        if "." in string:
            return ".".join(string.split(".")[:-1])
        else:
            raise ValueError("string has no extension: {}".format(string))

    def replaceExtentsion(self, string: str, ext: str):
        if "." in string:
            return ".".join(string.split(".")[:-1]) + f".{ext}"
        else:
            raise ValueError("string has no extension: {}".format(string))

    def domain(self, url):
        return "{0.domain}.{0.suffix}".format(tldextract.extract(url))

    def remove_dups(self, List, keyword=""):
        Added_ = set()
        Proper_ = []
        for L in List:
            if L[keyword] not in Added_:
                Proper_.append(L)
                Added_.add(L[keyword])

        return Proper_

    def find_str(self, s, char):
        index = 0

        if char in s:
            c = char[0]
            for ch in s:
                if ch == c:
                    if s[index : index + len(char)] == char:
                        return index

                index += 1

        return -1

    def updt(self, total, progress):
        barLength, status = 80, ""
        progress = float(progress) / float(total)
        if progress >= 1.0:
            progress, status = 1, "\r\n"
        block = int(round(barLength * progress))
        text = "\rProgress: {} | {:.0f}% {}".format(
            "█" * block + " " * (barLength - block), round(progress * 100, 0), status,
        )
        sys.stdout.write(text)
        sys.stdout.flush()

    def Get_PSSH(self, mp4_file):
        WV_SYSTEM_ID = "[ed ef 8b a9 79 d6 4a ce a3 c8 27 dc d5 1d 21 ed]"
        pssh = None
        data = subprocess.check_output(
            [self.bin["mp4dump"], "--format", "json", "--verbosity", "1", mp4_file]
        )
        data = json.loads(data)
        for atom in data:
            if atom["name"] == "moov":
                for child in atom["children"]:
                    if child["name"] == "pssh":
                        if child["system_id"] == WV_SYSTEM_ID:
                            pssh = child["data"][1:-1].replace(" ", "")
                            pssh = binascii.unhexlify(pssh)
                            if pssh.startswith(b"\x08\x01"):
                                pssh = pssh[0:]
                            pssh = base64.b64encode(pssh).decode("utf-8")
                            return pssh

        return None

    def SubtitleEdit(
        self, contain=None, file=None, removeSDH=False, silent=True, extra_commands=[]
    ):
        if file:
            subtitle_command = [
                self.bin["SubtitleEdit"],
                "/convert",
                file,
                "srt",
                "/overwrite",
                "/multiplereplace:.",
                "/MergeShortLines",
                "/FixCommonErrors",
            ]

            subtitle_command += extra_commands

            if removeSDH:
                subtitle_command.append("/RemoveTextForHI")

            if silent:
                subprocess.call(subtitle_command, stdout=open(os.devnull, "wb"))
            else:
                subprocess.call(subtitle_command)

        if contain:
            subtitle_command = [
                self.bin["SubtitleEdit"],
                "/convert",
                "{}*.srt".format(contain),
                "srt",
                "/overwrite",
                "/multiplereplace:.",
                "/MergeShortLines",
                "/FixCommonErrors",
            ]

            subtitle_command += extra_commands

            if removeSDH:
                subtitle_command.append("/RemoveTextForHI")

            if silent:
                subprocess.call(subtitle_command, stdout=open(os.devnull, "wb"))
            else:
                subprocess.call(subtitle_command)

        return

    def parseCookieFile(self, cookiesfile):
        cookies = {}
        with open(cookiesfile, "r") as fp:
            for line in fp:
                if not re.match(r"^\#", line):
                    lineFields = line.strip().split("\t")
                    try:
                        cookies[lineFields[5]] = lineFields[6]
                    except Exception:
                        pass
        return cookies

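`parseCookieFile` relies on the Netscape cookies.txt layout: tab-separated fields where index 5 is the cookie name and index 6 its value, with `#` comment lines skipped. A sketch against an in-memory sample instead of a file:

```python
def parse_cookie_lines(lines):
    """Extract {name: value} from Netscape-format cookie lines."""
    cookies = {}
    for line in lines:
        if line.startswith("#") or not line.strip():
            continue                          # comments and blanks
        fields = line.strip().split("\t")
        if len(fields) >= 7:                  # domain, flag, path, secure, expiry, name, value
            cookies[fields[5]] = fields[6]
    return cookies

sample = [
    "# Netscape HTTP Cookie File",
    ".example.com\tTRUE\t/\tTRUE\t0\tsession\tabc123",
]
cookies = parse_cookie_lines(sample)
```

The length guard replaces the broad `try/except` in the method above: short or malformed lines are simply skipped either way.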
    # Ordered (old, new) pairs, applied sequentially after lowercasing, so
    # earlier rules win. (The dead "el-gr" -> "he" mapping is corrected to "el".)
    LANGUAGE_FIXES = [
        ("_subtitle_dialog_0", ""), ("_narrative_dialog_0", ""),
        ("_caption_dialog_0", ""), ("_dialog_0", ""),
        ("_descriptive_0", "_descriptive"), ("_sdh", "-sdh"),
        ("es-es", "es"), ("en-es", "es"), ("kn-in", "kn"), ("gu-in", "gu"),
        ("ja-jp", "ja"), ("mni-in", "mni"), ("si-in", "si"), ("as-in", "as"),
        ("ml-in", "ml"), ("sv-se", "sv"), ("hy-hy", "hy"), ("sv-sv", "sv"),
        ("da-da", "da"), ("fi-fi", "fi"), ("nb-nb", "nb"), ("is-is", "is"),
        ("uk-uk", "uk"), ("hu-hu", "hu"), ("bg-bg", "bg"), ("hr-hr", "hr"),
        ("lt-lt", "lt"), ("et-et", "et"), ("el-el", "el"), ("he-he", "he"),
        ("ar-ar", "ar"), ("fa-fa", "fa"), ("ro-ro", "ro"), ("sr-sr", "sr"),
        ("cs-cs", "cs"), ("sk-sk", "sk"), ("mk-mk", "mk"), ("hi-hi", "hi"),
        ("bn-bn", "bn"), ("ur-ur", "ur"), ("pa-pa", "pa"), ("ta-ta", "ta"),
        ("te-te", "te"), ("mr-mr", "mr"), ("kn-kn", "kn"), ("gu-gu", "gu"),
        ("ml-ml", "ml"), ("si-si", "si"), ("as-as", "as"), ("mni-mni", "mni"),
        ("tl-tl", "tl"), ("id-id", "id"), ("ms-ms", "ms"), ("vi-vi", "vi"),
        ("th-th", "th"), ("km-km", "km"), ("ko-ko", "ko"), ("zh-zh", "zh"),
        ("ja-ja", "ja"), ("ru-ru", "ru"), ("tr-tr", "tr"), ("it-it", "it"),
        ("es-mx", "es-la"), ("ar-sa", "ar"), ("zh-cn", "zh"), ("nl-nl", "nl"),
        ("pl-pl", "pl"), ("pt-pt", "pt"), ("hi-in", "hi"), ("mr-in", "mr"),
        ("bn-in", "bn"), ("te-in", "te"), ("cmn-hans", "zh-hans"),
        ("cmn-hant", "zh-hant"), ("ko-kr", "ko"), ("en-au", "en"),
        ("es-419", "es-la"), ("es-us", "es-la"), ("en-us", "en"), ("en-gb", "en"),
        ("fr-fr", "fr"), ("de-de", "de"), ("las-419", "es-la"), ("ar-ae", "ar"),
        ("da-dk", "da"), ("yue-hant", "yue"), ("ur-in", "ur"), ("ta-in", "ta"),
        ("sl-si", "sl"), ("cs-cz", "cs"), ("hi-jp", "hi"), ("-001", ""),
        ("deu", "de"), ("eng", "en"), ("ca-es", "cat"), ("fil-ph", "fil"),
        ("en-ca", "en"), ("eu-es", "eu"), ("ar-eg", "ar"), ("he-il", "he"),
        ("el-gr", "el"), ("nb-no", "nb"), ("es-ar", "es-la"), ("en-ph", "en"),
        ("sq-al", "sq"), ("bs-ba", "bs"),
    ]

    def ReplaceCodeLanguages(self, X):
        X = X.lower()
        for old, new in self.LANGUAGE_FIXES:
            X = X.replace(old, new)

        return X

    def countrycode(self, code, site_domain="None"):
        languageCodes = {
            "zh-Hans": "zhoS",
            "zh-Hant": "zhoT",
            "pt-BR": "brPor",
            "es-ES": "euSpa",
            "en-GB": "enGB",
            "en-PH": "enPH",
            "nl-BE": "nlBE",
            "fil": "enPH",
            "yue": "zhoS",
            "fr-CA": "caFra",
        }

        if code == "cmn-Hans":
            return "Mandarin Chinese (Simplified)", "zh-Hans"
        elif code == "cmn-Hant":
            return "Mandarin Chinese (Traditional)", "zh-Hant"
        elif code == "es-419":
            return "Spanish", "spa"
        elif code == "es-ES":
            return "European Spanish", "euSpa"
        elif code == "pt-BR":
            return "Brazilian Portuguese", "brPor"
        elif code == "pt-PT":
            return "Portuguese", "por"
        elif code == "fr-CA":
            return "French Canadian", "caFra"
        elif code == "fr-FR":
            return "French", "fra"
        elif code == "iw":
            return "Modern Hebrew", "heb"
        elif code == "es" and site_domain == "google":
            return "European Spanish", "euSpa"

        lang_code = code[: code.index("-")] if "-" in code else code
        lang = pycountry.languages.get(alpha_2=lang_code)
        if lang is None:
            lang = pycountry.languages.get(alpha_3=lang_code)

        try:
            languagecode = languageCodes[code]
        except KeyError:
            languagecode = lang.alpha_3

        return lang.name, languagecode

    def tqdm_downloader(self, url, file_name, proxies=None):
        r = requests.get(url, stream=True, proxies=proxies)
        file_size = int(r.headers["Content-Length"])
        chunk = 1
        chunk_size = 1024
        num_bars = int(file_size / chunk_size)

        with open(file_name, "wb") as fp:
            for chunk in tqdm.tqdm(
                r.iter_content(chunk_size=chunk_size),
                total=num_bars,
                unit="KB",
                desc=file_name,
                leave=True,  # progressbar stays
            ):
                fp.write(chunk)

        return

    def silent_aria2c_download(self, url, file_name, disable_proxy=True):
        holder = hold_proxy()

        if disable_proxy:
            holder.disable()

        commands = [
            self.bin["aria2c"],
            url,
            '--user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36"',
            "--allow-overwrite=true",
            "--auto-file-renaming=false",
            "--retry-wait=5",
            "-x16",
            "-j16",
            "-s16",
            "-o",
            file_name,
        ]

        try:
            aria = subprocess.call(commands, stdout=open(os.devnull, "wb"))
        except FileNotFoundError:
            self.logger.info("UNABLE TO FIND {}".format("aria2c.exe"))
            exit(-1)
        if aria != 0:
            raise ValueError("Aria2c exited with code {}".format(aria))

        if disable_proxy:
            holder.enable()

    def aria2c_download(self, commands, extra_commands, disable_proxy=False):
        LogFile = self.bin["aria2c"].replace("exe", "log")

        if os.path.isfile(LogFile):
            os.remove(LogFile)

        aria2_commands = []
        aria2_commands.append(self.bin["aria2c"])
        aria2_commands.append("--log={}".format(LogFile))
        aria2_commands += commands + extra_commands

        holder = hold_proxy()

        if disable_proxy:
            holder.disable()

        try:
            aria = subprocess.call(aria2_commands)
        except FileNotFoundError:
            self.logger.info("UNABLE TO FIND {}".format("aria2c.exe"))
            exit(-1)
        if aria != 0:
            self.logger.info("Aria2c exited with code {}".format(aria))
            exit(-1)

        if disable_proxy:
            holder.enable()

        self.logger.info("")

    def isduplelist(self, a, b):
        return set(a) == set(b) and len(a) == len(b)

    def readfile(self, file, lines=False):
        read = ""
        if os.path.isfile(file):
            with open(file, "r") as f:
                if lines:
                    read = f.readlines()
                    return read
                read = f.read()
        else:
            self.logger.info("File: %s, is not found" % file)
            return None

        return read

    def strip(self, inputint, left=True, right=False):
        if left:
            return str(inputint.lstrip("0"))
        if right:
            return str(inputint.rstrip("0"))

        return

    def CleanMyFileNamePlease(self, filename):
        # edit here...
        filename = filename.replace("666", "666")

        ################################################################################################################################
        # dont edit here...
        filename = (
            filename.replace(" ", ".")
            .replace("'", "")
            .replace(",", "")
            .replace("-", "")
            .replace("-.", ".")
            .replace(".-.", ".")
        )
        filename = re.sub(" +", ".", filename)
        for i in range(10):
            filename = re.sub(r"(\.\.)", ".", filename)

        return filename

def RemoveExtraWords(self, name):
|
||||
if re.search("[eE]pisode [0-9]+", name):
|
||||
name = name.replace((re.search("[eE]pisode [0-9]+", name)).group(0), "")
|
||||
|
||||
if re.search(r"(\(.+?)\)", name):
|
||||
name = name.replace(re.search(r"(\(.+?)\)", name).group(), "")
|
||||
|
||||
name = re.sub(" +", " ", name)
|
||||
name = name.strip()
|
||||
name = (
|
||||
name.replace(" : ", " - ")
|
||||
.replace(": ", " - ")
|
||||
.replace(":", " - ")
|
||||
.replace("&", "and")
|
||||
.replace("ó", "o")
|
||||
.replace("*", "x")
|
||||
)
|
||||
|
||||
return name
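A standalone sketch of the same name-scrubbing passes (hypothetical `remove_extra_words` free function; `re.sub` replaces all occurrences, where the method's `str.replace` on the first match is near-equivalent):

```python
import re

def remove_extra_words(name):
    # drop "Episode N" and any parenthesised run, then collapse whitespace
    name = re.sub("[eE]pisode [0-9]+", "", name)
    name = re.sub(r"\(.+?\)", "", name)
    return re.sub(" +", " ", name).strip()

print(remove_extra_words("Dark (2017) Episode 1"))  # Dark
```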

    def DecodeString(self, text):
        for encoding in ("utf-8-sig", "utf-8", "utf-16"):
            try:
                return text.decode(encoding)
            except UnicodeDecodeError:
                continue

        return text.decode("latin-1")
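The decode fallback can be exercised on its own; a minimal sketch as a free function rather than the method above. The final `latin-1` attempt can never fail, since every byte value maps to a code point:

```python
def decode_string(data: bytes) -> str:
    for encoding in ("utf-8-sig", "utf-8", "utf-16"):
        try:
            return data.decode(encoding)
        except UnicodeDecodeError:
            continue
    # latin-1 maps every possible byte, so this cannot raise
    return data.decode("latin-1")

print(decode_string(b"\xff\xfeh\x00i\x00"))  # UTF-16-LE with BOM -> "hi"
```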

    def EncodeString(self, text):
        for encoding in ("utf-8-sig", "utf-8", "utf-16"):
            try:
                return text.encode(encoding)
            except UnicodeEncodeError:  # encode raises UnicodeEncodeError, not UnicodeDecodeError
                continue

        return text.encode("latin-1")

    def clean_text(self, text):
        whitelist = (
            "-_.() %s%s" % (string.ascii_letters, string.digits) + "',&#$%@`~!^&+=[]{}"
        )

        cleaned_text = (
            unicodedata.normalize("NFKD", text).encode("ASCII", "ignore").decode()
        )

        return "".join(c for c in cleaned_text if c in whitelist)
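A self-contained sketch of the same normalize-then-whitelist pass (module-level function, same whitelist as above): NFKD decomposes accented characters so the ASCII encode keeps the base letter, then anything outside the whitelist is dropped.

```python
import string
import unicodedata

def clean_text(text):
    whitelist = "-_.() %s%s" % (string.ascii_letters, string.digits) + "',&#$%@`~!^&+=[]{}"
    # decompose accents, strip non-ASCII, then filter to the whitelist
    cleaned = unicodedata.normalize("NFKD", text).encode("ASCII", "ignore").decode()
    return "".join(c for c in cleaned if c in whitelist)

print(clean_text("Café: 100%"))  # Cafe 100%
```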

    def RemoveCharcters(self, text):
        text = self.EncodeString(text)
        text = self.DecodeString(text)
        text = self.RemoveExtraWords(text)
        text = self.clean_text(text)
        text = unidecode.unidecode(titlecase(text))

        return text

    def do_clean(self, contain, exclude=[], added=[]):
        """contain: substring of the file/files you want to delete.
        exclude: extensions you do NOT want deleted, as a list like ['.sfv', '.whatever'].
        added: extra extensions beyond the defaults, as a list like ['.sfv', '.whatever']."""

        error = []
        extensions = [
            ".mp4",
            ".h265",
            ".h264",
            ".eac3",
            ".m4a",
            ".ac3",
            ".srt",
            ".vtt",
            ".txt",
            ".aac",
            ".m3u8",
            ".mpd",
        ]

        extensions += added

        erased_files = []

        for ext in extensions:
            if ext not in exclude:
                erased_files += glob.glob(contain + f"*{ext}")

        for files in erased_files:
            try:
                os.remove(files)
            except Exception:
                error.append(files)

        if error:
            self.logger.info(
                "some files could not be deleted: "
                + ", ".join(str(x) for x in error)
                + "."
            )

        return

    def mediainfo_(self, file):
        mediainfo_output = subprocess.Popen(
            [self.bin["MediaInfo"], "--Output=JSON", "-f", file],
            stdout=subprocess.PIPE,
        )
        mediainfo_json = json.load(mediainfo_output.stdout)
        return mediainfo_json

    def DemuxAudio(self, inputName, replace_str):
        if os.path.isfile(inputName):
            self.logger.info("\nDemuxing audio...")
            mediainfo = self.mediainfo_(inputName)
            for m in mediainfo["media"]["track"]:
                if m["@type"] == "Audio":
                    codec_name = m["Format"]

            ext = ".ac3"
            if codec_name == "AAC":
                ext = ".m4a"
            elif codec_name == "E-AC-3":
                ext = ".eac3"
            elif codec_name == "AC-3":
                ext = ".ac3"
            elif codec_name == "DTS":
                ext = ".dts"

            outputName = inputName.replace(replace_str, ext)
            self.logger.info("{} -> {}".format(inputName, outputName))
            ff = ffmpy.FFmpeg(
                executable=self.bin["ffmpeg"],
                inputs={inputName: None},
                outputs={outputName: "-c:a copy"},
                global_options="-vn -sn -y -hide_banner -loglevel panic",
            )
            ff.run()
            time.sleep(0.05)
            if os.path.isfile(outputName) and os.path.getsize(outputName) > 1024 * 1024:
                os.remove(inputName)
            self.logger.info("Done!")

        return

    def shaka_decrypt(self, encrypted, decrypted, keys, stream):
        self.logger.info("\nDecrypting: {}".format(encrypted))
        decrypt_command = [
            self.bin["shaka-packager"],
            "--enable_raw_key_decryption",
            "-quiet",
            "input={},stream={},output={}".format(encrypted, stream, decrypted),
        ]

        for key in keys:
            decrypt_command.append("--keys")
            decrypt_command.append(
                "key={}:key_id={}".format(
                    key["KEY"], "00000000000000000000000000000000"
                )
            )

        self.logger.info("\nDecrypting KEYS: ")
        for key in keys:
            self.logger.info("{}:{}".format(key["KID"], key["KEY"]))

        wvdecrypt_process = subprocess.Popen(decrypt_command)
        stdoutdata, stderrdata = wvdecrypt_process.communicate()
        wvdecrypt_process.wait()
        self.logger.info("Done!")

        return True

    def mp4_decrypt(
        self,
        encrypted,
        decrypted,
        keys,
        moded_decrypter=True,
        no_kid=True,
        silent=False,
    ):
        self.logger.info("\nDecrypting: {}".format(encrypted))
        decrypt_command = [
            self.bin["mp4decrypt"]
            if not moded_decrypter
            else self.bin["mp4decrypt_moded"]
        ]
        decrypt_command += ["--show-progress", encrypted, decrypted]

        for key in keys:
            decrypt_command.append("--key")
            decrypt_command.append(
                "{}:{}".format(key["ID"] if no_kid else key["KID"], key["KEY"])
            )

        self.logger.info("\nDecrypting KEYS: ")
        for key in keys:
            self.logger.info(
                "{}:{}".format(key["ID"] if no_kid else key["KID"], key["KEY"])
            )

        if silent:
            wvdecrypt_process = subprocess.Popen(
                decrypt_command, stdout=open(os.devnull, "wb")
            )
        else:
            wvdecrypt_process = subprocess.Popen(decrypt_command)

        stdoutdata, stderrdata = wvdecrypt_process.communicate()
        wvdecrypt_process.wait()
        if wvdecrypt_process.returncode == 0:
            self.logger.info("Done!")
            return True

        return False

    def DemuxVideo(
        self,
        outputVideoTemp,
        outputVideo,
        ffmpeg=False,
        mp4box=False,
        ffmpeg_version="ffmpeg",
    ):
        if ffmpeg:
            self.logger.info("\nRemuxing video...")
            # if not outputVideo.endswith(".h264"):
            #     os.rename(outputVideoTemp, outputVideo)
            #     self.logger.info("Done!")
            #     return True

            ffmpy.FFmpeg(
                executable=self.bin[ffmpeg_version],
                inputs={outputVideoTemp: None},
                outputs={outputVideo: "-c copy"},
                global_options="-y -hide_banner -loglevel panic",
            ).run()
            time.sleep(0.05)
            if (
                os.path.isfile(outputVideo)
                and os.path.getsize(outputVideo) > 1024 * 1024
            ):
                os.remove(outputVideoTemp)
                self.logger.info("Done!")
                return True

        if mp4box:
            self.logger.info("\nRemuxing video...")
            if not outputVideo.endswith(".h264"):
                os.rename(outputVideoTemp, outputVideo)
                self.logger.info("Done!")
                return True

            subprocess.call(
                [
                    self.bin["mp4box"],
                    "-quiet",
                    "-raw",
                    "1",
                    "-out",
                    outputVideo,
                    outputVideoTemp,
                ]
            )
            if (
                os.path.isfile(outputVideo)
                and os.path.getsize(outputVideo) > 1024 * 1024
            ):
                os.remove(outputVideoTemp)
                self.logger.info("Done!")
                return True

        return False
90
helpers/sdh.py
Normal file
@@ -0,0 +1,90 @@
import codecs
import os
import re
import sys

import pysrt


class sdh_remover:
    def __init__(self):
        self.__replace__ = "empty_line"
        self.content = []

    def cleanLine(self, line, regex):
        line = re.sub("</i>", "", line)
        line = re.sub("<i>", "", line)
        if re.search(r"\[(.*)?\n(.*)?\]", line):
            line = re.sub(
                re.search(r"\[(.*)?\n(.*)?\]", line).group(), self.__replace__, line
            )

        if re.search(r"\((.*)?\n(.*)?\)", line):
            line = re.sub(
                re.search(r"\((.*)?\n(.*)?\)", line).group(), self.__replace__, line
            )

        try:
            # is it inside a markup tag?
            match = regex.match(line).group(1)
            tag = re.compile("(<[A-z]+[^>]*>)").match(match).group(1)
            line = re.sub(match, tag + self.__replace__, line)
        except Exception:  # no tag match; fall back to a plain substitution
            try:
                line = re.sub(regex, self.__replace__, line)
            except Exception:
                pass
        return line

    def _save(self, Output):
        file = codecs.open(Output, "w", encoding="utf-8")

        for idx, text in enumerate(self.content, start=1):
            file.write(
                "{}\n{} --> {}\n{}\n\n".format(
                    str(idx), text["start"], text["end"], text["text"].strip()
                )
            )

        file.close()

    def clean(self):
        if not self.content == []:
            temp = self.content
            self.content = []

            for text in temp:
                if text["text"].strip() == self.__replace__:
                    continue
                text.update({"text": re.sub(self.__replace__, "", text["text"])})

                if not text["text"].strip() == "":
                    self.content.append(text)

        return

    def noHI(self, Input=None, Output=None, content=None):
        srt = pysrt.open(Input, encoding="utf-8")
        for idx, line in enumerate(srt, start=1):
            number = str(idx)
            start = line.start
            end = line.end
            text = line.text

            text = self.cleanLine(text, re.compile(r"(\[(.+)?\]|\[(.+)?|^(.+)?\])"))
            text = self.cleanLine(text, re.compile(r"(\((.+)?\)|\((.+)?|^(.+)?\))"))
            text = self.cleanLine(text, re.compile(r"(\[(.+)?\]|\[(.+)?|^(.+)?\])"))
            text = self.cleanLine(
                text,
                re.compile(r"([♩♪♫♭♮♯]+(.+)?[♩♪♫♭♮♯]+|[♩♪♫♭♮♯]+(.+)?|^(.+)?[♩♪♫♭♮♯]+)"),
            )
            text = self.cleanLine(text, re.compile(r"(<font[^>]*>)|(<\/font>)"))

            self.content.append(
                {"number": number, "start": start, "end": end, "text": text}
            )

        self.clean()
        self._save(Output)
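A reduced sketch of the same idea for single-line cues (the class above additionally handles cues spanning line breaks, `<i>`/`<font>` markup, and music-note glyphs): bracketed or parenthesised hearing-impaired cues are replaced with a placeholder, then lines left empty are dropped.

```python
import re

PLACEHOLDER = "empty_line"

def strip_sdh(text):
    # mark bracketed/parenthesised hearing-impaired cues, then drop emptied lines
    text = re.sub(r"\[[^\]]*\]|\([^)]*\)", PLACEHOLDER, text)
    kept = [ln for ln in text.splitlines() if ln.strip() not in ("", PLACEHOLDER)]
    return "\n".join(kept)

print(strip_sdh("[door slams]\nHello there."))  # Hello there.
```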

135
helpers/vpn.py
Normal file
@@ -0,0 +1,135 @@
import os
import requests
import sys
import random
import logging


class connect(object):
    def __init__(self, code):
        self.code = code.lower()
        self.logger = logging.getLogger(__name__)
        self.headers = {
            "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.130 Safari/537.36"
        }

    def nordVPN(self):
        nordvpn_codes = {
            "al": "2",
            "ar": "10",
            "au": "13",
            "at": "14",
            "be": "21",
            "ba": "27",
            "br": "30",
            "bg": "33",
            "ca": "38",
            "cl": "43",
            "cr": "52",
            "hr": "54",
            "cy": "56",
            "cz": "57",
            "dk": "58",
            "eg": "64",
            "ee": "68",
            "fi": "73",
            "fr": "74",
            "ge": "80",
            "de": "81",
            "gr": "84",
            "hk": "97",
            "hu": "98",
            "is": "99",
            "in": "100",
            "id": "101",
            "ie": "104",
            "il": "105",
            "it": "106",
            "jp": "108",
            "lv": "119",
            "lu": "126",
            "my": "131",
            "mx": "140",
            "md": "142",
            "nl": "153",
            "nz": "156",
            "mk": "128",
            "no": "163",
            "ro": "179",
            "pl": "174",
            "si": "197",
            "za": "200",
            "kr": "114",
            "rs": "192",
            "sg": "195",
            "sk": "196",
            "es": "202",
            "se": "208",
            "ch": "209",
            "tw": "211",
            "th": "214",
            "tr": "220",
            "ua": "225",
            "ae": "226",
            "gb": "227",
            "us": "228",
            "vn": "234",
            "uk": "227",
        }
        nord_proxy = {}
        if nordvpn_codes.get(self.code):
            resp = requests.get(
                url="https://nordvpn.com/wp-admin/admin-ajax.php?action=servers_recommendations&filters={%22country_id%22:"
                + nordvpn_codes.get(self.code)
                + "}",
                headers=self.headers,
            )
            nord_proxy = resp.json()[0]["hostname"]
        else:
            self.logger.info(
                self.code
                + " : not listed in country codes, read country.doc for more info"
            )

        return nord_proxy

    def load_privatevpn(self):
        html_file = "html.html"
        hosts = []
        resp = requests.get(
            "https://privatevpn.com/serverlist/", stream=True, headers=self.headers
        )
        resp = str(resp.text)
        resp = resp.replace("<br>", "")

        with open(html_file, "w", encoding="utf8") as file:
            file.write(resp)

        with open(html_file, "r") as file:
            text = file.readlines()

        if os.path.exists(html_file):
            os.remove(html_file)

        for p in text:
            if ".pvdata.host" in p:
                hosts.append(p.strip())

        return hosts

    def privateVPN(self):
        private_proxy = {}
        private_hosts = self.load_privatevpn()
        self.logger.debug("private_hosts: {}".format(private_hosts))
        search_host = [host for host in private_hosts if host[:2] == self.code]
        if search_host:
            self.logger.info(f"Found {len(search_host)} proxies")
            for n, p in enumerate(search_host):
                self.logger.info(f"[{n + 1}] {p}")
            inp = input("\nEnter a proxy number, or hit Enter for a random one: ").strip()
            if inp == "":
                return random.choice(search_host)
            private_proxy = search_host[int(inp) - 1]
        else:
            self.logger.info("No proxies found; you may have entered a wrong code, or the search failed.")

        return private_proxy
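The country filter in `privateVPN` keys on the first two characters of the hostname. A standalone sketch (hypothetical `hosts_for_country` helper; the sample hostnames are invented):

```python
def hosts_for_country(hosts, code):
    # privatevpn hostnames look like "us-1.pvdata.host": country code comes first
    return [h for h in hosts if h[:2] == code]

hosts = ["us-1.pvdata.host", "de-2.pvdata.host", "us-3.pvdata.host"]
print(hosts_for_country(hosts, "us"))  # ['us-1.pvdata.host', 'us-3.pvdata.host']
```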

3
install.requirements.bat
Normal file
@@ -0,0 +1,3 @@
@echo off
pip install -r requirements.txt
pause
136
netflix.py
Normal file
@@ -0,0 +1,136 @@
import argparse, json, os, logging
from configs.config import tool
from helpers.proxy_environ import proxy_env
from datetime import datetime
from services.netflix import netflix
import pyfiglet
from rich import print
from typing import DefaultDict

title = pyfiglet.figlet_format('Netflix 6.0', font='slant')
print(f'[magenta]{title}[/magenta]')
print("BUY private CDM L1 from wvfuck@protonmail.com")
print("Using LenovoTB-X505X")

script_name = "Netflix Download"
script_ver = "6.0.1.0"

if __name__ == "__main__":

    parser = argparse.ArgumentParser(description=f">>> {script_name} {script_ver} <<<")
    parser.add_argument("content", nargs="?", help="Content URL or ID")
    parser.add_argument("-q", dest="customquality", nargs=1, help="configure the video quality.", default=[])
    parser.add_argument("-o", dest="output", help="download all assets to the directory provided.")
    parser.add_argument("-f", dest="output_folder", help="force mux .mkv files to the directory provided", action="store", default=None)
    parser.add_argument("--nv", dest="novideo", help="don't download video", action="store_true")
    parser.add_argument("--na", dest="noaudio", help="don't download audio", action="store_true")
    parser.add_argument("--ns", dest="nosubs", help="don't download subs", action="store_true")
    parser.add_argument("-e", dest="episodeStart", help="start downloading the season from this episode.", default=None)
    parser.add_argument("-s", dest="season", help="start downloading from this season.", default=None)
    parser.add_argument("--keep", dest="keep", help="keep all files after mux; by default all are erased.", action="store_true")
    parser.add_argument("--only-2ch-audio", dest="only_2ch_audio", help="force getting only eac3 2.0 Ch audios.", action="store_true")
    parser.add_argument("--alang", dest="audiolang", nargs="*", help="download only the selected audio languages", default=[])
    parser.add_argument("--AD", '--adlang', dest="AD", nargs="*", help="download only the selected audio-description languages", default=[])
    parser.add_argument("--slang", dest="sublang", nargs="*", help="download only the selected subtitle languages", default=[])
    parser.add_argument("--flang", dest="forcedlang", nargs="*", help="download only the selected forced-subtitle languages", default=[])
    parser.add_argument('-t', "--title", dest="titlecustom", nargs=1, help="customize the title of the show", default=[])
    parser.add_argument('-p', "--prompt", dest="prompt", help="enable the yes/no prompt when URLs are grabbed.", action="store_true")
    parser.add_argument('-keys', "--license", dest="license", help="print all profiles' keys and exit.", action="store_true")
    parser.add_argument("--audio-bitrate", dest="custom_audio_bitrate", nargs=1, help="configure the audio bitrate.", default=[])
    parser.add_argument("--aformat-2ch", "--audio-format-2ch", dest="aformat_2ch", nargs=1, help="configure the 2.0 Ch audio format.", default=[])
    parser.add_argument("--aformat-51ch", "--audio-format-51ch", dest="aformat_51ch", nargs=1, help="configure the 5.1 Ch audio format.", default=[])
    parser.add_argument("--android-login", dest="android_login", help="log in to Netflix using the Android API and save cookies and build.", action="store_true")
    parser.add_argument("--search", action="store", dest="search", help="download using Netflix search for the movie/show.", default=0)
    parser.add_argument("--hevc", dest="hevc", help="return the HEVC profile", action="store_true")
    parser.add_argument("--hdr", dest="hdr", help="return the HDR profile", action="store_true")
    parser.add_argument("--high", dest="video_high", help="return the MSL High video manifest for hpl videos, usually small size and low bitrate.", action="store_true")
    parser.add_argument("--main", dest="video_main", help="return the MSL Main video manifest for mpl videos, usually big size and high bitrate.", action="store_true")
    parser.add_argument("--main480", dest="video_main480", help="return the MSL Main 480p video manifest for mpl videos.", action="store_true")
    parser.add_argument("--check", dest="check", help="hpl vs mpl.", action="store_true")
    parser.add_argument("--all-audios", dest="allaudios", help="download all audios of the movie/show", action="store_true")
    parser.add_argument("--all-forced", dest="allforcedlang", help="download all forced subs of the movie/show", action="store_true")
    parser.add_argument("--no-aria2c", dest="noaria2c", help="don't use aria2c for download; the python downloader will be used.", action="store_true")

    # PROXY
    parser.add_argument("--nrd", action="store", dest="nordvpn", help="add a country for nordvpn proxies.", default=0)
    parser.add_argument("--prv", action="store", dest="privtvpn", help="add a country for privatevpn proxies.", default=0)
    parser.add_argument("--no-dl-proxy", dest="no_download_proxy", help="do not use a proxy while downloading files", action="store_true", default=False)

    # PACK
    parser.add_argument("--gr", dest="muxer_group", help="group name to use, overriding the one in config", action="store", default=None)
    parser.add_argument("--upload", dest="upload_ftp", help="upload the release after packing", action="store_true", default=None)
    parser.add_argument("--pack", dest="muxer_pack", help="pack the release", action="store_true", default=None)
    parser.add_argument("--confirm", dest="confirm_upload", help="ask for confirmation before uploading the packed release", action="store_true", default=None)
    parser.add_argument("--imdb", dest="muxer_imdb", help="imdb id of the title, for packing", action="store", default=None)
    parser.add_argument("--scheme", dest="muxer_scheme", help="set the muxer scheme name", default=None)
    # cleaner
    parser.add_argument("--clean-add", dest="clean_add", nargs="*", help="extra extensions of files to be deleted", default=[])
    parser.add_argument("--clean-exclude", dest="clean_exclude", nargs="*", help="extra extensions of files NOT to be deleted", default=[])
    parser.add_argument("--log-level", default="info", dest="log_level", choices=["debug", "info", "error", "warning"], help="choose the log level")
    parser.add_argument("--log-file", dest="log_file", help="set a log file for debug", default=None)
    args = parser.parse_args()

    start = datetime.now()

    if args.log_file:
        logging.basicConfig(
            filename=args.log_file,
            format="%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s",
            datefmt="%Y-%m-%d %I:%M:%S %p",
            level=logging.DEBUG,
        )

    else:
        if args.log_level.lower() == "info":
            logging.basicConfig(format="%(message)s", level=logging.INFO)
        elif args.log_level.lower() == "debug":
            logging.basicConfig(
                format="%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s",
                datefmt="%Y-%m-%d %I:%M:%S %p",
                level=logging.DEBUG,
            )
        elif args.log_level.lower() == "warning":
            logging.basicConfig(
                format="%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s",
                datefmt="%Y-%m-%d %I:%M:%S %p",
                level=logging.WARNING,
            )
        elif args.log_level.lower() == "error":
            logging.basicConfig(
                format="%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s",
                datefmt="%Y-%m-%d %I:%M:%S %p",
                level=logging.ERROR,
            )
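The level-selection chain could be collapsed with `getattr`, since the four `--log-level` choices match `logging`'s level-constant names; a sketch (hypothetical `configure_logging` helper, same formats as above):

```python
import logging

FMT = "%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s"

def configure_logging(level_name):
    # "debug"/"info"/"warning"/"error" map onto logging.DEBUG/INFO/WARNING/ERROR
    level = getattr(logging, level_name.upper())
    if level == logging.INFO:
        logging.basicConfig(format="%(message)s", level=level)
    else:
        logging.basicConfig(format=FMT, datefmt="%Y-%m-%d %I:%M:%S %p", level=level)

configure_logging("warning")
print(logging.getLogger().level == logging.WARNING)  # True
```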

    logging.getLogger(__name__)

    group = {
        "UPLOAD": args.upload_ftp,
        "IMDB": args.muxer_imdb,
        "SCHEME": args.muxer_scheme,
        "PACK": args.muxer_pack,
        "GROUP": args.muxer_group,
        "CONFIRM": args.confirm_upload,
        "EXTRA_FOLDER": args.output_folder,
    }

    # ~ commands
    proxy, ip = proxy_env(args).Load()
    commands = {"aria2c_extra_commands": proxy, "group": group}
    logging.debug(commands)

    if args.license:
        args.prompt = False

    l = "\n__________________________\n"
    print(
        f"\n-- {script_name} --{l}\nVERSION: {script_ver}{l}\nIP: {ip}{l}"
    )

    netflix_ = netflix(args, commands)
    netflix_.main_netflix()

    print(
        "\nNFripper took {} Sec".format(
            int(float((datetime.now() - start).total_seconds()))
        )
    )  # total seconds
0
pywidevine/cdm/__init__.py
Normal file
407
pywidevine/cdm/cdm.py
Normal file
@@ -0,0 +1,407 @@
import base64

import os
import time
import binascii

from google.protobuf.message import DecodeError
from google.protobuf import text_format

from pywidevine.cdm.formats import wv_proto2_pb2 as wv_proto2
from pywidevine.cdm.session import Session
from pywidevine.cdm.key import Key
from Cryptodome.Random import get_random_bytes
from Cryptodome.Random import random
from Cryptodome.Cipher import PKCS1_OAEP, AES
from Cryptodome.Hash import CMAC, SHA256, HMAC, SHA1
from Cryptodome.PublicKey import RSA
from Cryptodome.Signature import pss
from Cryptodome.Util import Padding
import logging


class Cdm:
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        self.sessions = {}

    def open_session(self, init_data_b64, device, raw_init_data=None, offline=False):
        self.logger.debug(
            "open_session(init_data_b64={}, device={}".format(init_data_b64, device)
        )
        # self.logger.info("opening new cdm session")
        if device.session_id_type == "android":
            # format: 16 random hex digits, a 2-digit counter, then 14 zeros
            rand_ascii = "".join(random.choice("ABCDEF0123456789") for _ in range(16))
            counter = "01"  # this resets regularly, so it's fine to use 01
            rest = "00000000000000"
            session_id = rand_ascii + counter + rest
            session_id = session_id.encode("ascii")
        elif device.session_id_type == "chrome":
            rand_bytes = get_random_bytes(16)
            session_id = rand_bytes
        else:
            # other formats not yet implemented
            self.logger.error("device type is unusable")
            return 1
        if raw_init_data and isinstance(raw_init_data, (bytes, bytearray)):
            # used for NF key exchange, where they don't provide a valid PSSH
            init_data = raw_init_data
            self.raw_pssh = True
        else:
            init_data = self._parse_init_data(init_data_b64)
            self.raw_pssh = False

        if init_data:
            new_session = Session(session_id, init_data, device, offline)
        else:
            self.logger.error("unable to parse init data")
            return 1
        self.sessions[session_id] = new_session
        # self.logger.info("session opened and init data parsed successfully")
        return session_id
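The Android session-ID layout described in the comment (16 random hex digits, a two-digit counter, 14 zeros) can be sketched standalone; stdlib `random` is used here where the CDM imports Cryptodome's:

```python
import random

def android_session_id() -> bytes:
    # 16 random hex digits + 2-digit counter + 14 zeros = 32 ASCII chars
    rand_ascii = "".join(random.choice("ABCDEF0123456789") for _ in range(16))
    counter = "01"  # resets regularly, so a constant is fine
    return (rand_ascii + counter + "0" * 14).encode("ascii")

print(len(android_session_id()))  # 32
```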

    def _parse_init_data(self, init_data_b64):
        parsed_init_data = wv_proto2.WidevineCencHeader()
        try:
            self.logger.debug("trying to parse init_data directly")
            parsed_init_data.ParseFromString(base64.b64decode(init_data_b64)[32:])
        except DecodeError:
            self.logger.debug(
                "unable to parse as-is, trying with removed pssh box header"
            )
            try:
                id_bytes = parsed_init_data.ParseFromString(
                    base64.b64decode(init_data_b64)[32:]
                )
            except DecodeError:
                self.logger.error("unable to parse, unsupported init data format")
                return None
        self.logger.debug("init_data:")
        for line in text_format.MessageToString(parsed_init_data).splitlines():
            self.logger.debug(line)
        return parsed_init_data

    def close_session(self, session_id):
        self.logger.debug("close_session(session_id={})".format(session_id))
        # self.logger.info("closing cdm session")
        if session_id in self.sessions:
            self.sessions.pop(session_id)
            self.logger.info("cdm session closed")
            return 0
        else:
            self.logger.info("session {} not found".format(session_id))
            return 1

    def set_service_certificate(self, session_id, cert_b64):
        self.logger.debug(
            "set_service_certificate(session_id={}, cert={})".format(
                session_id, cert_b64
            )
        )
        # self.logger.info("setting service certificate")

        if session_id not in self.sessions:
            self.logger.error("session id doesn't exist")
            return 1

        session = self.sessions[session_id]

        message = wv_proto2.SignedMessage()

        try:
            message.ParseFromString(base64.b64decode(cert_b64))
        except DecodeError:
            self.logger.error("failed to parse cert as SignedMessage")

        service_certificate = wv_proto2.SignedDeviceCertificate()

        if message.Type:
            self.logger.debug("service cert provided as signedmessage")
            try:
                service_certificate.ParseFromString(message.Msg)
            except DecodeError:
                # self.logger.error("failed to parse service certificate")
                return 1
        else:
            self.logger.debug("service cert provided as signeddevicecertificate")
            try:
                service_certificate.ParseFromString(base64.b64decode(cert_b64))
            except DecodeError:
                # self.logger.error("failed to parse service certificate")
                return 1

        self.logger.debug("service certificate:")
        for line in text_format.MessageToString(service_certificate).splitlines():
            self.logger.debug(line)

        session.service_certificate = service_certificate
        session.privacy_mode = True

        return 0

    def get_license_request(self, session_id):
        self.logger.debug("get_license_request(session_id={})".format(session_id))
        # self.logger.info("getting license request")

        if session_id not in self.sessions:
            self.logger.error("session ID does not exist")
            return 1

        session = self.sessions[session_id]

        # a raw pssh will be treated as bytes and not parsed
        if self.raw_pssh:
            license_request = wv_proto2.SignedLicenseRequestRaw()
        else:
            license_request = wv_proto2.SignedLicenseRequest()
        client_id = wv_proto2.ClientIdentification()

        if not os.path.exists(session.device_config.device_client_id_blob_filename):
            self.logger.error("no client ID blob available for this device")
            return 1

        with open(session.device_config.device_client_id_blob_filename, "rb") as f:
            try:
                cid_bytes = client_id.ParseFromString(f.read())
            except DecodeError:
                self.logger.error("client id failed to parse as protobuf")
                return 1

        self.logger.debug("building license request")
        if not self.raw_pssh:
            license_request.Type = wv_proto2.SignedLicenseRequest.MessageType.Value(
                "LICENSE_REQUEST"
            )
            license_request.Msg.ContentId.CencId.Pssh.CopyFrom(session.init_data)
        else:
            license_request.Type = wv_proto2.SignedLicenseRequestRaw.MessageType.Value(
                "LICENSE_REQUEST"
            )
            license_request.Msg.ContentId.CencId.Pssh = session.init_data  # bytes

        if session.offline:
            license_type = wv_proto2.LicenseType.Value("OFFLINE")
        else:
            license_type = wv_proto2.LicenseType.Value("DEFAULT")
        license_request.Msg.ContentId.CencId.LicenseType = license_type
        license_request.Msg.ContentId.CencId.RequestId = session_id
        license_request.Msg.Type = wv_proto2.LicenseRequest.RequestType.Value("NEW")
        license_request.Msg.RequestTime = int(time.time())
        license_request.Msg.ProtocolVersion = wv_proto2.ProtocolVersion.Value("CURRENT")
        if session.device_config.send_key_control_nonce:
            license_request.Msg.KeyControlNonce = random.randrange(1, 2 ** 31)

        if session.privacy_mode:
            if session.device_config.vmp:
                self.logger.debug("vmp required, adding to client_id")
                self.logger.debug("reading vmp hashes")
                vmp_hashes = wv_proto2.FileHashes()
                with open(session.device_config.device_vmp_blob_filename, "rb") as f:
                    try:
                        vmp_bytes = vmp_hashes.ParseFromString(f.read())
                    except DecodeError:
                        self.logger.error("vmp hashes failed to parse as protobuf")
                        return 1
                client_id._FileHashes.CopyFrom(vmp_hashes)
            self.logger.debug(
                "privacy mode & service certificate loaded, encrypting client id"
            )
            self.logger.debug("unencrypted client id:")
            for line in text_format.MessageToString(client_id).splitlines():
                self.logger.debug(line)
            cid_aes_key = get_random_bytes(16)
            cid_iv = get_random_bytes(16)

            cid_cipher = AES.new(cid_aes_key, AES.MODE_CBC, cid_iv)

            encrypted_client_id = cid_cipher.encrypt(
                Padding.pad(client_id.SerializeToString(), 16)
            )

            service_public_key = RSA.importKey(
                session.service_certificate._DeviceCertificate.PublicKey
            )

            service_cipher = PKCS1_OAEP.new(service_public_key)

            encrypted_cid_key = service_cipher.encrypt(cid_aes_key)
encrypted_client_id_proto = wv_proto2.EncryptedClientIdentification()
|
||||
|
||||
encrypted_client_id_proto.ServiceId = (
|
||||
session.service_certificate._DeviceCertificate.ServiceId
|
||||
)
|
||||
encrypted_client_id_proto.ServiceCertificateSerialNumber = (
|
||||
session.service_certificate._DeviceCertificate.SerialNumber
|
||||
)
|
||||
encrypted_client_id_proto.EncryptedClientId = encrypted_client_id
|
||||
encrypted_client_id_proto.EncryptedClientIdIv = cid_iv
|
||||
encrypted_client_id_proto.EncryptedPrivacyKey = encrypted_cid_key
|
||||
|
||||
license_request.Msg.EncryptedClientId.CopyFrom(encrypted_client_id_proto)
|
||||
else:
|
||||
license_request.Msg.ClientId.CopyFrom(client_id)
|
||||
|
||||
if session.device_config.private_key_available:
|
||||
key = RSA.importKey(
|
||||
open(session.device_config.device_private_key_filename).read()
|
||||
)
|
||||
session.device_key = key
|
||||
else:
|
||||
self.logger.error("need device private key, other methods unimplemented")
|
||||
return 1
|
||||
|
||||
self.logger.debug("signing license request")
|
||||
|
||||
hash = SHA1.new(license_request.Msg.SerializeToString())
|
||||
signature = pss.new(key).sign(hash)
|
||||
|
||||
license_request.Signature = signature
|
||||
|
||||
session.license_request = license_request
|
||||
|
||||
self.logger.debug("license request:")
|
||||
for line in text_format.MessageToString(session.license_request).splitlines():
|
||||
self.logger.debug(line)
|
||||
#self.logger.info("license request created")
|
||||
self.logger.debug(
|
||||
"license request b64: {}".format(
|
||||
base64.b64encode(license_request.SerializeToString())
|
||||
)
|
||||
)
|
||||
return license_request.SerializeToString()
|
||||
|
||||
def provide_license(self, session_id, license_b64):
|
||||
self.logger.debug(
|
||||
"provide_license(session_id={}, license_b64={})".format(
|
||||
session_id, license_b64
|
||||
)
|
||||
)
|
||||
#self.logger.info("decrypting provided license")
|
||||
|
||||
if session_id not in self.sessions:
|
||||
self.logger.error("session does not exist")
|
||||
return 1
|
||||
|
||||
session = self.sessions[session_id]
|
||||
|
||||
if not session.license_request:
|
||||
self.logger.error("generate a license request first!")
|
||||
return 1
|
||||
|
||||
license = wv_proto2.SignedLicense()
|
||||
try:
|
||||
license.ParseFromString(base64.b64decode(license_b64))
|
||||
except DecodeError:
|
||||
self.logger.error("unable to parse license - check protobufs")
|
||||
return 1
|
||||
|
||||
session.license = license
|
||||
|
||||
self.logger.debug("license:")
|
||||
for line in text_format.MessageToString(license).splitlines():
|
||||
self.logger.debug(line)
|
||||
|
||||
self.logger.debug("deriving keys from session key")
|
||||
|
||||
oaep_cipher = PKCS1_OAEP.new(session.device_key)
|
||||
|
||||
session.session_key = oaep_cipher.decrypt(license.SessionKey)
|
||||
|
||||
lic_req_msg = session.license_request.Msg.SerializeToString()
|
||||
|
||||
enc_key_base = b"ENCRYPTION\000" + lic_req_msg + b"\0\0\0\x80"
|
||||
auth_key_base = b"AUTHENTICATION\0" + lic_req_msg + b"\0\0\2\0"
|
||||
|
||||
enc_key = b"\x01" + enc_key_base
|
||||
auth_key_1 = b"\x01" + auth_key_base
|
||||
auth_key_2 = b"\x02" + auth_key_base
|
||||
auth_key_3 = b"\x03" + auth_key_base
|
||||
auth_key_4 = b"\x04" + auth_key_base
|
||||
|
||||
cmac_obj = CMAC.new(session.session_key, ciphermod=AES)
|
||||
cmac_obj.update(enc_key)
|
||||
|
||||
enc_cmac_key = cmac_obj.digest()
|
||||
|
||||
cmac_obj = CMAC.new(session.session_key, ciphermod=AES)
|
||||
cmac_obj.update(auth_key_1)
|
||||
auth_cmac_key_1 = cmac_obj.digest()
|
||||
|
||||
cmac_obj = CMAC.new(session.session_key, ciphermod=AES)
|
||||
cmac_obj.update(auth_key_2)
|
||||
auth_cmac_key_2 = cmac_obj.digest()
|
||||
|
||||
cmac_obj = CMAC.new(session.session_key, ciphermod=AES)
|
||||
cmac_obj.update(auth_key_3)
|
||||
auth_cmac_key_3 = cmac_obj.digest()
|
||||
|
||||
cmac_obj = CMAC.new(session.session_key, ciphermod=AES)
|
||||
cmac_obj.update(auth_key_4)
|
||||
auth_cmac_key_4 = cmac_obj.digest()
|
||||
|
||||
auth_cmac_combined_1 = auth_cmac_key_1 + auth_cmac_key_2
|
||||
auth_cmac_combined_2 = auth_cmac_key_3 + auth_cmac_key_4
|
||||
|
||||
session.derived_keys["enc"] = enc_cmac_key
|
||||
session.derived_keys["auth_1"] = auth_cmac_combined_1
|
||||
session.derived_keys["auth_2"] = auth_cmac_combined_2
|
||||
|
||||
self.logger.debug("verifying license signature")
|
||||
|
||||
lic_hmac = HMAC.new(session.derived_keys["auth_1"], digestmod=SHA256)
|
||||
lic_hmac.update(license.Msg.SerializeToString())
|
||||
|
||||
self.logger.debug(
|
||||
"calculated sig: {} actual sig: {}".format(
|
||||
lic_hmac.hexdigest(), binascii.hexlify(license.Signature)
|
||||
)
|
||||
)
|
||||
|
||||
if lic_hmac.digest() != license.Signature:
|
||||
self.logger.info(
|
||||
"license signature doesn't match - writing bin so they can be debugged"
|
||||
)
|
||||
with open("original_lic.bin", "wb") as f:
|
||||
f.write(base64.b64decode(license_b64))
|
||||
with open("parsed_lic.bin", "wb") as f:
|
||||
f.write(license.SerializeToString())
|
||||
self.logger.info("continuing anyway")
|
||||
|
||||
self.logger.debug("key count: {}".format(len(license.Msg.Key)))
|
||||
for key in license.Msg.Key:
|
||||
if key.Id:
|
||||
key_id = key.Id
|
||||
else:
|
||||
key_id = wv_proto2.License.KeyContainer.KeyType.Name(key.Type).encode(
|
||||
"utf-8"
|
||||
)
|
||||
encrypted_key = key.Key
|
||||
iv = key.Iv
|
||||
type = wv_proto2.License.KeyContainer.KeyType.Name(key.Type)
|
||||
|
||||
cipher = AES.new(session.derived_keys["enc"], AES.MODE_CBC, iv=iv)
|
||||
decrypted_key = cipher.decrypt(encrypted_key)
|
||||
if type == "OPERATOR_SESSION":
|
||||
permissions = []
|
||||
perms = key._OperatorSessionKeyPermissions
|
||||
for (descriptor, value) in perms.ListFields():
|
||||
if value == 1:
|
||||
permissions.append(descriptor.name)
|
||||
# print(permissions)
|
||||
else:
|
||||
permissions = []
|
||||
session.keys.append(
|
||||
Key(key_id, type, Padding.unpad(decrypted_key, 16), permissions)
|
||||
)
|
||||
|
||||
#self.logger.info("decrypted all keys")
|
||||
return 0
|
||||
|
||||
def get_keys(self, session_id):
|
||||
if session_id in self.sessions:
|
||||
return self.sessions[session_id].keys
|
||||
else:
|
||||
self.logger.error("session not found")
|
||||
return 1
|
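The key-derivation inputs built in `provide_license` above follow a fixed layout: a label, a NUL byte, the serialized license-request message, and a 32-bit big-endian output length in bits (128 for the encryption key, 512 for the two auth keys), each prefixed with a one-byte counter. A minimal stdlib-only sketch of just that byte layout (with `lic_req_msg` as a placeholder for the real serialized message):

```python
# Sketch of the CMAC derivation-input layout used above (stdlib only).
# lic_req_msg stands in for the serialized LicenseRequest message.
lic_req_msg = b"<serialized LicenseRequest>"

# label + NUL + message + 32-bit big-endian output length in bits
enc_key_base = b"ENCRYPTION\x00" + lic_req_msg + (128).to_bytes(4, "big")
auth_key_base = b"AUTHENTICATION\x00" + lic_req_msg + (512).to_bytes(4, "big")

# a one-byte counter prefix selects each 16-byte CMAC output block;
# auth_1/auth_2 concatenate two blocks each to form 32-byte HMAC keys
derivation_inputs = {
    "enc": [b"\x01" + enc_key_base],
    "auth_1": [b"\x01" + auth_key_base, b"\x02" + auth_key_base],
    "auth_2": [b"\x03" + auth_key_base, b"\x04" + auth_key_base],
}

assert enc_key_base.endswith(b"\x00\x00\x00\x80")   # 128 bits
assert auth_key_base.endswith(b"\x00\x00\x02\x00")  # 512 bits
```

Each entry in `derivation_inputs` is CMAC'd under the session key in the real code; this sketch only shows how the inputs are framed.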
115
pywidevine/cdm/deviceconfig.py
Normal file
@@ -0,0 +1,115 @@
import os

device_chromecdm_903 = {
    "name": "chromecdm_903",
    "description": "chrome cdm windows 903",
    "security_level": 3,
    "session_id_type": "chrome",
    "private_key_available": True,
    "vmp": False,
    "send_key_control_nonce": False,
}

device_android_general = {
    "name": "android_general",
    "description": "android_general lvl3 security level",
    "security_level": 3,
    "session_id_type": "android",
    "private_key_available": True,
    "vmp": False,
    "send_key_control_nonce": True,
}

devices_available = [
    device_android_general,
    device_chromecdm_903,
]

FILES_FOLDER = "devices"


class DeviceConfig:
    def __init__(self, device):
        self.device_name = device["name"]
        self.description = device["description"]
        self.security_level = device["security_level"]
        self.session_id_type = device["session_id_type"]
        self.private_key_available = device["private_key_available"]
        self.vmp = device["vmp"]
        self.send_key_control_nonce = device["send_key_control_nonce"]
        if "keybox_filename" in device:
            self.keybox_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                device["keybox_filename"],
            )
        else:
            self.keybox_filename = os.path.join(
                os.path.dirname(__file__), FILES_FOLDER, device["name"], "keybox"
            )
        if "device_cert_filename" in device:
            self.device_cert_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                device["device_cert_filename"],
            )
        else:
            self.device_cert_filename = os.path.join(
                os.path.dirname(__file__), FILES_FOLDER, device["name"], "device_cert"
            )
        if "device_private_key_filename" in device:
            self.device_private_key_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                device["device_private_key_filename"],
            )
        else:
            self.device_private_key_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                "device_private_key",
            )
        if "device_client_id_blob_filename" in device:
            self.device_client_id_blob_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                device["device_client_id_blob_filename"],
            )
        else:
            self.device_client_id_blob_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                "device_client_id_blob",
            )
        if "device_vmp_blob_filename" in device:
            self.device_vmp_blob_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                device["device_vmp_blob_filename"],
            )
        else:
            self.device_vmp_blob_filename = os.path.join(
                os.path.dirname(__file__),
                FILES_FOLDER,
                device["name"],
                "device_vmp_blob",
            )

    def __repr__(self):
        return (
            "DeviceConfig(name={}, description={}, security_level={}, session_id_type={}, private_key_available={}, vmp={})"
        ).format(
            self.device_name,
            self.description,
            self.security_level,
            self.session_id_type,
            self.private_key_available,
            self.vmp,
        )
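The filename resolution above repeats one pattern five times: look in the device's folder under `devices/`, use the explicit filename from the device dict when present, else fall back to a conventional default. A small hypothetical helper (not part of the module) condenses the same logic; `resolve_device_file` and its `base_dir` parameter are illustrative names:

```python
import os

FILES_FOLDER = "devices"


def resolve_device_file(device, key, default, base_dir="."):
    """Mimic DeviceConfig's lookup: use device[key] when present,
    otherwise fall back to the conventional default filename."""
    filename = device.get(key, default)
    return os.path.join(base_dir, FILES_FOLDER, device["name"], filename)


device = {"name": "android_general"}
path = resolve_device_file(device, "keybox_filename", "keybox")
# -> ./devices/android_general/keybox (platform-dependent separators)
```

The real class keeps the expanded form, which makes each attribute explicit at the cost of repetition.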
BIN
pywidevine/cdm/devices/LenovoTB_X505X/device_client_id_blob
Normal file
Binary file not shown.
27
pywidevine/cdm/devices/LenovoTB_X505X/device_private_key
Normal file
@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEArk8zHaSrt4Cx/XC6CDU1zDEj3izBIrEtBwa8kJsuh5fO3tol
JhGaarshuWT9qJMOpD6MPs6eJcywENVz4VN1v4KlS8rpWRbzIZ8Tl/oXj8rb2xnV
njqvjgB78Oc/OnPnbSUNlfhepMeIzIviuLMaWzhuNIcSZFrN6fjXqSzNpqO1roGI
QM1HB8UYgAd4hHF0ZEZYuLl5ja4HKcCV7eYMVAWyiGEorOGhFKKRfkXEmPruW9Rd
XWWZ9wEjf6Qlsaf66rdRBQOXEMWP2ChGSz3PQQ7wpA6YpzNCVIAd8OxoPBPakN+V
/13yR23Jyv8S7WzR7vV/TniufteDRH8sJvZMFQIDAQABAoIBABYcX0pLFWovVcSl
nD+GymZ2tEtOR4iIS4Mo6Fn6iVYzXE86AjzYPkt8Jdy/0gpkbxbIBV/BM1/tnUbh
YLPsM5NBtgiNenit19UDugdM9tirXaSnHAkYfiTn7FDkcImQYsKeUOEdcpn54qE5
vF57/6OjHp2kpaFbwGOFyIuB7qtMwqlcYX0vnOttJ0BWwKIIh5gi4yNbciqJKZ/5
P6WbKZVvD7rGsRGYroF35pPTy48cWVGYR1OokXZ35FRgX5QVM1bzP2Q0ExxGB5jT
7hCmA3DJLooKD2Aj9qX9v5ELMT+bhNIX3Q4aKX7xPZOcBPbugRf8Tuq8l4ThOinQ
6eLJaTECgYEA5gsW2SJh8eMV5mLjaW9K6HALhXkeVN4GDzDTLiaTx0VcHXnJC07X
nJLx0piw/mR1VoUlXHjfVbtnAN7mDJstqh/rBxhqQrCkXbS10qoRbIt/T+SuNf1s
WZcfMj0aPMwz9q2jkbfARV3z+fcJETtEbKDUdwm6ketIpOVwMzDVBvECgYEAwfo1
e+X2bzuEGfqmtkFvcrSSLg+OiEhsfgvS0Y2yJEYcUvw+RwH066DBx/tVX1kCFLX8
H0/EwIBycKb07bFBnj4gmbqbB2zpo65d3IErByQfuCzW1IymjwG+s+Tx9qGPfRvG
TxhoznOOuHO72Xb82B8pxe2/qqOAfIV4zlYJf2UCgYEA2ih0D5E6v5DyqNzo+4ku
ycXQN1EIgcVYi7lq3F57UMQnOlDPZyjq8rKsIGLrnyUX3ehA6TQ74GrroPjBw/y5
zpecZMszonEwPylsMQ9VnNGh99tPlyXxRfk5/YPSyQuC0BIVh9Bxx5b1E/3BnJTP
LBFNzGHujAlMiAyKXhGWRJECgYBFtmd8VKQhS7FpKMS7YX7tKWoTtbGS1vxuvE8S
0qrAEJZjWJYFLPXZrNeXyILhFnsB+HlYw3FBgagfRlFmDzs25LsQpJjMrV62XZcM
BTvygBAKP8shbj75zDW+LzyqV1vbKZ02ld4svCkBr05GlFXAUkrQAGbOq54kok9N
UGxvZQKBgQCGr0rGnCBmKYDUkqZm9FXPDcYWtbiud9zlsaW+xk3UR0fFHuGlK6rW
nxV9k5OOJsZL9MUZ4sy8Ob62ToxB23T+02E0QOgzN4tQjqHOUkMO5C/ErT8jFuhY
J9H1SW5VOZxxMW5VP+iOXmieAJ94CbjRSI26MGsE+6aeK3SsmSCGYg==
-----END RSA PRIVATE KEY-----
BIN
pywidevine/cdm/devices/android_general/device_client_id_blob
Normal file
Binary file not shown.
27
pywidevine/cdm/devices/android_general/device_private_key
Normal file
@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEArk8zHaSrt4Cx/XC6CDU1zDEj3izBIrEtBwa8kJsuh5fO3tol
JhGaarshuWT9qJMOpD6MPs6eJcywENVz4VN1v4KlS8rpWRbzIZ8Tl/oXj8rb2xnV
njqvjgB78Oc/OnPnbSUNlfhepMeIzIviuLMaWzhuNIcSZFrN6fjXqSzNpqO1roGI
QM1HB8UYgAd4hHF0ZEZYuLl5ja4HKcCV7eYMVAWyiGEorOGhFKKRfkXEmPruW9Rd
XWWZ9wEjf6Qlsaf66rdRBQOXEMWP2ChGSz3PQQ7wpA6YpzNCVIAd8OxoPBPakN+V
/13yR23Jyv8S7WzR7vV/TniufteDRH8sJvZMFQIDAQABAoIBABYcX0pLFWovVcSl
nD+GymZ2tEtOR4iIS4Mo6Fn6iVYzXE86AjzYPkt8Jdy/0gpkbxbIBV/BM1/tnUbh
YLPsM5NBtgiNenit19UDugdM9tirXaSnHAkYfiTn7FDkcImQYsKeUOEdcpn54qE5
vF57/6OjHp2kpaFbwGOFyIuB7qtMwqlcYX0vnOttJ0BWwKIIh5gi4yNbciqJKZ/5
P6WbKZVvD7rGsRGYroF35pPTy48cWVGYR1OokXZ35FRgX5QVM1bzP2Q0ExxGB5jT
7hCmA3DJLooKD2Aj9qX9v5ELMT+bhNIX3Q4aKX7xPZOcBPbugRf8Tuq8l4ThOinQ
6eLJaTECgYEA5gsW2SJh8eMV5mLjaW9K6HALhXkeVN4GDzDTLiaTx0VcHXnJC07X
nJLx0piw/mR1VoUlXHjfVbtnAN7mDJstqh/rBxhqQrCkXbS10qoRbIt/T+SuNf1s
WZcfMj0aPMwz9q2jkbfARV3z+fcJETtEbKDUdwm6ketIpOVwMzDVBvECgYEAwfo1
e+X2bzuEGfqmtkFvcrSSLg+OiEhsfgvS0Y2yJEYcUvw+RwH066DBx/tVX1kCFLX8
H0/EwIBycKb07bFBnj4gmbqbB2zpo65d3IErByQfuCzW1IymjwG+s+Tx9qGPfRvG
TxhoznOOuHO72Xb82B8pxe2/qqOAfIV4zlYJf2UCgYEA2ih0D5E6v5DyqNzo+4ku
ycXQN1EIgcVYi7lq3F57UMQnOlDPZyjq8rKsIGLrnyUX3ehA6TQ74GrroPjBw/y5
zpecZMszonEwPylsMQ9VnNGh99tPlyXxRfk5/YPSyQuC0BIVh9Bxx5b1E/3BnJTP
LBFNzGHujAlMiAyKXhGWRJECgYBFtmd8VKQhS7FpKMS7YX7tKWoTtbGS1vxuvE8S
0qrAEJZjWJYFLPXZrNeXyILhFnsB+HlYw3FBgagfRlFmDzs25LsQpJjMrV62XZcM
BTvygBAKP8shbj75zDW+LzyqV1vbKZ02ld4svCkBr05GlFXAUkrQAGbOq54kok9N
UGxvZQKBgQCGr0rGnCBmKYDUkqZm9FXPDcYWtbiud9zlsaW+xk3UR0fFHuGlK6rW
nxV9k5OOJsZL9MUZ4sy8Ob62ToxB23T+02E0QOgzN4tQjqHOUkMO5C/ErT8jFuhY
J9H1SW5VOZxxMW5VP+iOXmieAJ94CbjRSI26MGsE+6aeK3SsmSCGYg==
-----END RSA PRIVATE KEY-----
0
pywidevine/cdm/formats/__init__.py
Normal file
466
pywidevine/cdm/formats/wv_proto2.proto
Normal file
@@ -0,0 +1,466 @@
syntax = "proto2";

// from x86 (partial), most of it from the ARM version:
message ClientIdentification {
  enum TokenType {
    KEYBOX = 0;
    DEVICE_CERTIFICATE = 1;
    REMOTE_ATTESTATION_CERTIFICATE = 2;
  }
  message NameValue {
    required string Name = 1;
    required string Value = 2;
  }
  message ClientCapabilities {
    enum HdcpVersion {
      HDCP_NONE = 0;
      HDCP_V1 = 1;
      HDCP_V2 = 2;
      HDCP_V2_1 = 3;
      HDCP_V2_2 = 4;
    }
    optional uint32 ClientToken = 1;
    optional uint32 SessionToken = 2;
    optional uint32 VideoResolutionConstraints = 3;
    optional HdcpVersion MaxHdcpVersion = 4;
    optional uint32 OemCryptoApiVersion = 5;
  }
  required TokenType Type = 1;
  //optional bytes Token = 2; // by default the client treats this as a blob, but it's usually a DeviceCertificate, so for usefulness' sake, I'm replacing it with this one:
  optional SignedDeviceCertificate Token = 2; // use this when parsing, "bytes" when building a client id blob
  repeated NameValue ClientInfo = 3;
  optional bytes ProviderClientToken = 4;
  optional uint32 LicenseCounter = 5;
  optional ClientCapabilities _ClientCapabilities = 6; // how should we deal with duped names? will have to look at proto docs later
  optional FileHashes _FileHashes = 7; // vmp blob goes here
}

message DeviceCertificate {
  enum CertificateType {
    ROOT = 0;
    INTERMEDIATE = 1;
    USER_DEVICE = 2;
    SERVICE = 3;
  }
  required CertificateType Type = 1; // the compiled code reused this as ProvisionedDeviceInfo.WvSecurityLevel, however that is incorrect (the compiler aliased them, as they're identical as a structure)
  optional bytes SerialNumber = 2;
  optional uint32 CreationTimeSeconds = 3;
  optional bytes PublicKey = 4;
  optional uint32 SystemId = 5;
  optional uint32 TestDeviceDeprecated = 6; // is it bool or int?
  optional bytes ServiceId = 7; // service URL for service certificates
}

// missing some references,
message DeviceCertificateStatus {
  enum CertificateStatus {
    VALID = 0;
    REVOKED = 1;
  }
  optional bytes SerialNumber = 1;
  optional CertificateStatus Status = 2;
  optional ProvisionedDeviceInfo DeviceInfo = 4; // where is 3? is it deprecated?
}

message DeviceCertificateStatusList {
  optional uint32 CreationTimeSeconds = 1;
  repeated DeviceCertificateStatus CertificateStatus = 2;
}

message EncryptedClientIdentification {
  required string ServiceId = 1;
  optional bytes ServiceCertificateSerialNumber = 2;
  required bytes EncryptedClientId = 3;
  required bytes EncryptedClientIdIv = 4;
  required bytes EncryptedPrivacyKey = 5;
}

// todo: fill (for this top-level type, it might be impossible/difficult)
enum LicenseType {
  ZERO = 0;
  DEFAULT = 1; // 1 is STREAMING/temporary license; on recent versions may go up to 3 (latest x86); it might be a persist/don't persist type, unconfirmed
  OFFLINE = 2;
}

// todo: fill (for this top-level type, it might be impossible/difficult)
// this is just a guess because these globals got lost, but really, do we need more?
enum ProtocolVersion {
  CURRENT = 21; // don't have symbols for this
}


message LicenseIdentification {
  optional bytes RequestId = 1;
  optional bytes SessionId = 2;
  optional bytes PurchaseId = 3;
  optional LicenseType Type = 4;
  optional uint32 Version = 5;
  optional bytes ProviderSessionToken = 6;
}


message License {
  message Policy {
    optional bool CanPlay = 1; // changed from uint32 to bool
    optional bool CanPersist = 2;
    optional bool CanRenew = 3;
    optional uint32 RentalDurationSeconds = 4;
    optional uint32 PlaybackDurationSeconds = 5;
    optional uint32 LicenseDurationSeconds = 6;
    optional uint32 RenewalRecoveryDurationSeconds = 7;
    optional string RenewalServerUrl = 8;
    optional uint32 RenewalDelaySeconds = 9;
    optional uint32 RenewalRetryIntervalSeconds = 10;
    optional bool RenewWithUsage = 11; // was uint32
  }
  message KeyContainer {
    enum KeyType {
      SIGNING = 1;
      CONTENT = 2;
      KEY_CONTROL = 3;
      OPERATOR_SESSION = 4;
    }
    enum SecurityLevel {
      SW_SECURE_CRYPTO = 1;
      SW_SECURE_DECODE = 2;
      HW_SECURE_CRYPTO = 3;
      HW_SECURE_DECODE = 4;
      HW_SECURE_ALL = 5;
    }
    message OutputProtection {
      enum CGMS {
        COPY_FREE = 0;
        COPY_ONCE = 2;
        COPY_NEVER = 3;
        CGMS_NONE = 0x2A; // PC default!
      }
      optional ClientIdentification.ClientCapabilities.HdcpVersion Hdcp = 1; // it's most likely a copy of the Hdcp version available here, but the compiler optimized it away
      optional CGMS CgmsFlags = 2;
    }
    message KeyControl {
      required bytes KeyControlBlock = 1; // what is this?
      required bytes Iv = 2;
    }
    message OperatorSessionKeyPermissions {
      optional uint32 AllowEncrypt = 1;
      optional uint32 AllowDecrypt = 2;
      optional uint32 AllowSign = 3;
      optional uint32 AllowSignatureVerify = 4;
    }
    message VideoResolutionConstraint {
      optional uint32 MinResolutionPixels = 1;
      optional uint32 MaxResolutionPixels = 2;
      optional OutputProtection RequiredProtection = 3;
    }
    optional bytes Id = 1;
    optional bytes Iv = 2;
    optional bytes Key = 3;
    optional KeyType Type = 4;
    optional SecurityLevel Level = 5;
    optional OutputProtection RequiredProtection = 6;
    optional OutputProtection RequestedProtection = 7;
    optional KeyControl _KeyControl = 8; // duped names, etc
    optional OperatorSessionKeyPermissions _OperatorSessionKeyPermissions = 9; // duped names, etc
    repeated VideoResolutionConstraint VideoResolutionConstraints = 10;
  }
  optional LicenseIdentification Id = 1;
  optional Policy _Policy = 2; // duped names, etc
  repeated KeyContainer Key = 3;
  optional uint32 LicenseStartTime = 4;
  optional uint32 RemoteAttestationVerified = 5; // bool?
  optional bytes ProviderClientToken = 6;
  // there might be more, check with newer versions (I see fields 7-8 in a lic)
  // this appeared in latest x86:
  optional uint32 ProtectionScheme = 7; // type not fully confirmed, but it's likely a fourcc as WidevineCencHeader describes it
}

message LicenseError {
  enum Error {
    INVALID_DEVICE_CERTIFICATE = 1;
    REVOKED_DEVICE_CERTIFICATE = 2;
    SERVICE_UNAVAILABLE = 3;
  }
  //LicenseRequest.RequestType ErrorCode; // clang mismatch
  optional Error ErrorCode = 1;
}

message LicenseRequest {
  message ContentIdentification {
    message CENC {
      //optional bytes Pssh = 1; // the client's definition is opaque, it doesn't care about the contents, but the PSSH has a clear definition that is understood and requested by the server, thus I'll replace it with:
      optional WidevineCencHeader Pssh = 1;
      optional LicenseType LicenseType = 2; // unfortunately the LicenseType symbols are not present; the only acceptable value seems to be 1 (is this persist/don't persist? look into it!)
      optional bytes RequestId = 3;
    }
    message WebM {
      optional bytes Header = 1; // identical to CENC, aside from the PSSH and the parent field number used
      optional LicenseType LicenseType = 2;
      optional bytes RequestId = 3;
    }
    message ExistingLicense {
      optional LicenseIdentification LicenseId = 1;
      optional uint32 SecondsSinceStarted = 2;
      optional uint32 SecondsSinceLastPlayed = 3;
      optional bytes SessionUsageTableEntry = 4; // interesting! try to figure out the connection between the usage table blob and KCB!
    }
    optional CENC CencId = 1;
    optional WebM WebmId = 2;
    optional ExistingLicense License = 3;
  }
  enum RequestType {
    NEW = 1;
    RENEWAL = 2;
    RELEASE = 3;
  }
  optional ClientIdentification ClientId = 1;
  optional ContentIdentification ContentId = 2;
  optional RequestType Type = 3;
  optional uint32 RequestTime = 4;
  optional bytes KeyControlNonceDeprecated = 5;
  optional ProtocolVersion ProtocolVersion = 6; // lacking symbols for this
  optional uint32 KeyControlNonce = 7;
  optional EncryptedClientIdentification EncryptedClientId = 8;
}

// raw pssh hack
message LicenseRequestRaw {
  message ContentIdentification {
    message CENC {
      optional bytes Pssh = 1; // the client's definition is opaque, it doesn't care about the contents, but the PSSH has a clear definition that is understood and requested by the server, thus I'll replace it with:
      //optional WidevineCencHeader Pssh = 1;
      optional LicenseType LicenseType = 2; // unfortunately the LicenseType symbols are not present; the only acceptable value seems to be 1 (is this persist/don't persist? look into it!)
      optional bytes RequestId = 3;
    }
    message WebM {
      optional bytes Header = 1; // identical to CENC, aside from the PSSH and the parent field number used
      optional LicenseType LicenseType = 2;
      optional bytes RequestId = 3;
    }
    message ExistingLicense {
      optional LicenseIdentification LicenseId = 1;
      optional uint32 SecondsSinceStarted = 2;
      optional uint32 SecondsSinceLastPlayed = 3;
      optional bytes SessionUsageTableEntry = 4; // interesting! try to figure out the connection between the usage table blob and KCB!
    }
    optional CENC CencId = 1;
    optional WebM WebmId = 2;
    optional ExistingLicense License = 3;
  }
  enum RequestType {
    NEW = 1;
    RENEWAL = 2;
    RELEASE = 3;
  }
  optional ClientIdentification ClientId = 1;
  optional ContentIdentification ContentId = 2;
  optional RequestType Type = 3;
  optional uint32 RequestTime = 4;
  optional bytes KeyControlNonceDeprecated = 5;
  optional ProtocolVersion ProtocolVersion = 6; // lacking symbols for this
  optional uint32 KeyControlNonce = 7;
  optional EncryptedClientIdentification EncryptedClientId = 8;
}


message ProvisionedDeviceInfo {
  enum WvSecurityLevel {
    LEVEL_UNSPECIFIED = 0;
    LEVEL_1 = 1;
    LEVEL_2 = 2;
    LEVEL_3 = 3;
  }
  optional uint32 SystemId = 1;
  optional string Soc = 2;
  optional string Manufacturer = 3;
  optional string Model = 4;
  optional string DeviceType = 5;
  optional uint32 ModelYear = 6;
  optional WvSecurityLevel SecurityLevel = 7;
  optional uint32 TestDevice = 8; // bool?
}


// todo: fill
message ProvisioningOptions {
}

// todo: fill
message ProvisioningRequest {
}

// todo: fill
message ProvisioningResponse {
}

message RemoteAttestation {
  optional EncryptedClientIdentification Certificate = 1;
  optional string Salt = 2;
  optional string Signature = 3;
}

// todo: fill
message SessionInit {
}

// todo: fill
message SessionState {
}

// todo: fill
message SignedCertificateStatusList {
}

message SignedDeviceCertificate {

  //optional bytes DeviceCertificate = 1; // again, they use a buffer where it's supposed to be a message, so we'll replace it with what it really is:
  optional DeviceCertificate _DeviceCertificate = 1; // how should we deal with duped names? will have to look at proto docs later
  optional bytes Signature = 2;
  optional SignedDeviceCertificate Signer = 3;
}


// todo: fill
message SignedProvisioningMessage {
}

// the root of all messages, from either server or client
message SignedMessage {
  enum MessageType {
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  optional MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  optional bytes Msg = 2; // this has to be cast dynamically, to LicenseRequest, License or LicenseError (? unconfirmed); for a Request, no fields other than Type need to be present
  // for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
  optional bytes Signature = 3; // might be different types of signatures (ex. RSA vs AES CMAC(??), unconfirmed for now)
  optional bytes SessionKey = 4; // often RSA wrapped for licenses
  optional RemoteAttestation RemoteAttestation = 5;
}



// This message is copied from google's docs, not reversed:
message WidevineCencHeader {
  enum Algorithm {
    UNENCRYPTED = 0;
    AESCTR = 1;
  };
  optional Algorithm algorithm = 1;
  repeated bytes key_id = 2;

  // Content provider name.
  optional string provider = 3;

  // A content identifier, specified by content provider.
  optional bytes content_id = 4;

  // Track type. Acceptable values are SD, HD and AUDIO. Used to
  // differentiate content keys used by an asset.
  optional string track_type_deprecated = 5;

  // The name of a registered policy to be used for this asset.
  optional string policy = 6;

  // Crypto period index, for media using key rotation.
  optional uint32 crypto_period_index = 7;

  // Optional protected context for group content. The grouped_license is a
  // serialized SignedMessage.
  optional bytes grouped_license = 8;

  // Protection scheme identifying the encryption algorithm.
  // Represented as one of the following 4CC values:
  // 'cenc' (AESCTR), 'cbc1' (AESCBC),
  // 'cens' (AESCTR subsample), 'cbcs' (AESCBC subsample).
  optional uint32 protection_scheme = 9;

  // Optional. For media using key rotation, this represents the duration
  // of each crypto period in seconds.
  optional uint32 crypto_period_seconds = 10;
}


// remove these when using it outside of protoc:

// from here on, it's just for testing; these messages don't exist in the binaries, I'm adding them to avoid detecting the type programmatically
message SignedLicenseRequest {
  enum MessageType {
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  optional MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  optional LicenseRequest Msg = 2; // this has to be cast dynamically, to LicenseRequest, License or LicenseError (? unconfirmed); for a Request, no fields other than Type need to be present
  // for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
  optional bytes Signature = 3; // might be different types of signatures (ex. RSA vs AES CMAC(??), unconfirmed for now)
  optional bytes SessionKey = 4; // often RSA wrapped for licenses
  optional RemoteAttestation RemoteAttestation = 5;
}

// hack
message SignedLicenseRequestRaw {
  enum MessageType {
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  optional MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  optional LicenseRequestRaw Msg = 2; // this has to be cast dynamically, to LicenseRequest, License or LicenseError (? unconfirmed); for a Request, no fields other than Type need to be present
  // for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
  optional bytes Signature = 3; // might be different types of signatures (ex. RSA vs AES CMAC(??), unconfirmed for now)
  optional bytes SessionKey = 4; // often RSA wrapped for licenses
  optional RemoteAttestation RemoteAttestation = 5;
}


message SignedLicense {
  enum MessageType {
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  optional MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  optional License Msg = 2; // this has to be cast dynamically, to LicenseRequest, License or LicenseError (? unconfirmed); for a Request, no fields other than Type need to be present
|
||||
// for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
|
||||
optional bytes Signature = 3; // might be different type of signatures (ex. RSA vs AES CMAC(??), unconfirmed for now)
|
||||
optional bytes SessionKey = 4; // often RSA wrapped for licenses
|
||||
optional RemoteAttestation RemoteAttestation = 5;
|
||||
}
|
||||
|
||||
message SignedServiceCertificate {
|
||||
enum MessageType {
|
||||
LICENSE_REQUEST = 1;
|
||||
LICENSE = 2;
|
||||
ERROR_RESPONSE = 3;
|
||||
SERVICE_CERTIFICATE_REQUEST = 4;
|
||||
SERVICE_CERTIFICATE = 5;
|
||||
}
|
||||
optional MessageType Type = 1; // has in incorrect overlap with License_KeyContainer_SecurityLevel
|
||||
optional SignedDeviceCertificate Msg = 2; // this has to be casted dynamically, to LicenseRequest, License or LicenseError (? unconfirmed), for Request, no other fields but Type need to be present
|
||||
// for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
|
||||
optional bytes Signature = 3; // might be different type of signatures (ex. RSA vs AES CMAC(??), unconfirmed for now)
|
||||
optional bytes SessionKey = 4; // often RSA wrapped for licenses
|
||||
optional RemoteAttestation RemoteAttestation = 5;
|
||||
}
|
||||
|
||||
//vmp support
|
||||
message FileHashes {
|
||||
message Signature {
|
||||
optional string filename = 1;
|
||||
optional bool test_signing = 2; //0 - release, 1 - testing
|
||||
optional bytes SHA512Hash = 3;
|
||||
optional bool main_exe = 4; //0 for dlls, 1 for exe, this is field 3 in file
|
||||
optional bytes signature = 5;
|
||||
}
|
||||
optional bytes signer = 1;
|
||||
repeated Signature signatures = 2;
|
||||
}
|
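`WidevineCencHeader` above is an ordinary protobuf message, so its `key_id` entries (field 2, length-delimited) can be pulled out of a serialized header even without generated bindings. A minimal sketch with a hand-rolled varint walk, no protobuf dependency; the function names and the hand-built sample bytes are illustrative, not from the repo:

```python
def read_varint(buf, i):
    """Decode a base-128 varint starting at buf[i]; return (value, next_index)."""
    value = shift = 0
    while True:
        b = buf[i]
        i += 1
        value |= (b & 0x7F) << shift
        shift += 7
        if not b & 0x80:
            return value, i

def cenc_key_ids(serialized):
    """Collect all key_id entries (field 2, wire type 2) from a serialized
    WidevineCencHeader, skipping every other field."""
    i, key_ids = 0, []
    while i < len(serialized):
        tag, i = read_varint(serialized, i)
        field, wire = tag >> 3, tag & 7
        if wire == 0:                       # varint (algorithm, crypto_period_index, ...)
            _, i = read_varint(serialized, i)
        elif wire == 2:                     # length-delimited (bytes / string)
            length, i = read_varint(serialized, i)
            if field == 2:
                key_ids.append(serialized[i:i + length])
            i += length
        else:
            raise ValueError("unexpected wire type %d" % wire)
    return key_ids

# Hand-built sample: algorithm=AESCTR (field 1, varint 1) + one 16-byte key_id.
sample = bytes([0x08, 0x01, 0x12, 0x10]) + b"\x00" * 16
```

The same walk generalizes to the other length-delimited fields (`provider`, `content_id`) by switching on `field`.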
3324
pywidevine/cdm/formats/wv_proto2_pb2.py
Normal file
File diff suppressed because one or more lines are too long
389
pywidevine/cdm/formats/wv_proto3.proto
Normal file
@ -0,0 +1,389 @@
// beware: proto3 won't report missing fields, it seems; to get that back, change to "proto2", add "optional" before every field, and remove all the dummy enum members I added:
syntax = "proto3";

// from x86 (partial); most of it from the ARM version:
message ClientIdentification {
  enum TokenType {
    KEYBOX = 0;
    DEVICE_CERTIFICATE = 1;
    REMOTE_ATTESTATION_CERTIFICATE = 2;
  }
  message NameValue {
    string Name = 1;
    string Value = 2;
  }
  message ClientCapabilities {
    enum HdcpVersion {
      HDCP_NONE = 0;
      HDCP_V1 = 1;
      HDCP_V2 = 2;
      HDCP_V2_1 = 3;
      HDCP_V2_2 = 4;
    }
    uint32 ClientToken = 1;
    uint32 SessionToken = 2;
    uint32 VideoResolutionConstraints = 3;
    HdcpVersion MaxHdcpVersion = 4;
    uint32 OemCryptoApiVersion = 5;
  }
  TokenType Type = 1;
  //bytes Token = 2; // by default the client treats this as a blob, but it's usually a DeviceCertificate, so for usefulness' sake I'm replacing it with this one:
  SignedDeviceCertificate Token = 2;
  repeated NameValue ClientInfo = 3;
  bytes ProviderClientToken = 4;
  uint32 LicenseCounter = 5;
  ClientCapabilities _ClientCapabilities = 6; // how should we deal with duplicated names? will have to look at the proto docs later
}

message DeviceCertificate {
  enum CertificateType {
    ROOT = 0;
    INTERMEDIATE = 1;
    USER_DEVICE = 2;
    SERVICE = 3;
  }
  //ProvisionedDeviceInfo.WvSecurityLevel Type = 1; // is this how one is supposed to call it? (it's an enum) there might be a bug here, with CertificateType getting confused with WvSecurityLevel; renaming it for now (verify against other binaries)
  CertificateType Type = 1;
  bytes SerialNumber = 2;
  uint32 CreationTimeSeconds = 3;
  bytes PublicKey = 4;
  uint32 SystemId = 5;
  uint32 TestDeviceDeprecated = 6; // is it bool or int?
  bytes ServiceId = 7; // service URL for service certificates
}

// missing some references,
message DeviceCertificateStatus {
  enum CertificateStatus {
    VALID = 0;
    REVOKED = 1;
  }
  bytes SerialNumber = 1;
  CertificateStatus Status = 2;
  ProvisionedDeviceInfo DeviceInfo = 4; // where is 3? is it deprecated?
}

message DeviceCertificateStatusList {
  uint32 CreationTimeSeconds = 1;
  repeated DeviceCertificateStatus CertificateStatus = 2;
}

message EncryptedClientIdentification {
  string ServiceId = 1;
  bytes ServiceCertificateSerialNumber = 2;
  bytes EncryptedClientId = 3;
  bytes EncryptedClientIdIv = 4;
  bytes EncryptedPrivacyKey = 5;
}

// todo: fill (for this top-level type, it might be impossible/difficult)
enum LicenseType {
  ZERO = 0;
  DEFAULT = 1; // do not know what this is either, but it should be 1; on recent versions it may go up to 3 (latest x86)
}

// todo: fill (for this top-level type, it might be impossible/difficult)
// this is just a guess because these globals got lost, but really, do we need more?
enum ProtocolVersion {
  DUMMY = 0;
  CURRENT = 21; // don't have symbols for this
}


message LicenseIdentification {
  bytes RequestId = 1;
  bytes SessionId = 2;
  bytes PurchaseId = 3;
  LicenseType Type = 4;
  uint32 Version = 5;
  bytes ProviderSessionToken = 6;
}


message License {
  message Policy {
    uint32 CanPlay = 1;
    uint32 CanPersist = 2;
    uint32 CanRenew = 3;
    uint32 RentalDurationSeconds = 4;
    uint32 PlaybackDurationSeconds = 5;
    uint32 LicenseDurationSeconds = 6;
    uint32 RenewalRecoveryDurationSeconds = 7;
    string RenewalServerUrl = 8;
    uint32 RenewalDelaySeconds = 9;
    uint32 RenewalRetryIntervalSeconds = 10;
    uint32 RenewWithUsage = 11;
    uint32 UnknownPolicy12 = 12;
  }
  message KeyContainer {
    enum KeyType {
      _NOKEYTYPE = 0; // dummy, added to satisfy proto3, not present in original
      SIGNING = 1;
      CONTENT = 2;
      KEY_CONTROL = 3;
      OPERATOR_SESSION = 4;
    }
    enum SecurityLevel {
      _NOSECLEVEL = 0; // dummy, added to satisfy proto3, not present in original
      SW_SECURE_CRYPTO = 1;
      SW_SECURE_DECODE = 2;
      HW_SECURE_CRYPTO = 3;
      HW_SECURE_DECODE = 4;
      HW_SECURE_ALL = 5;
    }
    message OutputProtection {
      enum CGMS {
        COPY_FREE = 0;
        COPY_ONCE = 2;
        COPY_NEVER = 3;
        CGMS_NONE = 0x2A; // PC default!
      }
      ClientIdentification.ClientCapabilities.HdcpVersion Hdcp = 1; // most likely a copy of the HdcpVersion available here, but the compiler optimized it away
      CGMS CgmsFlags = 2;
    }
    message KeyControl {
      bytes KeyControlBlock = 1; // what is this?
      bytes Iv = 2;
    }
    message OperatorSessionKeyPermissions {
      uint32 AllowEncrypt = 1;
      uint32 AllowDecrypt = 2;
      uint32 AllowSign = 3;
      uint32 AllowSignatureVerify = 4;
    }
    message VideoResolutionConstraint {
      uint32 MinResolutionPixels = 1;
      uint32 MaxResolutionPixels = 2;
      OutputProtection RequiredProtection = 3;
    }
    bytes Id = 1;
    bytes Iv = 2;
    bytes Key = 3;
    KeyType Type = 4;
    SecurityLevel Level = 5;
    OutputProtection RequiredProtection = 6;
    OutputProtection RequestedProtection = 7;
    KeyControl _KeyControl = 8; // duplicated names, etc.
    OperatorSessionKeyPermissions _OperatorSessionKeyPermissions = 9; // duplicated names, etc.
    repeated VideoResolutionConstraint VideoResolutionConstraints = 10;
  }
  LicenseIdentification Id = 1;
  Policy _Policy = 2; // duplicated names, etc.
  repeated KeyContainer Key = 3;
  uint32 LicenseStartTime = 4;
  uint32 RemoteAttestationVerified = 5; // bool?
  bytes ProviderClientToken = 6;
  // there might be more; check with newer versions (fields 7-8 seen in a license)
  // this appeared in latest x86:
  uint32 ProtectionScheme = 7; // type not fully confirmed, but it's likely a fourcc as WidevineCencHeader describes it
  bytes UnknownHdcpDataField = 8;
}

message LicenseError {
  enum Error {
    DUMMY_NO_ERROR = 0; // dummy, added to satisfy proto3
    INVALID_DEVICE_CERTIFICATE = 1;
    REVOKED_DEVICE_CERTIFICATE = 2;
    SERVICE_UNAVAILABLE = 3;
  }
  //LicenseRequest.RequestType ErrorCode; // clang mismatch
  Error ErrorCode = 1;
}

message LicenseRequest {
  message ContentIdentification {
    message CENC {
      // bytes Pssh = 1; // the client's definition is opaque; it doesn't care about the contents, but the PSSH has a clear definition that is understood and requested by the server, thus I'll replace it with:
      WidevineCencHeader Pssh = 1;
      LicenseType LicenseType = 2; // unfortunately the LicenseType symbols are not present; the only acceptable value seems to be 1
      bytes RequestId = 3;
    }
    message WebM {
      bytes Header = 1; // identical to CENC, aside from the PSSH and the parent field number used
      LicenseType LicenseType = 2;
      bytes RequestId = 3;
    }
    message ExistingLicense {
      LicenseIdentification LicenseId = 1;
      uint32 SecondsSinceStarted = 2;
      uint32 SecondsSinceLastPlayed = 3;
      bytes SessionUsageTableEntry = 4;
    }
    CENC CencId = 1;
    WebM WebmId = 2;
    ExistingLicense License = 3;
  }
  enum RequestType {
    DUMMY_REQ_TYPE = 0; // dummy, added to satisfy proto3
    NEW = 1;
    RENEWAL = 2;
    RELEASE = 3;
  }
  ClientIdentification ClientId = 1;
  ContentIdentification ContentId = 2;
  RequestType Type = 3;
  uint32 RequestTime = 4;
  bytes KeyControlNonceDeprecated = 5;
  ProtocolVersion ProtocolVersion = 6; // lacking symbols for this
  uint32 KeyControlNonce = 7;
  EncryptedClientIdentification EncryptedClientId = 8;
}

message ProvisionedDeviceInfo {
  enum WvSecurityLevel {
    LEVEL_UNSPECIFIED = 0;
    LEVEL_1 = 1;
    LEVEL_2 = 2;
    LEVEL_3 = 3;
  }
  uint32 SystemId = 1;
  string Soc = 2;
  string Manufacturer = 3;
  string Model = 4;
  string DeviceType = 5;
  uint32 ModelYear = 6;
  WvSecurityLevel SecurityLevel = 7;
  uint32 TestDevice = 8; // bool?
}


// todo: fill
message ProvisioningOptions {
}

// todo: fill
message ProvisioningRequest {
}

// todo: fill
message ProvisioningResponse {
}

message RemoteAttestation {
  EncryptedClientIdentification Certificate = 1;
  string Salt = 2;
  string Signature = 3;
}

// todo: fill
message SessionInit {
}

// todo: fill
message SessionState {
}

// todo: fill
message SignedCertificateStatusList {
}

message SignedDeviceCertificate {
  //bytes DeviceCertificate = 1; // again, they use a buffer where it's supposed to be a message, so we'll replace it with what it really is:
  DeviceCertificate _DeviceCertificate = 1; // how should we deal with duplicated names? will have to look at the proto docs later
  bytes Signature = 2;
  SignedDeviceCertificate Signer = 3;
}


// todo: fill
message SignedProvisioningMessage {
}

// the root of all messages, from either server or client
message SignedMessage {
  enum MessageType {
    DUMMY_MSG_TYPE = 0; // dummy, added to satisfy proto3
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  bytes Msg = 2; // has to be cast dynamically to LicenseRequest, License or LicenseError (unconfirmed); for a request, no fields other than Type need to be present
  // for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
  bytes Signature = 3; // may be different kinds of signatures (e.g. RSA vs AES CMAC; unconfirmed for now)
  bytes SessionKey = 4; // often RSA-wrapped for licenses
  RemoteAttestation RemoteAttestation = 5;
}



// This message is copied from google's docs, not reversed:
message WidevineCencHeader {
  enum Algorithm {
    UNENCRYPTED = 0;
    AESCTR = 1;
  };
  Algorithm algorithm = 1;
  repeated bytes key_id = 2;

  // Content provider name.
  string provider = 3;

  // A content identifier, specified by content provider.
  bytes content_id = 4;

  // Track type. Acceptable values are SD, HD and AUDIO. Used to
  // differentiate content keys used by an asset.
  string track_type_deprecated = 5;

  // The name of a registered policy to be used for this asset.
  string policy = 6;

  // Crypto period index, for media using key rotation.
  uint32 crypto_period_index = 7;

  // Optional protected context for group content. The grouped_license is a
  // serialized SignedMessage.
  bytes grouped_license = 8;

  // Protection scheme identifying the encryption algorithm.
  // Represented as one of the following 4CC values:
  // 'cenc' (AESCTR), 'cbc1' (AESCBC),
  // 'cens' (AESCTR subsample), 'cbcs' (AESCBC subsample).
  uint32 protection_scheme = 9;

  // Optional. For media using key rotation, this represents the duration
  // of each crypto period in seconds.
  uint32 crypto_period_seconds = 10;
}



// from here on, it's just for testing; these messages don't exist in the binaries. I'm adding them to avoid detecting the message type programmatically.
message SignedLicenseRequest {
  enum MessageType {
    DUMMY_MSG_TYPE = 0; // dummy, added to satisfy proto3
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  LicenseRequest Msg = 2; // has to be cast dynamically to LicenseRequest, License or LicenseError (unconfirmed); for a request, no fields other than Type need to be present
  // for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
  bytes Signature = 3; // may be different kinds of signatures (e.g. RSA vs AES CMAC; unconfirmed for now)
  bytes SessionKey = 4; // often RSA-wrapped for licenses
  RemoteAttestation RemoteAttestation = 5;
}

message SignedLicense {
  enum MessageType {
    DUMMY_MSG_TYPE = 0; // dummy, added to satisfy proto3
    LICENSE_REQUEST = 1;
    LICENSE = 2;
    ERROR_RESPONSE = 3;
    SERVICE_CERTIFICATE_REQUEST = 4;
    SERVICE_CERTIFICATE = 5;
  }
  MessageType Type = 1; // has an incorrect overlap with License_KeyContainer_SecurityLevel
  License Msg = 2; // has to be cast dynamically to LicenseRequest, License or LicenseError (unconfirmed); for a request, no fields other than Type need to be present
  // for SERVICE_CERTIFICATE, only Type and Msg are present, and it's just a DeviceCertificate with CertificateType set to SERVICE
  bytes Signature = 3; // may be different kinds of signatures (e.g. RSA vs AES CMAC; unconfirmed for now)
  bytes SessionKey = 4; // often RSA-wrapped for licenses
  RemoteAttestation RemoteAttestation = 5;
}
2686
pywidevine/cdm/formats/wv_proto3_pb2.py
Normal file
File diff suppressed because one or more lines are too long
19
pywidevine/cdm/key.py
Normal file
@ -0,0 +1,19 @@
# uncompyle6 version 3.3.2
# Python bytecode 3.6 (3379)
# Decompiled from: Python 3.7.3 (v3.7.3:ef4ec6ed12, Mar 25 2019, 22:22:05) [MSC v.1916 64 bit (AMD64)]
# Embedded file name: pywidevine\cdm\key.py
import binascii


class Key:

    def __init__(self, kid, type, key, permissions=None):
        self.kid = kid
        self.type = type
        self.key = key
        self.permissions = permissions if permissions is not None else []

    def __repr__(self):
        if self.type == 'OPERATOR_SESSION':
            return 'key(kid={}, type={}, key={}, permissions={})'.format(self.kid, self.type, binascii.hexlify(self.key), self.permissions)
        else:
            return 'key(kid={}, type={}, key={})'.format(self.kid, self.type, binascii.hexlify(self.key))
23
pywidevine/cdm/session.py
Normal file
@ -0,0 +1,23 @@
# uncompyle6 version 3.3.2
# Python bytecode 3.6 (3379)
# Decompiled from: Python 3.7.3 (v3.7.3:ef4ec6ed12, Mar 25 2019, 22:22:05) [MSC v.1916 64 bit (AMD64)]
# Embedded file name: pywidevine\cdm\session.py


class Session:

    def __init__(self, session_id, init_data, device_config, offline):
        self.session_id = session_id
        self.init_data = init_data
        self.offline = offline
        self.device_config = device_config
        self.device_key = None
        self.session_key = None
        self.derived_keys = {'enc': None,
                             'auth_1': None,
                             'auth_2': None}
        self.license_request = None
        self.license = None
        self.service_certificate = None
        self.privacy_mode = False
        self.keys = []
79
pywidevine/decrypt/wvdecryptcustom.py
Normal file
@ -0,0 +1,79 @@
import base64

from pywidevine.cdm import cdm, deviceconfig


class WvDecrypt(object):
    WV_SYSTEM_ID = [237, 239, 139, 169, 121, 214, 74, 206,
                    163, 200, 39, 220, 213, 29, 33, 237]

    def __init__(self, init_data_b64, cert_data_b64, device):
        self.init_data_b64 = init_data_b64
        self.cert_data_b64 = cert_data_b64
        self.device = device

        self.cdm = cdm.Cdm()

        def check_pssh(pssh_b64):
            # If the decoded init data doesn't carry the Widevine system ID at
            # the pssh-box offset, wrap it in a minimal version-0 pssh box.
            pssh = base64.b64decode(pssh_b64)
            if not pssh[12:28] == bytes(self.WV_SYSTEM_ID):
                new_pssh = bytearray([0, 0, 0])
                new_pssh.append(32 + len(pssh))          # box size (assumes < 224 bytes of data)
                new_pssh[4:] = bytearray(b"pssh")        # box type
                new_pssh[8:] = [0, 0, 0, 0]              # version 0, flags 0
                new_pssh[13:] = self.WV_SYSTEM_ID        # slice past the end appends
                new_pssh[29:] = [0, 0, 0, 0]
                new_pssh[31] = len(pssh)                 # data size
                new_pssh[32:] = pssh
                return base64.b64encode(new_pssh)
            else:
                return pssh_b64

        self.session = self.cdm.open_session(
            check_pssh(self.init_data_b64), deviceconfig.DeviceConfig(self.device)
        )

        if self.cert_data_b64:
            self.cdm.set_service_certificate(self.session, self.cert_data_b64)

    def log_message(self, msg):
        return "{}".format(msg)

    def start_process(self):
        keyswvdecrypt = []
        try:
            for key in self.cdm.get_keys(self.session):
                if key.type == "CONTENT":
                    keyswvdecrypt.append(
                        self.log_message("{}:{}".format(key.kid.hex(), key.key.hex()))
                    )
        except Exception:
            return False, keyswvdecrypt

        return True, keyswvdecrypt

    def get_challenge(self):
        return self.cdm.get_license_request(self.session)

    def update_license(self, license_b64):
        self.cdm.provide_license(self.session, license_b64)
        return True
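The `check_pssh` helper above assembles the box byte by byte with slice-append tricks; the same version-0 `pssh` wrapping can be expressed more directly with `struct`. A standalone sketch (the function name is illustrative; it assumes total sizes fit in the 32-bit box-size field):

```python
import struct

# Widevine system ID, same bytes as WV_SYSTEM_ID above
WV_SYSTEM_ID = bytes([237, 239, 139, 169, 121, 214, 74, 206,
                      163, 200, 39, 220, 213, 29, 33, 237])

def wrap_pssh(init_data):
    """Wrap raw Widevine init data in a version-0 'pssh' box:
    size(4) + 'pssh'(4) + version/flags(4) + system_id(16) + data_size(4) + data.
    Data already carrying the system ID at the box offset is returned as-is."""
    if init_data[12:28] == WV_SYSTEM_ID:
        return init_data
    box_size = 4 + 4 + 4 + 16 + 4 + len(init_data)
    return (struct.pack(">I4sI", box_size, b"pssh", 0)   # size, type, version/flags
            + WV_SYSTEM_ID
            + struct.pack(">I", len(init_data))          # data size
            + init_data)
```

Unlike the bytearray version, the box-size field here is a full big-endian uint32, so payloads over 223 bytes are handled correctly.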
25
requirements.txt
Normal file
@ -0,0 +1,25 @@
bs4
colorama
wcwidth
requests
ffmpy
inquirer
pycountry
tqdm
future
cssutils
pycaption
pymediainfo
isodate
untangle
tldextract
unidecode
yarl
natsort
titlecase==2.0.0
prettytable
termcolor
pproxy
pysrt
protobuf
pycryptodomex
969
services/netflix.py
Normal file
@ -0,0 +1,969 @@
import argparse, configparser, glob, json, logging, os, re, shutil, subprocess, sys, time, ffmpy, pycountry, requests, tqdm
from bs4 import BeautifulSoup
from threading import Thread
from urllib.parse import urlsplit
import utils.modules.pycaption as pycaption
from http.cookiejar import MozillaCookieJar
from configs.config import tool
from helpers.aria2 import aria2
from helpers.dfxp_to_srt import dfxp_to_srt
from helpers.keyloader import keysaver
from helpers.Muxer import Muxer
from helpers.Parsers.Netflix import get_keys
from helpers.Parsers.Netflix.get_manifest import get_manifest
from helpers.ripprocess import EpisodesNumbersHandler, ripprocess
from helpers.vpn import connect
from pywidevine.cdm import cdm, deviceconfig
from pywidevine.decrypt.wvdecryptcustom import WvDecrypt


class netflix:
    def __init__(self, args, commands):
        self.logger = logging.getLogger(__name__)
        self.args = args
        self.tool = tool()
        self.config = self.tool.config("NETFLIX")
        self.bin = self.tool.bin()
        self.ripprocess = ripprocess()
        self.EpisodesNumbersHandler = EpisodesNumbersHandler()
        self.commands = commands
        self.keysaver = keysaver(keys_file=self.config["keys_file"])
        self.logdata = {}  # to save title data for debugging or later use
        self.source_tag = "NF"
        self.dfxp_to_srt = dfxp_to_srt()
        self.aria2 = aria2()
        self.video_settings = self.tool.video_settings()
        self.checkList = list()

    def DumpStoredData(self, nfid):
        if not nfid:
            return
        name = "NETFLIX-{}.json".format(nfid)
        nfid_json = os.path.join(self.config["jsonpath"], name)
        with open(nfid_json, "w", encoding="utf-8") as file_:
            file_.write(json.dumps(self.logdata, indent=4))

    def store(self, data, keyword):
        self.logdata.update({keyword: data})
        return

    def get_build(self, cookies):
        BUILD_REGEX = r'"BUILD_IDENTIFIER":"([a-z0-9]+)"'

        session = requests.Session()
        session.headers = {
            "Connection": "keep-alive",
            "Upgrade-Insecure-Requests": "1",
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.135 Safari/537.36",
            "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9",
            "Sec-Fetch-Site": "none",
            "Sec-Fetch-Mode": "navigate",
            "Sec-Fetch-Dest": "document",
            "Accept-Language": "en,en-US;q=0.9",
        }

        r = session.get("https://www.netflix.com/browse", cookies=cookies)

        if not re.search(BUILD_REGEX, r.text):
            print(
                "cannot get BUILD_IDENTIFIER from the cookies you saved from the browser..."
            )
            sys.exit()

        return re.search(BUILD_REGEX, r.text).group(1)

    def save(self, cookies, build):
        cookie_data = {}
        for name, value in cookies.items():
            cookie_data[name] = [value, 0]
        logindata = {"BUILD_IDENTIFIER": build, "cookies": cookie_data}
        with open(self.config["cookies_file"], "w", encoding="utf8") as f:
            f.write(json.dumps(logindata, indent=4))
        os.remove(self.config["cookies_txt"])

    def read_userdata(self):
        cookies = None
        build = None

        if not os.path.isfile(self.config["cookies_file"]):
            try:
                cj = MozillaCookieJar(self.config["cookies_txt"])
                cj.load()
            except Exception:
                print("invalid netscape format cookies file")
                sys.exit()

            cookies = dict()

            for cookie in cj:
                cookies[cookie.name] = cookie.value

            build = self.get_build(cookies)
            self.save(cookies, build)

        with open(self.config["cookies_file"], "rb") as f:
            content = f.read().decode("utf-8")

        if "NetflixId" not in content:
            self.logger.warning("(Some) cookies expired, renew...")
            return cookies, build

        jso = json.loads(content)
        build = jso["BUILD_IDENTIFIER"]
        cookies = jso["cookies"]
        for cookie in cookies:
            cookie_data = cookies[cookie]
            value = cookie_data[0]
            if cookie != "flwssn":
                cookies[cookie] = value
        if cookies.get("flwssn"):
            del cookies["flwssn"]

        return cookies, build
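`read_userdata` above flattens a Netscape-format cookies.txt into a plain `{name: value}` dict via `MozillaCookieJar`. That conversion step can be exercised on its own; a sketch with a hand-written minimal sample file (the cookie values and the helper name are illustrative, not real credentials):

```python
import os
import tempfile
from http.cookiejar import MozillaCookieJar

# Minimal Netscape cookie file: magic header, then tab-separated
# domain / domain_specified / path / secure / expires / name / value.
SAMPLE = """# Netscape HTTP Cookie File
.netflix.com\tTRUE\t/\tTRUE\t2147483647\tNetflixId\tabc123
.netflix.com\tTRUE\t/\tTRUE\t2147483647\tSecureNetflixId\tdef456
"""

def cookies_txt_to_dict(path):
    """Load a Netscape cookies.txt and flatten it to {name: value}."""
    cj = MozillaCookieJar(path)
    cj.load(ignore_discard=True, ignore_expires=True)
    return {cookie.name: cookie.value for cookie in cj}

# Round-trip the sample through a temp file.
fd, path = tempfile.mkstemp(suffix=".txt")
with os.fdopen(fd, "w") as f:
    f.write(SAMPLE)
cookies = cookies_txt_to_dict(path)
os.remove(path)
```

`ignore_discard`/`ignore_expires` keep session and stale cookies, which matches the intent here: the site decides validity, not the local clock.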

    def shakti_api(self, nfid):
        url = f"https://www.netflix.com/api/shakti/{self.build}/metadata"
        headers = {
            "Accept": "*/*",
            "Accept-Encoding": "gzip, deflate, br",
            "Accept-Language": "es,ca;q=0.9,en;q=0.8",
            "Cache-Control": "no-cache",
            "Connection": "keep-alive",
            "Host": "www.netflix.com",
            "Pragma": "no-cache",
            "Sec-Fetch-Mode": "cors",
            "Sec-Fetch-Site": "same-origin",
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.87 Safari/537.36",
            "X-Netflix.browserName": "Chrome",
            "X-Netflix.browserVersion": "79",
            "X-Netflix.clientType": "akira",
            "X-Netflix.esnPrefix": "NFCDCH-02-",
            "X-Netflix.osFullName": "Windows 10",
            "X-Netflix.osName": "Windows",
            "X-Netflix.osVersion": "10.0",
            "X-Netflix.playerThroughput": "1706",
            "X-Netflix.uiVersion": self.build,
        }

        params = {
            "movieid": nfid,
            "drmSystem": "widevine",
            "isWatchlistEnabled": "false",
            "isShortformEnabled": "false",
            "isVolatileBillboardsEnabled": "false",
            "languages": self.config["metada_language"],
        }

        while True:
            resp = requests.get(
                url=url, headers=headers, params=params, cookies=self.cookies
            )

            if resp.status_code == 401:
                self.logger.warning("401 Unauthorized, cookies are invalid.")
            elif resp.text.strip() == "":
                self.logger.error("title is not available in your Netflix region.")
                exit(-1)

            try:
                resp.json()["video"]["type"]  # sanity check that metadata is present
                return resp.json()
            except Exception:
                os.remove(self.config["cookies_file"])
                self.logger.warning(
                    "Error getting metadata: cookies expired\nplease fetch a new cookies.txt"
                )
                exit(-1)
||||
|
||||
def Search(self, query):
|
||||
session = requests.Session()
|
||||
session.headers = {
|
||||
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:65.0) Gecko/20100101 Firefox/65.0"
|
||||
}
|
||||
# select profile
|
||||
'''profiles = list()
|
||||
resp = session.get("https://www.netflix.com/browse", cookies=self.cookies)
|
||||
bs = BeautifulSoup(resp.text, "html.parser")
|
||||
profiles_ = bs.find_all("a", {"class": "profile-link"})
|
||||
|
||||
for profile in profiles_:
|
||||
profiles.append(
|
||||
(profile.find("span", {"class": "profile-name"}).text, profile["href"])
|
||||
)
|
||||
|
||||
if profiles == []:
|
||||
self.logger.warning(
|
||||
"Cannot select user profile, maybe cookies is invalid or the account has no profies."
|
||||
)
|
||||
return None
|
||||
|
||||
# request page with the profile
|
||||
session.get("https://www.netflix.com" + profiles[0][1], cookies=self.cookies)'''
|
||||
# search for title
|
||||
resp = session.get(
|
||||
"https://www.netflix.com/search?q=" + query, cookies=self.cookies
|
||||
)
|
||||
|
||||
if not resp.status_code == 200:
|
||||
self.logger.error("error searching, maybe invalid cookies.")
|
||||
return None

        # add all search items
        items = []
        bs = BeautifulSoup(resp.text, "html.parser")
        titles = bs.find_all("div", {"class": "title-card-container"})

        for title in titles:
            items.append(
                {
                    "name": title.find(
                        "div", {"class": "fallback-text-container"}
                    ).text,
                    "id": title.find("a", href=True)["href"]
                    .split("?")[0]
                    .split("/")[2],
                }
            )

        if items == []:
            self.logger.error(
                f'Your search for "{query}" did not have any matches, try different keywords.'
            )
            return None

        # usually the first item is the right one
        self.logger.info("Found: {} items".format(len(items)))
        self.logger.info("id: {} - name: {}".format(items[0]["id"], items[0]["name"]))
        isRightItem = input("Is this what you are looking for? Enter yes or no: ").strip()
        if isRightItem.lower() in ("y", "yes"):
            return int(items[0]["id"])

        # the first item is wrong, list all of them
        self.logger.info("The available items are: ")
        for idx, item in enumerate(items, start=1):
            self.logger.info(
                "[{}] - id: {} - name: {}".format(idx, item["id"], item["name"])
            )

        item_number = input("\nChoose item number: ").strip()
        if item_number.isdigit():
            # input() returns a string, so cast before indexing
            return int(items[int(item_number) - 1]["id"])

        return None

    def get_nfid(self, content_id):
        if content_id.isdigit():
            return int(content_id)

        validUrl = re.compile(
            r'https?://(?:www\.)?netflix\.com/(\w+)?/?(?:title|watch|browse\?jbv=)/?(?P<id>\d+)'
        )

        nfID = validUrl.match(content_id)

        if nfID:
            return int(nfID.group('id'))

        # fall back to a trailing numeric id
        nfID = re.search(r'[0-9]{8}$', content_id)

        if nfID:
            return int(nfID[0])

        self.logger.error('Detection of the NF ID from the given url failed.')
        sys.exit(1)

    def CleanSubtitleVTT(self, file_content):
        file_content = re.sub(r"{.*?}", "", file_content)
        file_content = re.sub(
            r"(.*\bposition:50.00%.*\bline:10.00%)\s*(.*)",
            r"\1\n{\\an8}\2",
            file_content,
        )

        file_content = re.sub(r"&rlm;", "\u202B", file_content)
        file_content = re.sub(r"&lrm;", "\u202A", file_content)
        file_content = re.sub(r"&amp;", "&", file_content)
        file_content = re.sub(r"([\d]+)\.([\d]+)", r"\1,\2", file_content)
        file_content = re.sub(r"WEBVTT\n\n", "", file_content)
        file_content = re.sub(r"NOTE.*\n", "", file_content)
        file_content = re.sub(r"\n\s+\n", "", file_content)
        file_content = re.sub(r" position:.+%", "", file_content)
        file_content = re.sub(r"</?c.+?>", "", file_content)
        return file_content

    def downloadFile2(self, url, file_name):
        response = requests.get(url, stream=True)
        with open(file_name, "wb") as f:
            f.write(response.content)

        return

    def downloadFile(self, url, file_name, silent=False):
        self.logger.info("\n" + file_name)

        if self.args.noaria2c:
            self.ripprocess.tqdm_downloader(url, file_name)
            return

        options = self.aria2.aria2Options(
            allow_overwrite=True,
            quiet=silent,
            file_allocation=None,
            auto_file_renaming=False,
            async_dns="skip",
            retry_wait=5,
            summary_interval=0,
            enable_color=True,
            connection=16,
            concurrent_downloads=16,
            split=16,
            uri_selector="inorder",
            console_log_level="warn",
            download_result="hide",
            extra_commands=[]
            if self.args.no_download_proxy
            else self.commands["aria2c_extra_commands"],
        )

        self.aria2.aria2DownloadUrl(
            url=url, output=file_name, options=options, debug=False, moded=False
        )

        return

    def GetKeys(self, IDNet, profilename):
        video_keys = []
        available_profiles = [
            "High KEYS",
            "HEVC KEYS",
            "HDR-10 KEYS",
            "Main KEYS",
        ]

        if profilename not in available_profiles:
            self.logger.error("Error: Unknown profile: {}".format(profilename))
            sys.exit(1)

        try:
            video_keys = get_keys.GettingKEYS_Netflixv2(IDNet, profilename)
            if video_keys != []:
                video_keys = list(set(video_keys))
                video_keys = [profilename] + video_keys
                self.logger.info("Done!")
            else:
                self.logger.error("Error!")
        except Exception as e:
            self.logger.error("Error!: {}".format(e))

        return video_keys

    def GetAudioCocedName(self, audioList):
        codecs = {
            "ddplus-atmos-dash": "DDP5.1.Atmos",
            "ddplus-5.1hq-dash": "DDP5.1",
            "ddplus-5.1-dash": "DDP5.1",
            "dd-5.1-dash": "DD5.1",
            "ddplus-2.0-dash": "DDP2.0",
            "heaac-5.1hq-dash": "AAC5.1",
            "heaac-5.1-dash": "AAC5.1",
            "heaac-2-dash": "AAC2.0",
            "heaac-2hq-dash": "AAC2.0",
            "playready-oggvorbis-2-dash": "OGG2.0",
        }

        # return the first profile with a known codec name
        for profile in (x["Profile"] for x in audioList):
            if profile in codecs:
                return codecs[profile]

        return "DDP5.1"

    def RemuxVideo(self, outputVideoTemp, outputVideo, Name):
        self.logger.info("\nRemuxing video...")
        ff = ffmpy.FFmpeg(
            executable=self.bin["ffmpeg"],
            inputs={outputVideoTemp: None},
            outputs={outputVideo: "-c copy"},
            global_options="-y -hide_banner -loglevel warning",
        )

        ff.run()
        time.sleep(50.0 / 1000.0)
        os.remove(outputVideoTemp)
        self.logger.info("Done!")
        return True

    def DecryptVideo_withtxtkeys(self, inputVideo, outputVideoTemp, outputVideo, kid, Name):
        with open(self.config["keys_file"] + "NETFLIX.keys", "r") as keys_file_netflix:
            keys_video = keys_file_netflix.readlines()

        keys_video = [x.strip() for x in keys_video if ":" in x]
        for key in keys_video:
            if key[0:32] == kid:
                self.logger.info("\nDecrypting video...")
                self.logger.info("Using KEY: " + key)
                subprocess.call(
                    [
                        self.bin["mp4decrypt"],
                        "--show-progress",
                        "--key",
                        key,
                        inputVideo,
                        outputVideoTemp,
                    ]
                )
                self.RemuxVideo(outputVideoTemp, outputVideo, Name)
                return True

        self.logger.warning("\nKEY for " + inputVideo + " is not in the txt file.")
        return False

    def DecryptVideo(self, inputVideo, outputVideoTemp, outputVideo, IDNet, Name, Profile, silent=False):
        KID = self.keysaver.generate_kid(inputVideo)
        KEYS = self.keysaver.get_key_by_kid(KID)

        if KEYS == []:
            self.logger.info("\nKEY for {} not saved before.".format(inputVideo))
            self.logger.info("\nGetting Video KEYS...")

            if self.args.video_high:
                KEYS = self.GetKeys(IDNet, "High KEYS")
            elif self.args.hevc:
                KEYS = self.GetKeys(IDNet, "HEVC KEYS")
            elif self.args.hdr:
                KEYS = self.GetKeys(IDNet, "HDR-10 KEYS")
            elif "playready-h264hpl" in Profile:
                KEYS = self.GetKeys(IDNet, "High KEYS")
            else:
                KEYS = self.GetKeys(IDNet, "Main KEYS")

            if KEYS == []:
                return False

            KEYS = self.keysaver.dump_keys(
                keys=[key for key in KEYS if ":" in key], pssh=None, name=Name
            )

        only1key = [x for x in KEYS if x["KID"] == KID]
        if only1key != []:
            KEYS = only1key

        self.ripprocess.mp4_decrypt(
            encrypted=inputVideo,
            decrypted=outputVideoTemp,
            keys=KEYS,
            moded_decrypter=True,
            no_kid=False,
            silent=silent,
        )

        if "netflix" not in [x.lower() for x in self.video_settings["skip_video_demux"]]:
            self.ripprocess.DemuxVideo(
                outputVideoTemp=outputVideoTemp,
                outputVideo=outputVideo,
                ffmpeg=True,
                mp4box=False,
            )
        else:
            os.rename(outputVideoTemp, outputVideo)

        return True

    def SubtitleThreader(self, subtitlesList, name):
        for z in subtitlesList:
            if str(dict(z)["isForced"]) == "YES":
                langAbbrev = "forced-" + str(dict(z)["langAbbrev"])
            elif str(dict(z)["isForced"]) == "SDH":
                langAbbrev = "sdh-" + str(dict(z)["langAbbrev"])
            else:
                langAbbrev = str(dict(z)["langAbbrev"])

            ext = "dfxp" if str(dict(z)["Profile"]) == "dfxp-ls-sdh" else "vtt"
            inputSubtitleDFXP = f"{name} {langAbbrev}.{ext}"
            inputSubtitleSrt = f"{name} {langAbbrev}.srt"
            if not (os.path.isfile(inputSubtitleDFXP) or os.path.isfile(inputSubtitleSrt)):
                self.downloadFile2(str(dict(z)["Url"]), inputSubtitleDFXP)

        dfxp = glob.glob(name + "*.dfxp")
        vtt = glob.glob(name + "*.vtt")
        for f in dfxp:
            self.dfxp_to_srt.convert(f, f.replace(".dfxp", ".srt"))
            os.remove(f)

        for f in vtt:
            with open(f, "r", encoding="utf-8") as x:
                old = x.read()
            with open(f.replace(".vtt", ".srt"), "w", encoding="utf-8") as x:
                x.write(self.CleanSubtitleVTT(old))
            os.remove(f)

    def downloadItem(self, item):
        TitleName = item["TitleName"]
        FolderName = item["FolderName"]

        try:
            CurrentHeigh = str(item["video"]["Height"])
            CurrentWidth = str(item["video"]["Width"])
        except Exception:
            CurrentHeigh = "None"
            CurrentWidth = "None"

        if not self.args.nosubs:
            SubsThread = Thread(
                target=self.SubtitleThreader,
                args=(item["subtitle"] + item["forced"], TitleName,),
            )
            SubsThread.start()
            self.logger.info("\nSubtitle download thread started.")

        if not self.args.novideo:
            self.logger.info("\nDownloading video...")
            if self.args.hevc:
                inputVideo = f"{TitleName} [{CurrentHeigh}p] [HEVC].mp4"
                outputVideoTemp = f"{TitleName} [{CurrentHeigh}p] [HEVC]_DecryptTemp.mp4"
                inputVideo_demuxed = f"{TitleName} [{CurrentHeigh}p] [HEVC]_Demuxed.mp4"
            elif self.args.hdr:
                inputVideo = f"{TitleName} [{CurrentHeigh}p] [HDR].mp4"
                outputVideoTemp = f"{TitleName} [{CurrentHeigh}p] [HDR]_DecryptTemp.mp4"
                inputVideo_demuxed = f"{TitleName} [{CurrentHeigh}p] [HDR]_Demuxed.mp4"
            elif "playready-h264hpl" in str(
                item["video"]["Profile"]
            ) or "playready-h264shpl" in str(item["video"]["Profile"]):
                inputVideo = f"{TitleName} [{CurrentHeigh}p] [HIGH].mp4"
                outputVideoTemp = f"{TitleName} [{CurrentHeigh}p] [HIGH]_DecryptTemp.mp4"
                inputVideo_demuxed = f"{TitleName} [{CurrentHeigh}p] [HIGH]_Demuxed.mp4"
            else:
                inputVideo = f"{TitleName} [{CurrentHeigh}p].mp4"
                outputVideoTemp = f"{TitleName} [{CurrentHeigh}p]_DecryptTemp.mp4"
                inputVideo_demuxed = f"{TitleName} [{CurrentHeigh}p]_Demuxed.mp4"

            if (
                os.path.isfile(inputVideo)
                and not os.path.isfile(inputVideo + ".aria2")
            ) or os.path.isfile(inputVideo_demuxed):
                self.logger.info(
                    "\n"
                    + inputVideo
                    + "\nFile has already been downloaded successfully."
                )
            else:
                self.downloadFile(item["video"]["Url"], inputVideo)

        #################################################################################

        if not self.args.noaudio:
            self.logger.info("\nDownloading audio...")
            for audio in item["audio"]:
                langAbbrev = dict(audio)["Language"]
                inputAudio = f"{TitleName} {langAbbrev}-audio.mp4"
|
||||