forked from forks/microblog.pub

commit d528369954

    Initial commit for new v2

63 changed files with 7961 additions and 0 deletions
.flake8 (new file, 4 lines)
@@ -0,0 +1,4 @@
+[flake8]
+max-line-length = 88
+extend-ignore = E203
+exclude = alembic/versions
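For context (this note is not part of the commit): these are the usual Black-compatible flake8 settings, since Black wraps lines at 88 characters and its slice formatting conflicts with check E203 (whitespace before `:`). A minimal sketch of how these keys read, using only Python's stdlib `configparser` on the file contents above:

```python
import configparser
from io import StringIO

# The .flake8 contents introduced by this commit.
FLAKE8_CFG = """\
[flake8]
max-line-length = 88
extend-ignore = E203
exclude = alembic/versions
"""

parser = configparser.ConfigParser()
parser.read_file(StringIO(FLAKE8_CFG))

# flake8 looks these options up in the [flake8] section.
max_len = parser.getint("flake8", "max-line-length")
ignored = parser.get("flake8", "extend-ignore")
excluded = parser.get("flake8", "exclude")
print(max_len, ignored, excluded)  # 88 E203 alembic/versions
```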
.gitignore (new file, 4 lines, vendored)
@@ -0,0 +1,4 @@
+*.db
+__pycache__/
+.mypy_cache/
+.pytest_cache/
LICENSE (new file, 661 lines)
@@ -0,0 +1,661 @@

                    GNU AFFERO GENERAL PUBLIC LICENSE
                       Version 3, 19 November 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

  A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

  The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

  An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU Affero General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
|
||||||
|
|
||||||
|
13. Remote Network Interaction; Use with the GNU General Public License.
|
||||||
|
|
||||||
|
Notwithstanding any other provision of this License, if you modify the
|
||||||
|
Program, your modified version must prominently offer all users
|
||||||
|
interacting with it remotely through a computer network (if your version
|
||||||
|
supports such interaction) an opportunity to receive the Corresponding
|
||||||
|
Source of your version by providing access to the Corresponding Source
|
||||||
|
from a network server at no charge, through some standard or customary
|
||||||
|
means of facilitating copying of software. This Corresponding Source
|
||||||
|
shall include the Corresponding Source for any work covered by version 3
|
||||||
|
of the GNU General Public License that is incorporated pursuant to the
|
||||||
|
following paragraph.
|
||||||
|
|
||||||
|
Notwithstanding any other provision of this License, you have
|
||||||
|
permission to link or combine any covered work with a work licensed
|
||||||
|
under version 3 of the GNU General Public License into a single
|
||||||
|
combined work, and to convey the resulting work. The terms of this
|
||||||
|
License will continue to apply to the part which is the covered work,
|
||||||
|
but the work with which it is combined will remain governed by version
|
||||||
|
3 of the GNU General Public License.
|
||||||
|
|
||||||
|
14. Revised Versions of this License.
|
||||||
|
|
||||||
|
The Free Software Foundation may publish revised and/or new versions of
|
||||||
|
the GNU Affero General Public License from time to time. Such new versions
|
||||||
|
will be similar in spirit to the present version, but may differ in detail to
|
||||||
|
address new problems or concerns.
|
||||||
|
|
||||||
|
Each version is given a distinguishing version number. If the
|
||||||
|
Program specifies that a certain numbered version of the GNU Affero General
|
||||||
|
Public License "or any later version" applies to it, you have the
|
||||||
|
option of following the terms and conditions either of that numbered
|
||||||
|
version or of any later version published by the Free Software
|
||||||
|
Foundation. If the Program does not specify a version number of the
|
||||||
|
GNU Affero General Public License, you may choose any version ever published
|
||||||
|
by the Free Software Foundation.
|
||||||
|
|
||||||
|
If the Program specifies that a proxy can decide which future
|
||||||
|
versions of the GNU Affero General Public License can be used, that proxy's
|
||||||
|
public statement of acceptance of a version permanently authorizes you
|
||||||
|
to choose that version for the Program.
|
||||||
|
|
||||||
|
Later license versions may give you additional or different
|
||||||
|
permissions. However, no additional obligations are imposed on any
|
||||||
|
author or copyright holder as a result of your choosing to follow a
|
||||||
|
later version.
|
||||||
|
|
||||||
|
15. Disclaimer of Warranty.
|
||||||
|
|
||||||
|
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||||
|
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||||
|
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||||
|
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||||
|
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||||
|
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||||
|
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||||
|
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||||
|
|
||||||
|
16. Limitation of Liability.
|
||||||
|
|
||||||
|
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||||
|
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||||
|
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||||
|
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||||
|
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||||
|
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||||
|
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||||
|
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||||
|
SUCH DAMAGES.
|
||||||
|
|
||||||
|
17. Interpretation of Sections 15 and 16.
|
||||||
|
|
||||||
|
If the disclaimer of warranty and limitation of liability provided
|
||||||
|
above cannot be given local legal effect according to their terms,
|
||||||
|
reviewing courts shall apply local law that most closely approximates
|
||||||
|
an absolute waiver of all civil liability in connection with the
|
||||||
|
Program, unless a warranty or assumption of liability accompanies a
|
||||||
|
copy of the Program in return for a fee.
|
||||||
|
|
||||||
|
END OF TERMS AND CONDITIONS
|
||||||
|
|
||||||
|
How to Apply These Terms to Your New Programs
|
||||||
|
|
||||||
|
If you develop a new program, and you want it to be of the greatest
|
||||||
|
possible use to the public, the best way to achieve this is to make it
|
||||||
|
free software which everyone can redistribute and change under these terms.
|
||||||
|
|
||||||
|
To do so, attach the following notices to the program. It is safest
|
||||||
|
to attach them to the start of each source file to most effectively
|
||||||
|
state the exclusion of warranty; and each file should have at least
|
||||||
|
the "copyright" line and a pointer to where the full notice is found.
|
||||||
|
|
||||||
|
<one line to give the program's name and a brief idea of what it does.>
|
||||||
|
Copyright (C) <year> <name of author>
|
||||||
|
|
||||||
|
This program is free software: you can redistribute it and/or modify
|
||||||
|
it under the terms of the GNU Affero General Public License as published by
|
||||||
|
the Free Software Foundation, either version 3 of the License, or
|
||||||
|
(at your option) any later version.
|
||||||
|
|
||||||
|
This program is distributed in the hope that it will be useful,
|
||||||
|
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
GNU Affero General Public License for more details.
|
||||||
|
|
||||||
|
You should have received a copy of the GNU Affero General Public License
|
||||||
|
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
|
Also add information on how to contact you by electronic and paper mail.
|
||||||
|
|
||||||
|
If your software can interact with users remotely through a computer
|
||||||
|
network, you should also make sure that it provides a way for users to
|
||||||
|
get its source. For example, if your program is a web application, its
|
||||||
|
interface could display a "Source" link that leads users to an archive
|
||||||
|
of the code. There are many ways you could offer source, and different
|
||||||
|
solutions will be better for different programs; see section 13 for the
|
||||||
|
specific requirements.
|
||||||
|
|
||||||
|
You should also get your employer (if you work as a programmer) or school,
|
||||||
|
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||||
|
For more information on this, and how to apply and follow the GNU AGPL, see
|
||||||
|
<https://www.gnu.org/licenses/>.
9
README.md
Normal file
@@ -0,0 +1,9 @@
# microblog.pub

This branch is a complete rewrite of the original microblog.pub server.

The original server became hard to debug and maintain, and is not easy to deploy (due to dependencies like MongoDB).

This rewrite is built on "modern" Python 3.10 and SQLite, and does not need any external task queue service.

It is still in early development; this README will be updated once I deploy a personal instance in the wild.
105
alembic.ini
Normal file
@@ -0,0 +1,105 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to alembic/versions.  When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os  # Use os.pathsep. Default configuration used for new projects.

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url =


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts.  See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
1
alembic/README
Normal file
@@ -0,0 +1 @@
Generic single-database configuration.
81
alembic/env.py
Normal file
@@ -0,0 +1,81 @@
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

import app.models  # noqa: F401  # Register models
from alembic import context
from app.database import SQLALCHEMY_DATABASE_URL
from app.database import Base

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

config.set_main_option("sqlalchemy.url", SQLALCHEMY_DATABASE_URL)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
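The offline/online split in `env.py` is Alembic's standard pattern: offline mode renders SQL as text (no DBAPI needed), while online mode executes statements over a live connection. As a loose stdlib-only analogy of that idea (all names below are illustrative, not part of this repo):

```python
import sqlite3

# A single DDL statement standing in for a migration's operations.
DDL = "CREATE TABLE actors (id INTEGER PRIMARY KEY, ap_id TEXT NOT NULL)"


def run_offline(statements: list[str]) -> str:
    # "Offline" mode: emit the SQL as a script instead of executing it.
    return ";\n".join(statements) + ";\n"


def run_online(statements: list[str], conn: sqlite3.Connection) -> None:
    # "Online" mode: execute each statement on a live connection.
    with conn:
        for stmt in statements:
            conn.execute(stmt)


script = run_offline([DDL])

conn = sqlite3.connect(":memory:")
run_online([DDL], conn)
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
```

The same list of operations feeds both paths; only the execution strategy differs, which is why Alembic keeps a single `run_migrations()` call in each branch.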
24
alembic/script.py.mako
Normal file
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}
192
alembic/versions/b122c3a69fc9_initial_migration.py
Normal file
@@ -0,0 +1,192 @@
"""Initial migration

Revision ID: b122c3a69fc9
Revises:
Create Date: 2022-06-22 19:54:19.153320

"""
import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision = 'b122c3a69fc9'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('actors',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('ap_id', sa.String(), nullable=False),
    sa.Column('ap_actor', sa.JSON(), nullable=False),
    sa.Column('ap_type', sa.String(), nullable=False),
    sa.Column('handle', sa.String(), nullable=True),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_actors_ap_id'), 'actors', ['ap_id'], unique=True)
    op.create_index(op.f('ix_actors_handle'), 'actors', ['handle'], unique=False)
    op.create_index(op.f('ix_actors_id'), 'actors', ['id'], unique=False)
    op.create_table('inbox',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('actor_id', sa.Integer(), nullable=False),
    sa.Column('server', sa.String(), nullable=False),
    sa.Column('is_hidden_from_stream', sa.Boolean(), nullable=False),
    sa.Column('ap_actor_id', sa.String(), nullable=False),
    sa.Column('ap_type', sa.String(), nullable=False),
    sa.Column('ap_id', sa.String(), nullable=False),
    sa.Column('ap_context', sa.String(), nullable=True),
    sa.Column('ap_published_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('ap_object', sa.JSON(), nullable=False),
    sa.Column('activity_object_ap_id', sa.String(), nullable=True),
    sa.Column('visibility', sa.Enum('PUBLIC', 'UNLISTED', 'DIRECT', name='visibilityenum'), nullable=False),
    sa.Column('relates_to_inbox_object_id', sa.Integer(), nullable=True),
    sa.Column('relates_to_outbox_object_id', sa.Integer(), nullable=True),
    sa.Column('undone_by_inbox_object_id', sa.Integer(), nullable=True),
    sa.Column('liked_via_outbox_object_ap_id', sa.String(), nullable=True),
    sa.Column('announced_via_outbox_object_ap_id', sa.String(), nullable=True),
    sa.Column('is_bookmarked', sa.Boolean(), nullable=False),
    sa.Column('has_replies', sa.Boolean(), nullable=False),
    sa.Column('og_meta', sa.JSON(), nullable=True),
    sa.ForeignKeyConstraint(['actor_id'], ['actors.id'], ),
    sa.ForeignKeyConstraint(['relates_to_inbox_object_id'], ['inbox.id'], ),
    sa.ForeignKeyConstraint(['relates_to_outbox_object_id'], ['outbox.id'], ),
    sa.ForeignKeyConstraint(['undone_by_inbox_object_id'], ['inbox.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_inbox_ap_id'), 'inbox', ['ap_id'], unique=True)
    op.create_index(op.f('ix_inbox_id'), 'inbox', ['id'], unique=False)
    op.create_table('outbox',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('is_hidden_from_homepage', sa.Boolean(), nullable=False),
    sa.Column('public_id', sa.String(), nullable=False),
    sa.Column('ap_type', sa.String(), nullable=False),
    sa.Column('ap_id', sa.String(), nullable=False),
    sa.Column('ap_context', sa.String(), nullable=True),
    sa.Column('ap_object', sa.JSON(), nullable=False),
    sa.Column('activity_object_ap_id', sa.String(), nullable=True),
    sa.Column('source', sa.String(), nullable=True),
    sa.Column('ap_published_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('visibility', sa.Enum('PUBLIC', 'UNLISTED', 'DIRECT', name='visibilityenum'), nullable=False),
    sa.Column('likes_count', sa.Integer(), nullable=False),
    sa.Column('announces_count', sa.Integer(), nullable=False),
    sa.Column('replies_count', sa.Integer(), nullable=False),
    sa.Column('webmentions', sa.JSON(), nullable=True),
    sa.Column('og_meta', sa.JSON(), nullable=True),
    sa.Column('is_deleted', sa.Boolean(), nullable=False),
    sa.Column('relates_to_inbox_object_id', sa.Integer(), nullable=True),
    sa.Column('relates_to_outbox_object_id', sa.Integer(), nullable=True),
    sa.Column('undone_by_outbox_object_id', sa.Integer(), nullable=True),
    sa.ForeignKeyConstraint(['relates_to_inbox_object_id'], ['inbox.id'], ),
    sa.ForeignKeyConstraint(['relates_to_outbox_object_id'], ['outbox.id'], ),
    sa.ForeignKeyConstraint(['undone_by_outbox_object_id'], ['outbox.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_outbox_ap_id'), 'outbox', ['ap_id'], unique=True)
    op.create_index(op.f('ix_outbox_id'), 'outbox', ['id'], unique=False)
    op.create_index(op.f('ix_outbox_public_id'), 'outbox', ['public_id'], unique=False)
    op.create_table('followers',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('actor_id', sa.Integer(), nullable=False),
    sa.Column('inbox_object_id', sa.Integer(), nullable=False),
    sa.Column('ap_actor_id', sa.String(), nullable=False),
    sa.ForeignKeyConstraint(['actor_id'], ['actors.id'], ),
    sa.ForeignKeyConstraint(['inbox_object_id'], ['inbox.id'], ),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('actor_id'),
    sa.UniqueConstraint('ap_actor_id')
    )
    op.create_index(op.f('ix_followers_id'), 'followers', ['id'], unique=False)
    op.create_table('following',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('actor_id', sa.Integer(), nullable=False),
    sa.Column('outbox_object_id', sa.Integer(), nullable=False),
    sa.Column('ap_actor_id', sa.String(), nullable=False),
    sa.ForeignKeyConstraint(['actor_id'], ['actors.id'], ),
    sa.ForeignKeyConstraint(['outbox_object_id'], ['outbox.id'], ),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('actor_id'),
    sa.UniqueConstraint('ap_actor_id')
    )
    op.create_index(op.f('ix_following_id'), 'following', ['id'], unique=False)
    op.create_table('notifications',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('notification_type', sa.Enum('NEW_FOLLOWER', 'UNFOLLOW', 'LIKE', 'UNDO_LIKE', 'ANNOUNCE', 'UNDO_ANNOUNCE', 'MENTION', name='notificationtype'), nullable=True),
    sa.Column('is_new', sa.Boolean(), nullable=False),
    sa.Column('actor_id', sa.Integer(), nullable=True),
    sa.Column('outbox_object_id', sa.Integer(), nullable=True),
    sa.Column('inbox_object_id', sa.Integer(), nullable=True),
    sa.ForeignKeyConstraint(['actor_id'], ['actors.id'], ),
    sa.ForeignKeyConstraint(['inbox_object_id'], ['inbox.id'], ),
    sa.ForeignKeyConstraint(['outbox_object_id'], ['outbox.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_notifications_id'), 'notifications', ['id'], unique=False)
    op.create_table('outgoing_activities',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
    sa.Column('recipient', sa.String(), nullable=False),
    sa.Column('outbox_object_id', sa.Integer(), nullable=False),
    sa.Column('tries', sa.Integer(), nullable=False),
    sa.Column('next_try', sa.DateTime(timezone=True), nullable=True),
    sa.Column('last_try', sa.DateTime(timezone=True), nullable=True),
    sa.Column('last_status_code', sa.Integer(), nullable=True),
    sa.Column('last_response', sa.String(), nullable=True),
    sa.Column('is_sent', sa.Boolean(), nullable=False),
    sa.Column('is_errored', sa.Boolean(), nullable=False),
    sa.Column('error', sa.String(), nullable=True),
    sa.ForeignKeyConstraint(['outbox_object_id'], ['outbox.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_outgoing_activities_id'), 'outgoing_activities', ['id'], unique=False)
    op.create_table('tagged_outbox_objects',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('outbox_object_id', sa.Integer(), nullable=False),
    sa.Column('tag', sa.String(), nullable=False),
    sa.ForeignKeyConstraint(['outbox_object_id'], ['outbox.id'], ),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('outbox_object_id', 'tag', name='uix_tagged_object')
    )
    op.create_index(op.f('ix_tagged_outbox_objects_id'), 'tagged_outbox_objects', ['id'], unique=False)
    op.create_index(op.f('ix_tagged_outbox_objects_tag'), 'tagged_outbox_objects', ['tag'], unique=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_tagged_outbox_objects_tag'), table_name='tagged_outbox_objects')
    op.drop_index(op.f('ix_tagged_outbox_objects_id'), table_name='tagged_outbox_objects')
    op.drop_table('tagged_outbox_objects')
    op.drop_index(op.f('ix_outgoing_activities_id'), table_name='outgoing_activities')
    op.drop_table('outgoing_activities')
    op.drop_index(op.f('ix_notifications_id'), table_name='notifications')
    op.drop_table('notifications')
    op.drop_index(op.f('ix_following_id'), table_name='following')
    op.drop_table('following')
    op.drop_index(op.f('ix_followers_id'), table_name='followers')
    op.drop_table('followers')
    op.drop_index(op.f('ix_outbox_public_id'), table_name='outbox')
    op.drop_index(op.f('ix_outbox_id'), table_name='outbox')
    op.drop_index(op.f('ix_outbox_ap_id'), table_name='outbox')
    op.drop_table('outbox')
    op.drop_index(op.f('ix_inbox_id'), table_name='inbox')
    op.drop_index(op.f('ix_inbox_ap_id'), table_name='inbox')
    op.drop_table('inbox')
    op.drop_index(op.f('ix_actors_id'), table_name='actors')
    op.drop_index(op.f('ix_actors_handle'), table_name='actors')
    op.drop_index(op.f('ix_actors_ap_id'), table_name='actors')
    op.drop_table('actors')
    # ### end Alembic commands ###
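Note that the `inbox` table in this migration is self-referential: `undone_by_inbox_object_id` points back at another `inbox` row, which is how an incoming `Undo` activity gets linked to the `Like` or `Announce` it cancels. A minimal sqlite3 sketch of that pattern (schema reduced to the relevant columns; the sample rows and URLs are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE inbox (
    id INTEGER PRIMARY KEY,
    ap_type TEXT NOT NULL,
    ap_id TEXT NOT NULL UNIQUE,
    undone_by_inbox_object_id INTEGER REFERENCES inbox (id)
);
""")

# A Like arrives, then an Undo referencing it; the Like row is marked
# as undone by pointing it at the Undo row's id.
conn.execute(
    "INSERT INTO inbox (id, ap_type, ap_id) VALUES (1, 'Like', 'https://remote.example/likes/1')")
conn.execute(
    "INSERT INTO inbox (id, ap_type, ap_id) VALUES (2, 'Undo', 'https://remote.example/undos/1')")
conn.execute("UPDATE inbox SET undone_by_inbox_object_id = 2 WHERE id = 1")

# Activities still in effect are those with no undo pointer.
active = [row[0] for row in conn.execute(
    "SELECT ap_type FROM inbox WHERE undone_by_inbox_object_id IS NULL")]
```

This keeps undo handling as a plain `UPDATE` on the original row rather than a delete, so the history of the activity is preserved.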
0
app/__init__.py
Normal file
0
app/__init__.py
Normal file
276
app/activitypub.py
Normal file
276
app/activitypub.py
Normal file
|
@ -0,0 +1,276 @@
|
||||||
|
import enum
import json
import mimetypes
from typing import Any

import httpx

from app import config
from app.httpsig import auth
from app.key import get_pubkey_as_pem

RawObject = dict[str, Any]
AS_CTX = "https://www.w3.org/ns/activitystreams"
AS_PUBLIC = "https://www.w3.org/ns/activitystreams#Public"

ACTOR_TYPES = ["Application", "Group", "Organization", "Person", "Service"]


class VisibilityEnum(str, enum.Enum):
    PUBLIC = "public"
    UNLISTED = "unlisted"
    DIRECT = "direct"


MICROBLOGPUB = {
    "@context": [
        "https://www.w3.org/ns/activitystreams",
        "https://w3id.org/security/v1",
        {
            "Hashtag": "as:Hashtag",
            "PropertyValue": "schema:PropertyValue",
            "manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
            "ostatus": "http://ostatus.org#",
            "schema": "http://schema.org",
            "sensitive": "as:sensitive",
            "toot": "http://joinmastodon.org/ns#",
            "totalItems": "as:totalItems",
            "value": "schema:value",
            "Emoji": "toot:Emoji",
        },
    ]
}

DEFAULT_CTX = COLLECTION_CTX = [
    "https://www.w3.org/ns/activitystreams",
    "https://w3id.org/security/v1",
    {
        # AS ext
        "Hashtag": "as:Hashtag",
        "sensitive": "as:sensitive",
        "manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
        # toot
        "toot": "http://joinmastodon.org/ns#",
        # "featured": "toot:featured",
        # schema
        "schema": "http://schema.org#",
        "PropertyValue": "schema:PropertyValue",
        "value": "schema:value",
    },
]

ME = {
    "@context": DEFAULT_CTX,
    "type": "Person",
    "id": config.ID,
    "following": config.BASE_URL + "/following",
    "followers": config.BASE_URL + "/followers",
    # "featured": ID + "/featured",
    "inbox": config.BASE_URL + "/inbox",
    "outbox": config.BASE_URL + "/outbox",
    "preferredUsername": config.USERNAME,
    "name": config.CONFIG.name,
    "summary": config.CONFIG.summary,
    "endpoints": {},
    "url": config.ID,
    "manuallyApprovesFollowers": False,
    "attachment": [],
    "icon": {
        "mediaType": mimetypes.guess_type(config.CONFIG.icon_url)[0],
        "type": "Image",
        "url": config.CONFIG.icon_url,
    },
    "publicKey": {
        "id": f"{config.ID}#main-key",
        "owner": config.ID,
        "publicKeyPem": get_pubkey_as_pem(),
    },
    "alsoKnownAs": [],
}


class NotAnObjectError(Exception):
    def __init__(self, url: str, resp: httpx.Response | None = None) -> None:
        message = f"{url} is not an AP activity"
        super().__init__(message)
        self.url = url
        self.resp = resp


def fetch(url: str, params: dict[str, Any] | None = None) -> dict[str, Any]:
    resp = httpx.get(
        url,
        headers={
            "User-Agent": config.USER_AGENT,
            "Accept": config.AP_CONTENT_TYPE,
        },
        params=params,
        follow_redirects=True,
    )
    resp.raise_for_status()
    try:
        return resp.json()
    except json.JSONDecodeError:
        raise NotAnObjectError(url, resp)


def parse_collection(  # noqa: C901
    url: str | None = None,
    payload: RawObject | None = None,
    level: int = 0,
) -> list[RawObject]:
    """Resolve/fetch a `Collection`/`OrderedCollection`."""
    if level > 3:
        raise ValueError("recursion limit exceeded")

    # Go through all the pages
    out: list[RawObject] = []
    if url:
        payload = fetch(url)
    if not payload:
        raise ValueError("must provide at least a payload or a URL")

    ap_type = payload.get("type")
    if not ap_type:
        raise ValueError(f"Missing type: {payload=}")

    if level == 0 and ap_type not in ["Collection", "OrderedCollection"]:
        raise ValueError(f"Unexpected type {ap_type}")

    if payload["type"] in ["Collection", "OrderedCollection"]:
        if "orderedItems" in payload:
            return payload["orderedItems"]
        if "items" in payload:
            return payload["items"]
        if "first" in payload:
            if isinstance(payload["first"], str):
                out.extend(parse_collection(url=payload["first"], level=level + 1))
            else:
                if "orderedItems" in payload["first"]:
                    out.extend(payload["first"]["orderedItems"])
                if "items" in payload["first"]:
                    out.extend(payload["first"]["items"])
                n = payload["first"].get("next")
                if n:
                    out.extend(parse_collection(url=n, level=level + 1))
        return out

    while payload:
        if ap_type in ["CollectionPage", "OrderedCollectionPage"]:
            if "orderedItems" in payload:
                out.extend(payload["orderedItems"])
            if "items" in payload:
                out.extend(payload["items"])
            n = payload.get("next")
            if n is None:
                break
            payload = fetch(n)
        else:
            raise ValueError("unexpected activity type {}".format(payload["type"]))

    return out
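As a hedged sketch of the pagination walk in `parse_collection` above — with an in-memory `fetch` stub standing in for the real HTTP helper, and all URLs illustrative — resolving an `OrderedCollection` spread over `OrderedCollectionPage`s looks like:

```python
# Minimal sketch of parse_collection's pagination logic; PAGES and fetch are
# fake stand-ins for remote collection documents and the HTTP fetch helper.
PAGES = {
    "https://example.com/outbox": {
        "type": "OrderedCollection",
        "first": "https://example.com/outbox?page=1",
    },
    "https://example.com/outbox?page=1": {
        "type": "OrderedCollectionPage",
        "orderedItems": [{"id": "a"}, {"id": "b"}],
        "next": "https://example.com/outbox?page=2",
    },
    "https://example.com/outbox?page=2": {
        "type": "OrderedCollectionPage",
        "orderedItems": [{"id": "c"}],
    },
}


def fetch(url):
    return PAGES[url]


def parse_collection(url=None, payload=None, level=0):
    if level > 3:
        raise ValueError("recursion limit exceeded")
    out = []
    if url:
        payload = fetch(url)
    if payload["type"] in ["Collection", "OrderedCollection"]:
        # Top-level collection: inline items, or follow the `first` page
        if "orderedItems" in payload:
            return payload["orderedItems"]
        if "first" in payload and isinstance(payload["first"], str):
            out.extend(parse_collection(url=payload["first"], level=level + 1))
        return out
    # Page: accumulate items and follow `next` links until exhausted
    while payload:
        out.extend(payload.get("orderedItems", []))
        n = payload.get("next")
        if n is None:
            break
        payload = fetch(n)
    return out


print([item["id"] for item in parse_collection(url="https://example.com/outbox")])
# → ['a', 'b', 'c']
```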
def as_list(val: Any | list[Any]) -> list[Any]:
    if isinstance(val, list):
        return val

    return [val]


def get_id(val: str | dict[str, Any]) -> str:
    if isinstance(val, dict):
        val = val["id"]

    if not isinstance(val, str):
        raise ValueError(f"Invalid ID type: {val}")

    return val


def object_visibility(ap_activity: RawObject) -> VisibilityEnum:
    to = as_list(ap_activity.get("to", []))
    cc = as_list(ap_activity.get("cc", []))
    if AS_PUBLIC in to:
        return VisibilityEnum.PUBLIC
    elif AS_PUBLIC in cc:
        return VisibilityEnum.UNLISTED
    else:
        return VisibilityEnum.DIRECT
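A self-contained sketch of the visibility rule above (the `as:Public` collection in `to` means public, in `cc` means unlisted, anything else is direct); the helpers mirror the ones in this file but return plain strings:

```python
AS_PUBLIC = "https://www.w3.org/ns/activitystreams#Public"


def as_list(val):
    # Normalize a JSON-LD value that may be a single item or a list
    return val if isinstance(val, list) else [val]


def object_visibility(ap_activity):
    to = as_list(ap_activity.get("to", []))
    cc = as_list(ap_activity.get("cc", []))
    if AS_PUBLIC in to:
        return "public"
    if AS_PUBLIC in cc:
        return "unlisted"
    return "direct"


print(object_visibility({"to": [AS_PUBLIC]}))               # → public
print(object_visibility({"to": ["x"], "cc": [AS_PUBLIC]}))  # → unlisted
print(object_visibility({"to": "alice"}))                   # → direct
```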
def get_actor_id(activity: RawObject) -> str:
    if activity["type"] in ["Note", "Article", "Video"]:
        attributed_to = as_list(activity["attributedTo"])
        return get_id(attributed_to[0])
    else:
        return get_id(activity["actor"])


def wrap_object(activity: RawObject) -> RawObject:
    return {
        "@context": AS_CTX,
        "actor": config.ID,
        "to": activity.get("to", []),
        "cc": activity.get("cc", []),
        "id": activity["id"] + "/activity",
        "object": remove_context(activity),
        "published": activity["published"],
        "type": "Create",
    }


def wrap_object_if_needed(raw_object: RawObject) -> RawObject:
    if raw_object["type"] in ["Note"]:
        return wrap_object(raw_object)

    return raw_object


def unwrap_activity(activity: RawObject) -> RawObject:
    # FIXME(ts): other types to unwrap?
    if activity["type"] == "Create":
        unwrapped_object = activity["object"]

        # Sanity check, ensure the wrapped object actor matches the activity
        if get_actor_id(unwrapped_object) != get_actor_id(activity):
            raise ValueError(
                f"Unwrapped object actor does not match activity: {activity}"
            )
        return unwrapped_object

    return activity


def remove_context(raw_object: RawObject) -> RawObject:
    if "@context" not in raw_object:
        return raw_object
    a = dict(raw_object)
    del a["@context"]
    return a
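The wrap/unwrap round-trip above can be sketched standalone; `ACTOR_ID` and the simplified helpers below are illustrative stand-ins for the module's `config.ID` and full implementations:

```python
ACTOR_ID = "https://example.com"  # stand-in for config.ID


def remove_context(raw_object):
    # Drop the JSON-LD @context before embedding the object
    return {k: v for k, v in raw_object.items() if k != "@context"}


def wrap_object(note):
    # Wrap a bare Note into the Create activity that actually gets delivered
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "actor": ACTOR_ID,
        "to": note.get("to", []),
        "cc": note.get("cc", []),
        "id": note["id"] + "/activity",
        "object": remove_context(note),
        "published": note["published"],
        "type": "Create",
    }


def unwrap_activity(activity):
    # Recover the inner object from a Create; anything else passes through
    if activity["type"] == "Create":
        return activity["object"]
    return activity


note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Note",
    "id": "https://example.com/o/abc",
    "published": "2022-06-22T20:00:00Z",
    "content": "hello",
}
create = wrap_object(note)
assert create["id"] == "https://example.com/o/abc/activity"
assert unwrap_activity(create)["content"] == "hello"
```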
def get(url: str, params: dict[str, Any] | None = None) -> dict[str, Any]:
    resp = httpx.get(
        url,
        headers={"User-Agent": config.USER_AGENT, "Accept": config.AP_CONTENT_TYPE},
        params=params,
        follow_redirects=True,
        auth=auth,
    )
    resp.raise_for_status()
    return resp.json()


def post(url: str, payload: dict[str, Any]) -> httpx.Response:
    resp = httpx.post(
        url,
        headers={
            "User-Agent": config.USER_AGENT,
            "Content-Type": config.AP_CONTENT_TYPE,
        },
        json=payload,
        auth=auth,
    )
    resp.raise_for_status()
    return resp
190
app/actor.py
Normal file

@ -0,0 +1,190 @@
import typing
from dataclasses import dataclass
from urllib.parse import urlparse

from sqlalchemy.orm import Session
from sqlalchemy.orm import joinedload

from app import activitypub as ap

if typing.TYPE_CHECKING:
    from app.models import Actor as ActorModel


def _handle(raw_actor: ap.RawObject) -> str:
    ap_id = ap.get_id(raw_actor["id"])
    domain = urlparse(ap_id)
    if not domain.hostname:
        raise ValueError(f"Invalid actor ID {ap_id}")

    return f'@{raw_actor["preferredUsername"]}@{domain.hostname}'  # type: ignore
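A minimal, dependency-free sketch of the handle derivation in `_handle` above (`preferredUsername` plus the hostname of the actor ID); `handle_for` is an illustrative name:

```python
from urllib.parse import urlparse


def handle_for(raw_actor):
    # Build an @user@host handle from the actor ID's hostname
    ap_id = raw_actor["id"]
    domain = urlparse(ap_id)
    if not domain.hostname:
        raise ValueError(f"Invalid actor ID {ap_id}")
    return f'@{raw_actor["preferredUsername"]}@{domain.hostname}'


print(handle_for({"id": "https://microblog.pub/users/dev", "preferredUsername": "dev"}))
# → @dev@microblog.pub
```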
class Actor:
    @property
    def ap_actor(self) -> ap.RawObject:
        raise NotImplementedError()

    @property
    def ap_id(self) -> str:
        return ap.get_id(self.ap_actor["id"])

    @property
    def name(self) -> str | None:
        return self.ap_actor.get("name")

    @property
    def summary(self) -> str | None:
        return self.ap_actor.get("summary")

    @property
    def url(self) -> str | None:
        return self.ap_actor.get("url") or self.ap_actor["id"]

    @property
    def preferred_username(self) -> str:
        return self.ap_actor["preferredUsername"]

    @property
    def handle(self) -> str:
        return _handle(self.ap_actor)

    @property
    def ap_type(self) -> str:
        raise NotImplementedError()

    @property
    def inbox_url(self) -> str:
        return self.ap_actor["inbox"]

    @property
    def shared_inbox_url(self) -> str | None:
        return self.ap_actor.get("endpoints", {}).get("sharedInbox")

    @property
    def icon_url(self) -> str | None:
        return self.ap_actor.get("icon", {}).get("url")

    @property
    def icon_media_type(self) -> str | None:
        return self.ap_actor.get("icon", {}).get("mediaType")

    @property
    def public_key_as_pem(self) -> str:
        return self.ap_actor["publicKey"]["publicKeyPem"]

    @property
    def public_key_id(self) -> str:
        return self.ap_actor["publicKey"]["id"]


class RemoteActor(Actor):
    def __init__(self, ap_actor: ap.RawObject) -> None:
        if (ap_type := ap_actor.get("type")) not in ap.ACTOR_TYPES:
            raise ValueError(f"Unexpected actor type: {ap_type}")

        self._ap_actor = ap_actor
        self._ap_type = ap_type

    @property
    def ap_actor(self) -> ap.RawObject:
        return self._ap_actor

    @property
    def ap_type(self) -> str:
        return self._ap_type

    @property
    def is_from_db(self) -> bool:
        return False


LOCAL_ACTOR = RemoteActor(ap_actor=ap.ME)


def save_actor(db: Session, ap_actor: ap.RawObject) -> "ActorModel":
    from app import models

    # Parenthesize the walrus assignment so the membership test applies to
    # the extracted type; without parens, ap_type is bound to the boolean
    if (ap_type := ap_actor.get("type")) not in ap.ACTOR_TYPES:
        raise ValueError(f"Invalid type {ap_type} for actor {ap_actor}")

    actor = models.Actor(
        ap_id=ap_actor["id"],
        ap_actor=ap_actor,
        ap_type=ap_actor["type"],
        handle=_handle(ap_actor),
    )
    db.add(actor)
    db.commit()
    db.refresh(actor)
    return actor


def fetch_actor(db: Session, actor_id: str) -> "ActorModel":
    from app import models

    existing_actor = (
        db.query(models.Actor).filter(models.Actor.ap_id == actor_id).one_or_none()
    )
    if existing_actor:
        return existing_actor

    ap_actor = ap.get(actor_id)
    return save_actor(db, ap_actor)


@dataclass
class ActorMetadata:
    ap_actor_id: str
    is_following: bool
    is_follower: bool
    is_follow_request_sent: bool
    outbox_follow_ap_id: str | None
    inbox_follow_ap_id: str | None


ActorsMetadata = dict[str, ActorMetadata]


def get_actors_metadata(
    db: Session,
    actors: list["ActorModel"],
) -> ActorsMetadata:
    from app import models

    ap_actor_ids = [actor.ap_id for actor in actors]
    followers = {
        follower.ap_actor_id: follower.inbox_object.ap_id
        for follower in db.query(models.Follower)
        .filter(models.Follower.ap_actor_id.in_(ap_actor_ids))
        .options(joinedload(models.Follower.inbox_object))
        .all()
    }
    following = {
        following.ap_actor_id
        for following in db.query(models.Following.ap_actor_id)
        .filter(models.Following.ap_actor_id.in_(ap_actor_ids))
        .all()
    }
    sent_follow_requests = {
        follow_req.ap_object["object"]: follow_req.ap_id
        for follow_req in db.query(
            models.OutboxObject.ap_object, models.OutboxObject.ap_id
        )
        .filter(
            models.OutboxObject.ap_type == "Follow",
            models.OutboxObject.undone_by_outbox_object_id.is_(None),
        )
        .all()
    }
    idx: ActorsMetadata = {}
    for actor in actors:
        idx[actor.ap_id] = ActorMetadata(
            ap_actor_id=actor.ap_id,
            is_following=actor.ap_id in following,
            is_follower=actor.ap_id in followers,
            is_follow_request_sent=actor.ap_id in sent_follow_requests,
            outbox_follow_ap_id=sent_follow_requests.get(actor.ap_id),
            inbox_follow_ap_id=followers.get(actor.ap_id),
        )
    return idx
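The `save_actor` type guard is a spot where walrus-operator precedence matters: unparenthesized, `ap_type := ap_actor.get("type") not in ACTOR_TYPES` binds the *boolean* result of the `not in` test to `ap_type`, so the error message would report `False` instead of the offending type. A small standalone demonstration:

```python
ACTOR_TYPES = ["Application", "Group", "Organization", "Person", "Service"]

raw = {"type": "Person"}

# Unparenthesized: the walrus captures the result of `not in`, a bool
if ap_type := raw.get("type") not in ACTOR_TYPES:
    pass
print(ap_type)  # → False

# Parenthesized: the walrus captures the actual type string
if (ap_type := raw.get("type")) not in ACTOR_TYPES:
    pass
print(ap_type)  # → Person
```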
286
app/admin.py
Normal file

@ -0,0 +1,286 @@
from fastapi import APIRouter
from fastapi import Cookie
from fastapi import Depends
from fastapi import Form
from fastapi import Request
from fastapi import UploadFile
from fastapi.exceptions import HTTPException
from fastapi.responses import RedirectResponse
from sqlalchemy.orm import Session
from sqlalchemy.orm import joinedload

from app import activitypub as ap
from app import boxes
from app import models
from app import templates
from app.actor import get_actors_metadata
from app.boxes import get_inbox_object_by_ap_id
from app.boxes import send_follow
from app.config import generate_csrf_token
from app.config import session_serializer
from app.config import verify_csrf_token
from app.config import verify_password
from app.database import get_db
from app.lookup import lookup


def user_session_or_redirect(
    request: Request,
    session: str | None = Cookie(default=None),
) -> None:
    _RedirectToLoginPage = HTTPException(
        status_code=302,
        headers={"Location": request.url_for("login")},
    )

    if not session:
        raise _RedirectToLoginPage

    try:
        loaded_session = session_serializer.loads(session, max_age=3600 * 12)
    except Exception:
        raise _RedirectToLoginPage

    if not loaded_session.get("is_logged_in"):
        raise _RedirectToLoginPage

    return None
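The session check above leans on a signed, timestamped serializer (`session_serializer` from `app.config`). A rough stdlib-only sketch of the same idea — sign a payload, reject tampered or over-age cookies — assuming a simple HMAC scheme rather than the project's actual serializer:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"not-the-real-secret"  # illustrative only


def dumps(payload: dict) -> str:
    # Serialize payload + timestamp, then append an HMAC signature
    body = base64.urlsafe_b64encode(
        json.dumps({"p": payload, "ts": int(time.time())}).encode()
    )
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig


def loads(token: str, max_age: int) -> dict:
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    data = json.loads(base64.urlsafe_b64decode(body.encode()))
    if time.time() - data["ts"] > max_age:
        raise ValueError("session expired")
    return data["p"]


cookie = dumps({"is_logged_in": True})
assert loads(cookie, max_age=3600 * 12) == {"is_logged_in": True}
```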
router = APIRouter(
    dependencies=[Depends(user_session_or_redirect)],
)
unauthenticated_router = APIRouter()


@router.get("/")
def admin_index(
    request: Request,
    db: Session = Depends(get_db),
) -> templates.TemplateResponse:
    return templates.render_template(db, request, "index.html", {"request": request})


@router.get("/lookup")
def get_lookup(
    request: Request,
    query: str | None = None,
    db: Session = Depends(get_db),
) -> templates.TemplateResponse:
    ap_object = None
    actors_metadata = {}
    if query:
        ap_object = lookup(db, query)
        if ap_object.ap_type in ap.ACTOR_TYPES:
            actors_metadata = get_actors_metadata(db, [ap_object])
        else:
            actors_metadata = get_actors_metadata(db, [ap_object.actor])
        print(ap_object)
    return templates.render_template(
        db,
        request,
        "lookup.html",
        {
            "query": query,
            "ap_object": ap_object,
            "actors_metadata": actors_metadata,
        },
    )


@router.get("/new")
def admin_new(
    request: Request,
    query: str | None = None,
    db: Session = Depends(get_db),
) -> templates.TemplateResponse:
    return templates.render_template(
        db,
        request,
        "admin_new.html",
        {},
    )


@router.get("/stream")
def stream(
    request: Request,
    db: Session = Depends(get_db),
) -> templates.TemplateResponse:
    stream = (
        db.query(models.InboxObject)
        .filter(
            models.InboxObject.ap_type.in_(["Note", "Article", "Video", "Announce"]),
            models.InboxObject.is_hidden_from_stream.is_(False),
            models.InboxObject.undone_by_inbox_object_id.is_(None),
        )
        .options(
            # joinedload(models.InboxObject.relates_to_inbox_object),
            joinedload(models.InboxObject.relates_to_outbox_object),
        )
        .order_by(models.InboxObject.ap_published_at.desc())
        .limit(20)
        .all()
    )
    return templates.render_template(
        db,
        request,
        "admin_stream.html",
        {
            "stream": stream,
        },
    )


@router.get("/notifications")
def get_notifications(
    request: Request, db: Session = Depends(get_db)
) -> templates.TemplateResponse:
    notifications = (
        db.query(models.Notification)
        .options(
            joinedload(models.Notification.actor),
            joinedload(models.Notification.inbox_object),
            joinedload(models.Notification.outbox_object),
        )
        .order_by(models.Notification.created_at.desc())
        .all()
    )
    actors_metadata = get_actors_metadata(
        db, [notif.actor for notif in notifications if notif.actor]
    )

    for notif in notifications:
        notif.is_new = False
    db.commit()

    return templates.render_template(
        db,
        request,
        "notifications.html",
        {
            "notifications": notifications,
            "actors_metadata": actors_metadata,
        },
    )


@router.post("/actions/follow")
def admin_actions_follow(
    request: Request,
    ap_actor_id: str = Form(),
    redirect_url: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
    db: Session = Depends(get_db),
) -> RedirectResponse:
    print(f"Following {ap_actor_id}")
    send_follow(db, ap_actor_id)
    return RedirectResponse(redirect_url, status_code=302)


@router.post("/actions/like")
def admin_actions_like(
    request: Request,
    ap_object_id: str = Form(),
    redirect_url: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
    db: Session = Depends(get_db),
) -> RedirectResponse:
    boxes.send_like(db, ap_object_id)
    return RedirectResponse(redirect_url, status_code=302)


@router.post("/actions/undo")
def admin_actions_undo(
    request: Request,
    ap_object_id: str = Form(),
    redirect_url: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
    db: Session = Depends(get_db),
) -> RedirectResponse:
    boxes.send_undo(db, ap_object_id)
    return RedirectResponse(redirect_url, status_code=302)


@router.post("/actions/announce")
def admin_actions_announce(
    request: Request,
    ap_object_id: str = Form(),
    redirect_url: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
    db: Session = Depends(get_db),
) -> RedirectResponse:
    boxes.send_announce(db, ap_object_id)
    return RedirectResponse(redirect_url, status_code=302)


@router.post("/actions/bookmark")
def admin_actions_bookmark(
    request: Request,
    ap_object_id: str = Form(),
    redirect_url: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
    db: Session = Depends(get_db),
) -> RedirectResponse:
    inbox_object = get_inbox_object_by_ap_id(db, ap_object_id)
    if not inbox_object:
        raise ValueError("Should never happen")
    inbox_object.is_bookmarked = True
    db.commit()
    return RedirectResponse(redirect_url, status_code=302)


@router.post("/actions/new")
async def admin_actions_new(
    request: Request,
    files: list[UploadFile],
    content: str = Form(),
    redirect_url: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
    db: Session = Depends(get_db),
) -> RedirectResponse:
    # XXX: for some reason, no files results in an empty single file
    if len(files) >= 1 and files[0].filename:
        print("Got files")
    public_id = boxes.send_create(db, content)
    return RedirectResponse(
        request.url_for("outbox_by_public_id", public_id=public_id),
        status_code=302,
    )


@unauthenticated_router.get("/login")
def login(
    request: Request,
    db: Session = Depends(get_db),
) -> templates.TemplateResponse:
    return templates.render_template(
        db,
        request,
        "login.html",
        {"csrf_token": generate_csrf_token()},
    )


@unauthenticated_router.post("/login")
def login_validation(
    request: Request,
    password: str = Form(),
    csrf_check: None = Depends(verify_csrf_token),
) -> RedirectResponse:
    if not verify_password(password):
        raise HTTPException(status_code=401)

    resp = RedirectResponse("/admin", status_code=302)
    resp.set_cookie("session", session_serializer.dumps({"is_logged_in": True}))  # type: ignore # noqa: E501

    return resp


@router.get("/logout")
def logout(
    request: Request,
) -> RedirectResponse:
    resp = RedirectResponse(request.url_for("index"), status_code=302)
    resp.set_cookie("session", session_serializer.dumps({"is_logged_in": False}))  # type: ignore # noqa: E501
    return resp
183
app/ap_object.py
Normal file

@ -0,0 +1,183 @@
import hashlib
from datetime import datetime
from typing import Any

import pydantic
from dateutil.parser import isoparse
from markdown import markdown

from app import activitypub as ap
from app import opengraph
from app.actor import LOCAL_ACTOR
from app.actor import Actor
from app.actor import RemoteActor


class Object:
    @property
    def is_from_db(self) -> bool:
        return False

    @property
    def ap_type(self) -> str:
        return self.ap_object["type"]

    @property
    def ap_object(self) -> ap.RawObject:
        raise NotImplementedError

    @property
    def ap_id(self) -> str:
        return ap.get_id(self.ap_object["id"])

    @property
    def ap_actor_id(self) -> str:
        return ap.get_actor_id(self.ap_object)

    @property
    def ap_published_at(self) -> datetime | None:
        # TODO: default to None? or now()?
        if "published" in self.ap_object:
            return isoparse(self.ap_object["published"])
        elif "created" in self.ap_object:
            return isoparse(self.ap_object["created"])
        return None

    @property
    def actor(self) -> Actor:
        raise NotImplementedError()

    @property
    def visibility(self) -> ap.VisibilityEnum:
        return ap.object_visibility(self.ap_object)

    @property
    def context(self) -> str | None:
        return self.ap_object.get("context")

    @property
    def sensitive(self) -> bool:
        return self.ap_object.get("sensitive", False)

    @property
    def attachments(self) -> list["Attachment"]:
        attachments = [
            Attachment.parse_obj(obj) for obj in self.ap_object.get("attachment", [])
        ]

        # Also add any video Link (for PeerTube compat)
        if self.ap_type == "Video":
            for link in ap.as_list(self.ap_object.get("url", [])):
                if isinstance(link, dict) and link.get("type") == "Link":
                    if link.get("mediaType", "").startswith("video"):
                        attachments.append(
                            Attachment(
                                type="Video",
                                mediaType=link["mediaType"],
                                url=link["href"],
                            )
                        )
                        break

        return attachments

    @property
    def url(self) -> str | None:
        obj_url = self.ap_object.get("url")
        if isinstance(obj_url, str):
            return obj_url
        elif obj_url:
            for u in ap.as_list(obj_url):
                if u.get("mediaType") == "text/html":
                    return u["href"]

        return None

    @property
    def content(self) -> str | None:
        content = self.ap_object.get("content")
        if not content:
            return None

        # PeerTube returns the content as markdown
        if self.ap_object.get("mediaType") == "text/markdown":
            return markdown(content, extensions=["mdx_linkify"])

        return content

    @property
    def permalink_id(self) -> str:
        return (
            "permalink-"
            + hashlib.md5(
                self.ap_id.encode(),
                usedforsecurity=False,
            ).hexdigest()
        )

    @property
    def activity_object_ap_id(self) -> str | None:
        if "object" in self.ap_object:
            return ap.get_id(self.ap_object["object"])

        return None

    @property
    def in_reply_to(self) -> str | None:
        return self.ap_object.get("inReplyTo")


def _to_camel(string: str) -> str:
    cased = "".join(word.capitalize() for word in string.split("_"))
    return cased[0:1].lower() + cased[1:]


class BaseModel(pydantic.BaseModel):
    class Config:
        alias_generator = _to_camel


class Attachment(BaseModel):
    type: str
    media_type: str
    name: str | None
    url: str
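The `alias_generator` above maps snake_case field names to the camelCase keys used in ActivityPub JSON (e.g. `media_type` to `mediaType`). The conversion itself is easy to check in isolation:

```python
def to_camel(string: str) -> str:
    # "media_type" -> "mediaType": capitalize each word, lowercase the first char
    cased = "".join(word.capitalize() for word in string.split("_"))
    return cased[0:1].lower() + cased[1:]


print(to_camel("media_type"))  # → mediaType
print(to_camel("url"))         # → url
```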
class RemoteObject(Object):
    def __init__(self, raw_object: ap.RawObject, actor: Actor | None = None):
        self._raw_object = raw_object
        self._actor: Actor

        # Pre-fetch the actor
        actor_id = ap.get_actor_id(raw_object)
        if actor_id == LOCAL_ACTOR.ap_id:
            self._actor = LOCAL_ACTOR
        elif actor:
            if actor.ap_id != actor_id:
                raise ValueError(
                    f"Invalid actor, got {actor.ap_id}, expected {actor_id}"
                )
            self._actor = actor
        else:
            self._actor = RemoteActor(
                ap_actor=ap.fetch(ap.get_actor_id(raw_object)),
            )

        self._og_meta = None
        if self.ap_type == "Note":
            self._og_meta = opengraph.og_meta_from_note(self._raw_object)

    @property
    def og_meta(self) -> list[dict[str, Any]] | None:
        if self._og_meta:
            return [og_meta.dict() for og_meta in self._og_meta]
        return None

    @property
    def ap_object(self) -> ap.RawObject:
        return self._raw_object

    @property
    def actor(self) -> Actor:
        return self._actor
684
app/boxes.py
Normal file

@ -0,0 +1,684 @@
"""Actions related to the AP inbox/outbox."""
import uuid
from urllib.parse import urlparse

import httpx
from dateutil.parser import isoparse
from loguru import logger
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session
from sqlalchemy.orm import joinedload

from app import activitypub as ap
from app import config
from app import models
from app.actor import LOCAL_ACTOR
from app.actor import RemoteActor
from app.actor import fetch_actor
from app.actor import save_actor
from app.ap_object import RemoteObject
from app.config import BASE_URL
from app.config import ID
from app.database import now
from app.process_outgoing_activities import new_outgoing_activity
from app.source import markdownify


def allocate_outbox_id() -> str:
    return uuid.uuid4().hex


def outbox_object_id(outbox_id) -> str:
    return f"{BASE_URL}/o/{outbox_id}"
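Outbox objects get a random hex public ID that is then turned into a dereferenceable URL under `/o/`. A standalone sketch of the two helpers above, with `BASE_URL` as an illustrative stand-in for `config.BASE_URL`:

```python
import uuid

BASE_URL = "https://example.com"  # stand-in for config.BASE_URL


def allocate_outbox_id() -> str:
    # 32 hex chars, no dashes: compact and URL-safe
    return uuid.uuid4().hex


def outbox_object_id(outbox_id: str) -> str:
    return f"{BASE_URL}/o/{outbox_id}"


public_id = allocate_outbox_id()
assert len(public_id) == 32
print(outbox_object_id(public_id))
```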
def save_outbox_object(
|
||||||
|
db: Session,
|
||||||
|
public_id: str,
|
||||||
|
raw_object: ap.RawObject,
|
||||||
|
relates_to_inbox_object_id: int | None = None,
|
||||||
|
relates_to_outbox_object_id: int | None = None,
|
||||||
|
source: str | None = None,
|
||||||
|
) -> models.OutboxObject:
|
||||||
|
ra = RemoteObject(raw_object)
|
||||||
|
|
||||||
|
outbox_object = models.OutboxObject(
|
||||||
|
public_id=public_id,
|
||||||
|
ap_type=ra.ap_type,
|
||||||
|
ap_id=ra.ap_id,
|
||||||
|
ap_context=ra.context,
|
||||||
|
ap_object=ra.ap_object,
|
||||||
|
visibility=ra.visibility,
|
||||||
|
og_meta=ra.og_meta,
|
||||||
|
relates_to_inbox_object_id=relates_to_inbox_object_id,
|
||||||
|
relates_to_outbox_object_id=relates_to_outbox_object_id,
|
||||||
|
activity_object_ap_id=ra.activity_object_ap_id,
|
||||||
|
is_hidden_from_homepage=True if ra.in_reply_to else False,
|
||||||
|
)
|
||||||
|
db.add(outbox_object)
|
||||||
|
db.commit()
|
||||||
|
db.refresh(outbox_object)
|
||||||
|
|
||||||
|
return outbox_object
|
||||||
|
|
||||||
|
|
||||||
|
def send_like(db: Session, ap_object_id: str) -> None:
    inbox_object = get_inbox_object_by_ap_id(db, ap_object_id)
    if not inbox_object:
        raise ValueError(f"{ap_object_id} not found in the inbox")

    like_id = allocate_outbox_id()
    like = {
        "@context": ap.AS_CTX,
        "id": outbox_object_id(like_id),
        "type": "Like",
        "actor": ID,
        "object": ap_object_id,
    }
    outbox_object = save_outbox_object(
        db, like_id, like, relates_to_inbox_object_id=inbox_object.id
    )
    if not outbox_object.id:
        raise ValueError("Should never happen")

    inbox_object.liked_via_outbox_object_ap_id = outbox_object.ap_id
    db.commit()

    new_outgoing_activity(db, inbox_object.actor.inbox_url, outbox_object.id)


def send_announce(db: Session, ap_object_id: str) -> None:
    inbox_object = get_inbox_object_by_ap_id(db, ap_object_id)
    if not inbox_object:
        raise ValueError(f"{ap_object_id} not found in the inbox")

    announce_id = allocate_outbox_id()
    announce = {
        "@context": ap.AS_CTX,
        "id": outbox_object_id(announce_id),
        "type": "Announce",
        "actor": ID,
        "object": ap_object_id,
        "to": [ap.AS_PUBLIC],
        "cc": [
            f"{BASE_URL}/followers",
            inbox_object.ap_actor_id,
        ],
    }
    outbox_object = save_outbox_object(
        db, announce_id, announce, relates_to_inbox_object_id=inbox_object.id
    )
    if not outbox_object.id:
        raise ValueError("Should never happen")

    inbox_object.announced_via_outbox_object_ap_id = outbox_object.ap_id
    db.commit()

    recipients = _compute_recipients(db, announce)
    for rcp in recipients:
        new_outgoing_activity(db, rcp, outbox_object.id)
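The Like and Announce builders above both wrap a plain activity dict around a freshly allocated outbox id. A minimal standalone sketch of that shape — the `AS_CTX`, `BASE_URL`, and `ACTOR_ID` constants here are illustrative stand-ins, not the app's actual config:

```python
# Hypothetical stand-ins for ap.AS_CTX and the instance configuration.
AS_CTX = "https://www.w3.org/ns/activitystreams"
BASE_URL = "https://example.com"
ACTOR_ID = BASE_URL


def build_like(outbox_id: str, ap_object_id: str) -> dict:
    # Mirrors the shape built by send_like: a Like whose object is the
    # AP id of the inbox object being liked.
    return {
        "@context": AS_CTX,
        "id": f"{BASE_URL}/o/{outbox_id}",
        "type": "Like",
        "actor": ACTOR_ID,
        "object": ap_object_id,
    }


like = build_like("abc123", "https://remote.example/notes/1")
```

An Announce differs only in adding `to`/`cc` addressing (the public collection plus the followers collection and the original author).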
def send_follow(db: Session, ap_actor_id: str) -> None:
    actor = fetch_actor(db, ap_actor_id)

    follow_id = allocate_outbox_id()
    follow = {
        "@context": ap.AS_CTX,
        "id": outbox_object_id(follow_id),
        "type": "Follow",
        "actor": ID,
        "object": ap_actor_id,
    }

    outbox_object = save_outbox_object(db, follow_id, follow)
    if not outbox_object.id:
        raise ValueError("Should never happen")

    new_outgoing_activity(db, actor.inbox_url, outbox_object.id)


def send_undo(db: Session, ap_object_id: str) -> None:
    outbox_object_to_undo = get_outbox_object_by_ap_id(db, ap_object_id)
    if not outbox_object_to_undo:
        raise ValueError(f"{ap_object_id} not found in the outbox")

    if outbox_object_to_undo.ap_type not in ["Follow", "Like", "Announce"]:
        raise ValueError(
            f"Cannot build Undo for {outbox_object_to_undo.ap_type} activity"
        )

    undo_id = allocate_outbox_id()
    undo = {
        "@context": ap.AS_CTX,
        "id": outbox_object_id(undo_id),
        "type": "Undo",
        "actor": ID,
        "object": ap.remove_context(outbox_object_to_undo.ap_object),
    }

    outbox_object = save_outbox_object(
        db,
        undo_id,
        undo,
        relates_to_outbox_object_id=outbox_object_to_undo.id,
    )
    if not outbox_object.id:
        raise ValueError("Should never happen")

    outbox_object_to_undo.undone_by_outbox_object_id = outbox_object.id

    if outbox_object_to_undo.ap_type == "Follow":
        if not outbox_object_to_undo.activity_object_ap_id:
            raise ValueError("Should never happen")
        followed_actor = fetch_actor(db, outbox_object_to_undo.activity_object_ap_id)
        new_outgoing_activity(
            db,
            followed_actor.inbox_url,
            outbox_object.id,
        )
        # Also remove the follow from the following collection
        db.query(models.Following).filter(
            models.Following.ap_actor_id == followed_actor.ap_id
        ).delete()
        db.commit()
    elif outbox_object_to_undo.ap_type == "Like":
        liked_object_ap_id = outbox_object_to_undo.activity_object_ap_id
        if not liked_object_ap_id:
            raise ValueError("Should never happen")
        liked_object = get_inbox_object_by_ap_id(db, liked_object_ap_id)
        if not liked_object:
            raise ValueError(f"Cannot find liked object {liked_object_ap_id}")
        liked_object.liked_via_outbox_object_ap_id = None

        # Send the Undo to the liked object's actor
        new_outgoing_activity(
            db,
            liked_object.actor.inbox_url,  # type: ignore
            outbox_object.id,
        )
    elif outbox_object_to_undo.ap_type == "Announce":
        announced_object_ap_id = outbox_object_to_undo.activity_object_ap_id
        if not announced_object_ap_id:
            raise ValueError("Should never happen")
        announced_object = get_inbox_object_by_ap_id(db, announced_object_ap_id)
        if not announced_object:
            raise ValueError(f"Cannot find announced object {announced_object_ap_id}")
        announced_object.announced_via_outbox_object_ap_id = None

        # Send the Undo to the original recipients
        recipients = _compute_recipients(db, outbox_object.ap_object)
        for rcp in recipients:
            new_outgoing_activity(db, rcp, outbox_object.id)
    else:
        raise ValueError("Should never happen")
def send_create(db: Session, source: str) -> str:
    note_id = allocate_outbox_id()
    published = now().replace(microsecond=0).isoformat().replace("+00:00", "Z")
    context = f"{ID}/contexts/" + uuid.uuid4().hex
    content, tags = markdownify(db, source)
    note = {
        "@context": ap.AS_CTX,
        "type": "Note",
        "id": outbox_object_id(note_id),
        "attributedTo": ID,
        "content": content,
        "to": [ap.AS_PUBLIC],
        "cc": [f"{BASE_URL}/followers"],
        "published": published,
        "context": context,
        "conversation": context,
        "url": outbox_object_id(note_id),
        "tag": tags,
        "summary": None,
        "inReplyTo": None,
        "sensitive": False,
    }
    outbox_object = save_outbox_object(db, note_id, note, source=source)
    if not outbox_object.id:
        raise ValueError("Should never happen")

    for tag in tags:
        if tag["type"] == "Hashtag":
            tagged_object = models.TaggedOutboxObject(
                tag=tag["name"][1:],
                outbox_object_id=outbox_object.id,
            )
            db.add(tagged_object)
            db.commit()

    recipients = _compute_recipients(db, note)
    for rcp in recipients:
        new_outgoing_activity(db, rcp, outbox_object.id)

    return note_id


def _compute_recipients(db: Session, ap_object: ap.RawObject) -> set[str]:
    _recipients = []
    for field in ["to", "cc", "bto", "bcc"]:
        if field in ap_object:
            _recipients.extend(ap.as_list(ap_object[field]))

    recipients = set()
    for r in _recipients:
        if r in [ap.AS_PUBLIC, ID]:
            continue

        # If we got a local collection, assume it's a collection of actors
        if r.startswith(BASE_URL):
            for raw_actor in fetch_collection(db, r):
                actor = RemoteActor(raw_actor)
                recipients.add(actor.shared_inbox_url or actor.inbox_url)

            continue

        # Is it a known actor?
        known_actor = (
            db.query(models.Actor).filter(models.Actor.ap_id == r).one_or_none()
        )
        if known_actor:
            recipients.add(known_actor.shared_inbox_url or known_actor.inbox_url)
            continue

        # Fetch the object
        raw_object = ap.fetch(r)
        if raw_object.get("type") in ap.ACTOR_TYPES:
            saved_actor = save_actor(db, raw_object)
            recipients.add(saved_actor.shared_inbox_url or saved_actor.inbox_url)
        else:
            # Assume it's a collection of actors
            for raw_actor in ap.parse_collection(payload=raw_object):
                actor = RemoteActor(raw_actor)
                recipients.add(actor.shared_inbox_url or actor.inbox_url)

    return recipients
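Before resolving inboxes, _compute_recipients first flattens the to/cc/bto/bcc fields and drops the public collection and the server's own id. A pure-Python sketch of that first step, with illustrative stand-in constants instead of the app's config:

```python
# Hypothetical stand-ins for ap.AS_PUBLIC and the instance's own ID.
AS_PUBLIC = "https://www.w3.org/ns/activitystreams#Public"
LOCAL_ID = "https://example.com"


def flatten_addressing(ap_object: dict) -> set[str]:
    # Collect every addressed id across the four addressing fields,
    # tolerating both a bare string and a list of strings, then drop
    # the public collection and our own actor id.
    raw: list[str] = []
    for field in ["to", "cc", "bto", "bcc"]:
        value = ap_object.get(field)
        if value is None:
            continue
        raw.extend([value] if isinstance(value, str) else value)
    return {r for r in raw if r not in (AS_PUBLIC, LOCAL_ID)}


recipients = flatten_addressing(
    {
        "to": [AS_PUBLIC],
        "cc": ["https://example.com/followers", "https://remote.example/users/bob"],
    }
)
```

The remaining ids are then resolved to (shared) inbox URLs: local collections are expanded via fetch_collection, known actors are looked up in the database, and anything else is fetched over HTTP.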
def get_inbox_object_by_ap_id(db: Session, ap_id: str) -> models.InboxObject | None:
    return (
        db.query(models.InboxObject)
        .filter(models.InboxObject.ap_id == ap_id)
        .one_or_none()
    )


def get_outbox_object_by_ap_id(db: Session, ap_id: str) -> models.OutboxObject | None:
    return (
        db.query(models.OutboxObject)
        .filter(models.OutboxObject.ap_id == ap_id)
        .one_or_none()
    )


def _handle_delete_activity(
    db: Session,
    from_actor: models.Actor,
    ap_object_to_delete: models.InboxObject,
) -> None:
    if from_actor.ap_id != ap_object_to_delete.actor.ap_id:
        logger.warning(
            "Actor mismatch between the activity and the object: "
            f"{from_actor.ap_id}/{ap_object_to_delete.actor.ap_id}"
        )
        return

    # TODO(ts): do we need to delete related activities? should we keep
    # bookmarked objects with a deleted flag?
    logger.info(f"Deleting {ap_object_to_delete.ap_type}/{ap_object_to_delete.ap_id}")
    db.delete(ap_object_to_delete)
    db.flush()


def _handle_follow_follow_activity(
    db: Session,
    from_actor: models.Actor,
    inbox_object: models.InboxObject,
) -> None:
    follower = models.Follower(
        actor_id=from_actor.id,
        inbox_object_id=inbox_object.id,
        ap_actor_id=from_actor.ap_id,
    )
    try:
        db.add(follower)
        db.flush()
    except IntegrityError:
        pass  # TODO: update the existing follower

    # Reply with an Accept
    reply_id = allocate_outbox_id()
    reply = {
        "@context": ap.AS_CTX,
        "id": outbox_object_id(reply_id),
        "type": "Accept",
        "actor": ID,
        "object": inbox_object.ap_id,
    }
    outbox_activity = save_outbox_object(db, reply_id, reply)
    if not outbox_activity.id:
        raise ValueError("Should never happen")
    new_outgoing_activity(db, from_actor.inbox_url, outbox_activity.id)

    notif = models.Notification(
        notification_type=models.NotificationType.NEW_FOLLOWER,
        actor_id=from_actor.id,
    )
    db.add(notif)
def _handle_undo_activity(
    db: Session,
    from_actor: models.Actor,
    undo_activity: models.InboxObject,
    ap_activity_to_undo: models.InboxObject,
) -> None:
    if from_actor.ap_id != ap_activity_to_undo.actor.ap_id:
        logger.warning(
            "Actor mismatch between the activity and the object: "
            f"{from_actor.ap_id}/{ap_activity_to_undo.actor.ap_id}"
        )
        return

    ap_activity_to_undo.undone_by_inbox_object_id = undo_activity.id

    if ap_activity_to_undo.ap_type == "Follow":
        logger.info(f"Undo follow from {from_actor.ap_id}")
        db.query(models.Follower).filter(
            models.Follower.inbox_object_id == ap_activity_to_undo.id
        ).delete()
        notif = models.Notification(
            notification_type=models.NotificationType.UNFOLLOW,
            actor_id=from_actor.id,
        )
        db.add(notif)

    elif ap_activity_to_undo.ap_type == "Like":
        if not ap_activity_to_undo.activity_object_ap_id:
            raise ValueError("Like without object")
        liked_obj = get_outbox_object_by_ap_id(
            db,
            ap_activity_to_undo.activity_object_ap_id,
        )
        if not liked_obj:
            logger.warning(
                "Cannot find liked object: "
                f"{ap_activity_to_undo.activity_object_ap_id}"
            )
            return

        liked_obj.likes_count = models.OutboxObject.likes_count - 1
        notif = models.Notification(
            notification_type=models.NotificationType.UNDO_LIKE,
            actor_id=from_actor.id,
            outbox_object_id=liked_obj.id,
            inbox_object_id=ap_activity_to_undo.id,
        )
        db.add(notif)

    elif ap_activity_to_undo.ap_type == "Announce":
        if not ap_activity_to_undo.activity_object_ap_id:
            raise ValueError("Announce without object")
        announced_obj_ap_id = ap_activity_to_undo.activity_object_ap_id
        logger.info(
            f"Undo for announce {ap_activity_to_undo.ap_id}/{announced_obj_ap_id}"
        )
        if announced_obj_ap_id.startswith(BASE_URL):
            announced_obj_from_outbox = get_outbox_object_by_ap_id(
                db, announced_obj_ap_id
            )
            if announced_obj_from_outbox:
                logger.info("Found in the outbox")
                announced_obj_from_outbox.announces_count = (
                    models.OutboxObject.announces_count - 1
                )
                notif = models.Notification(
                    notification_type=models.NotificationType.UNDO_ANNOUNCE,
                    actor_id=from_actor.id,
                    outbox_object_id=announced_obj_from_outbox.id,
                    inbox_object_id=ap_activity_to_undo.id,
                )
                db.add(notif)

        # FIXME(ts): what to do with ap_activity_to_undo? flag? delete?
    else:
        logger.warning(f"Don't know how to undo {ap_activity_to_undo.ap_type} activity")

    # commit will be performed in save_to_inbox
def _handle_create_activity(
    db: Session,
    from_actor: models.Actor,
    created_object: models.InboxObject,
) -> None:
    logger.info("Processing Create activity")
    tags = created_object.ap_object.get("tag")

    if not tags:
        logger.info("No tags to process")
        return None

    if not isinstance(tags, list):
        logger.info(f"Invalid tags: {tags}")
        return None

    for tag in tags:
        if tag.get("name") == LOCAL_ACTOR.handle or tag.get("href") == LOCAL_ACTOR.url:
            notif = models.Notification(
                notification_type=models.NotificationType.MENTION,
                actor_id=from_actor.id,
                inbox_object_id=created_object.id,
            )
            db.add(notif)
def save_to_inbox(db: Session, raw_object: ap.RawObject) -> None:
    try:
        actor = fetch_actor(db, raw_object["actor"])
    except httpx.HTTPStatusError:
        logger.exception("Failed to fetch actor")
        # XXX: Delete on 410 when we have never seen the actor
        return

    ap_published_at = now()
    if "published" in raw_object:
        ap_published_at = isoparse(raw_object["published"])

    ra = RemoteObject(ap.unwrap_activity(raw_object), actor=actor)
    relates_to_inbox_object: models.InboxObject | None = None
    relates_to_outbox_object: models.OutboxObject | None = None
    if ra.activity_object_ap_id:
        if ra.activity_object_ap_id.startswith(BASE_URL):
            relates_to_outbox_object = get_outbox_object_by_ap_id(
                db,
                ra.activity_object_ap_id,
            )
        else:
            relates_to_inbox_object = get_inbox_object_by_ap_id(
                db,
                ra.activity_object_ap_id,
            )

    inbox_object = models.InboxObject(
        server=urlparse(ra.ap_id).netloc,
        actor_id=actor.id,
        ap_actor_id=actor.ap_id,
        ap_type=ra.ap_type,
        ap_id=ra.ap_id,
        ap_context=ra.context,
        ap_published_at=ap_published_at,
        ap_object=ra.ap_object,
        visibility=ra.visibility,
        relates_to_inbox_object_id=relates_to_inbox_object.id
        if relates_to_inbox_object
        else None,
        relates_to_outbox_object_id=relates_to_outbox_object.id
        if relates_to_outbox_object
        else None,
        activity_object_ap_id=ra.activity_object_ap_id,
        # Hide replies from the stream
        is_hidden_from_stream=True if ra.in_reply_to else False,
    )

    db.add(inbox_object)
    db.flush()
    db.refresh(inbox_object)

    if ra.ap_type == "Create":
        _handle_create_activity(db, actor, inbox_object)
    elif ra.ap_type == "Update":
        pass
    elif ra.ap_type == "Delete":
        if relates_to_inbox_object:
            _handle_delete_activity(db, actor, relates_to_inbox_object)
        else:
            # TODO(ts): handle delete actor
            logger.info(
                f"Received a Delete for an unknown object: {ra.activity_object_ap_id}"
            )
    elif ra.ap_type == "Follow":
        _handle_follow_follow_activity(db, actor, inbox_object)
    elif ra.ap_type == "Undo":
        if relates_to_inbox_object:
            _handle_undo_activity(db, actor, inbox_object, relates_to_inbox_object)
        else:
            logger.info("Received Undo for an unknown activity")
    elif ra.ap_type in ["Accept", "Reject"]:
        if not relates_to_outbox_object:
            logger.info(
                f"Received {raw_object['type']} for an unknown activity: "
                f"{ra.activity_object_ap_id}"
            )
        else:
            if relates_to_outbox_object.ap_type == "Follow":
                following = models.Following(
                    actor_id=actor.id,
                    outbox_object_id=relates_to_outbox_object.id,
                    ap_actor_id=actor.ap_id,
                )
                db.add(following)
            else:
                logger.info(
                    "Received an Accept for an unsupported activity: "
                    f"{relates_to_outbox_object.ap_type}"
                )
    elif ra.ap_type == "Like":
        if not relates_to_outbox_object:
            logger.info(
                f"Received a like for an unknown activity: {ra.activity_object_ap_id}"
            )
        else:
            relates_to_outbox_object.likes_count = models.OutboxObject.likes_count + 1

            notif = models.Notification(
                notification_type=models.NotificationType.LIKE,
                actor_id=actor.id,
                outbox_object_id=relates_to_outbox_object.id,
                inbox_object_id=inbox_object.id,
            )
            db.add(notif)
    elif raw_object["type"] == "Announce":
        if relates_to_outbox_object:
            # This is an announce for a local object
            relates_to_outbox_object.announces_count = (
                models.OutboxObject.announces_count + 1
            )

            notif = models.Notification(
                notification_type=models.NotificationType.ANNOUNCE,
                actor_id=actor.id,
                outbox_object_id=relates_to_outbox_object.id,
                inbox_object_id=inbox_object.id,
            )
            db.add(notif)
        else:
            # This is an announce for a possibly unknown object
            if relates_to_inbox_object:
                logger.info("Nothing to do, we already know about this object")
            else:
                # Save it as an inbox object
                if not ra.activity_object_ap_id:
                    raise ValueError("Should never happen")
                announced_raw_object = ap.fetch(ra.activity_object_ap_id)
                announced_actor = fetch_actor(db, ap.get_actor_id(announced_raw_object))
                announced_object = RemoteObject(announced_raw_object, announced_actor)
                announced_inbox_object = models.InboxObject(
                    server=urlparse(announced_object.ap_id).netloc,
                    actor_id=announced_actor.id,
                    ap_actor_id=announced_actor.ap_id,
                    ap_type=announced_object.ap_type,
                    ap_id=announced_object.ap_id,
                    ap_context=announced_object.context,
                    ap_published_at=announced_object.ap_published_at,
                    ap_object=announced_object.ap_object,
                    visibility=announced_object.visibility,
                    is_hidden_from_stream=True,
                )
                db.add(announced_inbox_object)
                db.flush()
                inbox_object.relates_to_inbox_object_id = announced_inbox_object.id
    elif ra.ap_type in ["Like", "Announce"]:
        if not relates_to_outbox_object:
            logger.info(
                f"Received {ra.ap_type} for an unknown activity: "
                f"{ra.activity_object_ap_id}"
            )
        else:
            if ra.ap_type == "Like":
                # TODO(ts): notification
                relates_to_outbox_object.likes_count = (
                    models.OutboxObject.likes_count + 1
                )

                notif = models.Notification(
                    notification_type=models.NotificationType.LIKE,
                    actor_id=actor.id,
                    outbox_object_id=relates_to_outbox_object.id,
                    inbox_object_id=inbox_object.id,
                )
                db.add(notif)
            elif raw_object["type"] == "Announce":
                # TODO(ts): notification
                relates_to_outbox_object.announces_count = (
                    models.OutboxObject.announces_count + 1
                )

                notif = models.Notification(
                    notification_type=models.NotificationType.ANNOUNCE,
                    actor_id=actor.id,
                    outbox_object_id=relates_to_outbox_object.id,
                    inbox_object_id=inbox_object.id,
                )
                db.add(notif)
            else:
                raise ValueError("Should never happen")

    else:
        logger.warning(f"Received an unknown {inbox_object.ap_type} object")

    db.commit()
def public_outbox_objects_count(db: Session) -> int:
    return (
        db.query(models.OutboxObject)
        .filter(
            models.OutboxObject.visibility == ap.VisibilityEnum.PUBLIC,
            models.OutboxObject.is_deleted.is_(False),
        )
        .count()
    )


def fetch_collection(db: Session, url: str) -> list[ap.RawObject]:
    if url.startswith(config.BASE_URL):
        if url == config.BASE_URL + "/followers":
            q = db.query(models.Follower).options(joinedload(models.Follower.actor))
            return [follower.actor.ap_actor for follower in q.all()]
        else:
            raise ValueError(f"internal collection for {url} not supported")

    return ap.parse_collection(url)
93
app/config.py
Normal file
@@ -0,0 +1,93 @@
import os
from pathlib import Path

import bcrypt
import pydantic
import tomli
from fastapi import Form
from fastapi import HTTPException
from fastapi import Request
from itsdangerous import TimedSerializer
from itsdangerous import TimestampSigner

ROOT_DIR = Path().parent.resolve()

_CONFIG_FILE = os.getenv("MICROBLOGPUB_CONFIG_FILE", "me.toml")

VERSION = "2.0"
USER_AGENT = f"microblogpub/{VERSION}"
AP_CONTENT_TYPE = "application/activity+json"


class Config(pydantic.BaseModel):
    domain: str
    username: str
    admin_password: bytes
    name: str
    summary: str
    https: bool
    icon_url: str
    secret: str
    debug: bool = False

    # Config items to make tests easier
    sqlalchemy_database_url: str | None = None
    key_path: str | None = None


def load_config() -> Config:
    try:
        return Config.parse_obj(
            tomli.loads((ROOT_DIR / "data" / _CONFIG_FILE).read_text())
        )
    except FileNotFoundError:
        raise ValueError("Please run the configuration wizard")


def is_activitypub_requested(req: Request) -> bool:
    accept_value = req.headers.get("accept")
    if not accept_value:
        return False
    for val in {
        "application/ld+json",
        "application/activity+json",
    }:
        if accept_value.startswith(val):
            return True

    return False


def verify_password(pwd: str) -> bool:
    return bcrypt.checkpw(pwd.encode(), CONFIG.admin_password)


CONFIG = load_config()
DOMAIN = CONFIG.domain
_SCHEME = "https" if CONFIG.https else "http"
ID = f"{_SCHEME}://{DOMAIN}"
USERNAME = CONFIG.username
BASE_URL = ID
DEBUG = CONFIG.debug
DB_PATH = ROOT_DIR / "data" / "microblogpub.db"
SQLALCHEMY_DATABASE_URL = CONFIG.sqlalchemy_database_url or f"sqlite:///{DB_PATH}"
KEY_PATH = (
    (ROOT_DIR / CONFIG.key_path) if CONFIG.key_path else ROOT_DIR / "data" / "key.pem"
)


session_serializer = TimedSerializer(CONFIG.secret, salt="microblogpub.login")
csrf_signer = TimestampSigner(
    os.urandom(16).hex(),
    salt=os.urandom(16).hex(),
)


def generate_csrf_token() -> str:
    return csrf_signer.sign(os.urandom(16).hex()).decode()


def verify_csrf_token(csrf_token: str = Form()) -> None:
    if not csrf_signer.validate(csrf_token, max_age=600):
        raise HTTPException(status_code=403, detail="CSRF error")
    return None
29
app/database.py
Normal file
@@ -0,0 +1,29 @@
import datetime
from typing import Any
from typing import Generator

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import Session
from sqlalchemy.orm import sessionmaker

from app.config import SQLALCHEMY_DATABASE_URL

engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args={"check_same_thread": False}
)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base: Any = declarative_base()


def now() -> datetime.datetime:
    return datetime.datetime.now(datetime.timezone.utc)


def get_db() -> Generator[Session, None, None]:
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
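get_db above is the classic FastAPI generator dependency: the session is yielded to the request handler and closed in the `finally` block even if the handler raises. A minimal sketch of the same pattern against an assumed in-memory SQLite engine (driven manually via contextlib rather than by FastAPI):

```python
import contextlib
from typing import Generator

from sqlalchemy import create_engine, text
from sqlalchemy.orm import Session, sessionmaker

# In-memory stand-in for the app's engine/SessionLocal pair.
engine = create_engine("sqlite://")
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def get_db() -> Generator[Session, None, None]:
    # Same shape as app.database.get_db: yield a session, always close it.
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


# Outside FastAPI, the generator dependency can be driven as a context manager:
with contextlib.contextmanager(get_db)() as db:
    result = db.execute(text("SELECT 1")).scalar()
```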
27
app/highlight.py
Normal file
@@ -0,0 +1,27 @@
from functools import lru_cache

from bs4 import BeautifulSoup  # type: ignore
from pygments import highlight as phighlight  # type: ignore
from pygments.formatters import HtmlFormatter  # type: ignore
from pygments.lexers import guess_lexer  # type: ignore

_FORMATTER = HtmlFormatter(style="vim")

HIGHLIGHT_CSS = _FORMATTER.get_style_defs()


@lru_cache(256)
def highlight(html: str) -> str:
    soup = BeautifulSoup(html, "html5lib")
    for code in soup.find_all("code"):
        if code.parent.name != "pre":
            continue
        lexer = guess_lexer(code.text)
        tag = BeautifulSoup(
            phighlight(code.text, lexer, _FORMATTER), "html5lib"
        ).body.next
        pre = code.parent
        pre.replaceWith(tag)
    out = soup.body
    out.name = "div"
    return str(out)
182
app/httpsig.py
Normal file
@@ -0,0 +1,182 @@
"""Implements HTTP signature for FastAPI requests.

Mastodon instances won't accept requests that are not signed using this scheme.

"""
import base64
import hashlib
import typing
from dataclasses import dataclass
from datetime import datetime
from functools import lru_cache
from typing import Any
from typing import Dict
from typing import Optional

import fastapi
import httpx
from Crypto.Hash import SHA256
from Crypto.Signature import PKCS1_v1_5
from loguru import logger

from app import config
from app.key import Key
from app.key import get_key


def _build_signed_string(
    signed_headers: str, method: str, path: str, headers: Any, body_digest: str | None
) -> str:
    out = []
    for signed_header in signed_headers.split(" "):
        if signed_header == "(request-target)":
            out.append("(request-target): " + method.lower() + " " + path)
        elif signed_header == "digest" and body_digest:
            out.append("digest: " + body_digest)
        else:
            out.append(signed_header + ": " + headers[signed_header])
    return "\n".join(out)


def _parse_sig_header(val: Optional[str]) -> Optional[Dict[str, str]]:
    if not val:
        return None
    out = {}
    for data in val.split(","):
        k, v = data.split("=", 1)
        out[k] = v[1 : len(v) - 1]  # noqa: black conflict
    return out
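_parse_sig_header splits the Signature header into comma-separated `k="v"` pairs and strips the surrounding quotes by slicing off the first and last character of each value. A self-contained sketch of that parsing with a fabricated example header (the function name and signature value here are illustrative, not real credentials):

```python
from typing import Dict, Optional


def parse_sig_header(val: Optional[str]) -> Optional[Dict[str, str]]:
    # Same splitting logic as _parse_sig_header: split on commas, split each
    # pair on the first "=", and strip the quote characters around the value.
    if not val:
        return None
    out = {}
    for data in val.split(","):
        k, v = data.split("=", 1)
        out[k] = v[1 : len(v) - 1]
    return out


hsig = parse_sig_header(
    'keyId="https://example.com/actor#main-key",'
    'algorithm="rsa-sha256",'
    'headers="(request-target) host date digest",'
    'signature="bXktc2lnbmF0dXJl"'
)
```

The resulting `headers` value is what _build_signed_string iterates over to reconstruct the exact string the remote server signed.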
def _verify_h(signed_string, signature, pubkey):
    signer = PKCS1_v1_5.new(pubkey)
    digest = SHA256.new()
    digest.update(signed_string.encode("utf-8"))
    return signer.verify(digest, signature)


def _body_digest(body: bytes) -> str:
    h = hashlib.new("sha256")
    h.update(body)  # type: ignore
    return "SHA-256=" + base64.b64encode(h.digest()).decode("utf-8")


@lru_cache(32)
def _get_public_key(key_id: str) -> Key:
    from app import activitypub as ap

    actor = ap.fetch(key_id)
    if actor["type"] == "Key":
        # The Key is not embedded in the Person
        k = Key(actor["owner"], actor["id"])
        k.load_pub(actor["publicKeyPem"])
    else:
        k = Key(actor["id"], actor["publicKey"]["id"])
        k.load_pub(actor["publicKey"]["publicKeyPem"])

    # Ensure the right key was fetched
    if key_id != k.key_id():
        raise ValueError(
            f"failed to fetch requested key {key_id}: got {actor['publicKey']['id']}"
        )

    return k


@dataclass(frozen=True)
class HTTPSigInfo:
    has_valid_signature: bool
    signed_by_ap_actor_id: str | None = None


async def httpsig_checker(
    request: fastapi.Request,
) -> HTTPSigInfo:
    body = await request.body()

    hsig = _parse_sig_header(request.headers.get("Signature"))
    if not hsig:
        logger.info("No HTTP signature found")
        return HTTPSigInfo(has_valid_signature=False)

    logger.debug(f"hsig={hsig}")
    signed_string = _build_signed_string(
        hsig["headers"],
        request.method,
        request.url.path,
        request.headers,
        _body_digest(body) if body else None,
    )

    try:
        k = _get_public_key(hsig["keyId"])
|
||||||
|
except Exception:
|
||||||
|
logger.exception(f'Failed to fetch HTTP sig key {hsig["keyId"]}')
|
||||||
|
return HTTPSigInfo(has_valid_signature=False)
|
||||||
|
|
||||||
|
httpsig_info = HTTPSigInfo(
|
||||||
|
has_valid_signature=_verify_h(
|
||||||
|
signed_string, base64.b64decode(hsig["signature"]), k.pubkey
|
||||||
|
),
|
||||||
|
signed_by_ap_actor_id=k.owner,
|
||||||
|
)
|
||||||
|
logger.info(f"Valid HTTP signature for {httpsig_info.signed_by_ap_actor_id}")
|
||||||
|
return httpsig_info
|
||||||
|
|
||||||
|
|
||||||
|
async def enforce_httpsig(
|
||||||
|
request: fastapi.Request,
|
||||||
|
httpsig_info: HTTPSigInfo = fastapi.Depends(httpsig_checker),
|
||||||
|
) -> HTTPSigInfo:
|
||||||
|
if not httpsig_info.has_valid_signature:
|
||||||
|
logger.warning(f"Invalid HTTP sig {httpsig_info=}")
|
||||||
|
body = await request.body()
|
||||||
|
logger.info(f"{body=}")
|
||||||
|
raise fastapi.HTTPException(status_code=401, detail="Invalid HTTP sig")
|
||||||
|
|
||||||
|
return httpsig_info
|
||||||
|
|
||||||
|
|
||||||
|
class HTTPXSigAuth(httpx.Auth):
|
||||||
|
def __init__(self, key: Key) -> None:
|
||||||
|
self.key = key
|
||||||
|
|
||||||
|
def auth_flow(
|
||||||
|
self, r: httpx.Request
|
||||||
|
) -> typing.Generator[httpx.Request, httpx.Response, None]:
|
||||||
|
logger.info(f"keyid={self.key.key_id()}")
|
||||||
|
|
||||||
|
bodydigest = None
|
||||||
|
if r.content:
|
||||||
|
bh = hashlib.new("sha256")
|
||||||
|
bh.update(r.content)
|
||||||
|
bodydigest = "SHA-256=" + base64.b64encode(bh.digest()).decode("utf-8")
|
||||||
|
|
||||||
|
date = datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
|
||||||
|
r.headers["Date"] = date
|
||||||
|
if bodydigest:
|
||||||
|
r.headers["Digest"] = bodydigest
|
||||||
|
sigheaders = "(request-target) user-agent host date digest content-type"
|
||||||
|
else:
|
||||||
|
sigheaders = "(request-target) user-agent host date accept"
|
||||||
|
|
||||||
|
to_be_signed = _build_signed_string(
|
||||||
|
sigheaders, r.method, r.url.path, r.headers, bodydigest
|
||||||
|
)
|
||||||
|
if not self.key.privkey:
|
||||||
|
raise ValueError("Should never happen")
|
||||||
|
signer = PKCS1_v1_5.new(self.key.privkey)
|
||||||
|
digest = SHA256.new()
|
||||||
|
digest.update(to_be_signed.encode("utf-8"))
|
||||||
|
sig = base64.b64encode(signer.sign(digest)).decode()
|
||||||
|
|
||||||
|
key_id = self.key.key_id()
|
||||||
|
sig_value = f'keyId="{key_id}",algorithm="rsa-sha256",headers="{sigheaders}",signature="{sig}"' # noqa: E501
|
||||||
|
logger.debug(f"signed request {sig_value=}")
|
||||||
|
r.headers["Signature"] = sig_value
|
||||||
|
yield r
|
||||||
|
|
||||||
|
|
||||||
|
k = Key(config.ID, f"{config.ID}#main-key")
|
||||||
|
k.load(get_key())
|
||||||
|
auth = HTTPXSigAuth(k)
|
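The header parsing and signed-string construction can be exercised in isolation. A minimal sketch using standalone copies of the two helpers (importing the module itself would pull in FastAPI and pycryptodome); the key ID, headers, and signature value are made up:

```python
import base64
import hashlib


def parse_sig_header(val: str) -> dict:
    # Same logic as _parse_sig_header: split 'k="v"' pairs and strip the quotes
    out = {}
    for data in val.split(","):
        k, v = data.split("=", 1)
        out[k] = v[1 : len(v) - 1]
    return out


def build_signed_string(signed_headers, method, path, headers, body_digest):
    # Same logic as _build_signed_string
    out = []
    for signed_header in signed_headers.split(" "):
        if signed_header == "(request-target)":
            out.append("(request-target): " + method.lower() + " " + path)
        elif signed_header == "digest" and body_digest:
            out.append("digest: " + body_digest)
        else:
            out.append(signed_header + ": " + headers[signed_header])
    return "\n".join(out)


# Made-up Signature header from an incoming request
sig_header = (
    'keyId="https://example.com#main-key",algorithm="rsa-sha256",'
    'headers="(request-target) host date digest",signature="c2ln"'
)
hsig = parse_sig_header(sig_header)

body = b'{"type": "Follow"}'
digest = "SHA-256=" + base64.b64encode(hashlib.sha256(body).digest()).decode("utf-8")
signed = build_signed_string(
    hsig["headers"],
    "POST",
    "/inbox",
    {"host": "example.com", "date": "Sun, 01 May 2022 00:00:00 GMT"},
    digest,
)
print(signed)
```

The resulting multi-line string is what gets hashed and signed (or verified against the `signature` value); note that `(request-target)` is a pseudo-header derived from the method and path rather than a real HTTP header.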
84
app/key.py
Normal file

@@ -0,0 +1,84 @@
import base64
from typing import Any

from Crypto.PublicKey import RSA
from Crypto.Util import number

from app.config import KEY_PATH


def key_exists() -> bool:
    return KEY_PATH.exists()


def generate_key() -> None:
    if key_exists():
        raise ValueError(f"Key at {KEY_PATH} already exists")
    k = RSA.generate(2048)
    privkey_pem = k.exportKey("PEM").decode("utf-8")
    KEY_PATH.write_text(privkey_pem)


def get_pubkey_as_pem() -> str:
    text = KEY_PATH.read_text()
    return RSA.import_key(text).public_key().export_key("PEM").decode("utf-8")


def get_key() -> str:
    return KEY_PATH.read_text()


class Key(object):
    DEFAULT_KEY_SIZE = 2048

    def __init__(self, owner: str, id_: str | None = None) -> None:
        self.owner = owner
        self.privkey_pem: str | None = None
        self.pubkey_pem: str | None = None
        self.privkey: RSA.RsaKey | None = None
        self.pubkey: RSA.RsaKey | None = None
        self.id_ = id_

    def load_pub(self, pubkey_pem: str) -> None:
        self.pubkey_pem = pubkey_pem
        self.pubkey = RSA.importKey(pubkey_pem)

    def load(self, privkey_pem: str) -> None:
        self.privkey_pem = privkey_pem
        self.privkey = RSA.importKey(self.privkey_pem)
        self.pubkey_pem = self.privkey.publickey().exportKey("PEM").decode("utf-8")

    def new(self) -> None:
        k = RSA.generate(self.DEFAULT_KEY_SIZE)
        self.privkey_pem = k.exportKey("PEM").decode("utf-8")
        self.pubkey_pem = k.publickey().exportKey("PEM").decode("utf-8")
        self.privkey = k

    def key_id(self) -> str:
        return self.id_ or f"{self.owner}#main-key"

    def to_dict(self) -> dict[str, Any]:
        return {
            "id": self.key_id(),
            "owner": self.owner,
            "publicKeyPem": self.pubkey_pem,
            "type": "Key",
        }

    @classmethod
    def from_dict(cls, data):
        try:
            k = cls(data["owner"], data["id"])
            k.load_pub(data["publicKeyPem"])
        except KeyError:
            raise ValueError(f"bad key data {data!r}")
        return k

    def to_magic_key(self) -> str:
        mod = base64.urlsafe_b64encode(
            number.long_to_bytes(self.privkey.n)  # type: ignore
        ).decode("utf-8")
        pubexp = base64.urlsafe_b64encode(
            number.long_to_bytes(self.privkey.e)  # type: ignore
        ).decode("utf-8")
        return f"data:application/magic-public-key,RSA.{mod}.{pubexp}"
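`to_magic_key` serializes the RSA modulus and public exponent in the old Salmon magic-key format. `Crypto.Util.number.long_to_bytes` is the minimal big-endian integer encoding, which the stdlib can reproduce with `int.to_bytes`; a sketch with toy values (not a real key) just to show the encoding:

```python
import base64


def long_to_bytes(n: int) -> bytes:
    # Stdlib equivalent of Crypto.Util.number.long_to_bytes:
    # minimal big-endian byte encoding of a non-negative integer
    return n.to_bytes(max(1, (n.bit_length() + 7) // 8), "big")


# Toy values standing in for privkey.n / privkey.e -- NOT a real key
n = 0xC0FFEE
e = 65537

mod = base64.urlsafe_b64encode(long_to_bytes(n)).decode("utf-8")
pubexp = base64.urlsafe_b64encode(long_to_bytes(e)).decode("utf-8")
magic_key = f"data:application/magic-public-key,RSA.{mod}.{pubexp}"
print(magic_key)  # data:application/magic-public-key,RSA.wP_u.AQAB
```

The `AQAB` exponent segment is the familiar base64 form of 65537 that also shows up in JWKs.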
40
app/lookup.py
Normal file

@@ -0,0 +1,40 @@
import mf2py  # type: ignore
from sqlalchemy.orm import Session

from app import activitypub as ap
from app import webfinger
from app.actor import Actor
from app.actor import fetch_actor
from app.ap_object import RemoteObject


def lookup(db: Session, query: str) -> Actor | RemoteObject:
    if query.startswith("@"):
        query = webfinger.get_actor_url(query)  # type: ignore # None check below

    if not query:
        raise ap.NotAnObjectError(query)

    try:
        ap_obj = ap.fetch(query)
    except ap.NotAnObjectError as not_an_object_error:
        resp = not_an_object_error.resp
        if not resp:
            raise ap.NotAnObjectError(query)

        alternate_obj = None
        if resp.headers.get("content-type", "").startswith("text/html"):
            for alternate in mf2py.parse(doc=resp.text).get("alternates", []):
                if alternate.get("type") == "application/activity+json":
                    alternate_obj = ap.fetch(alternate["url"])

        if alternate_obj:
            ap_obj = alternate_obj
        else:
            raise

    if ap_obj["type"] in ap.ACTOR_TYPES:
        actor = fetch_actor(db, ap_obj["id"])
        return actor
    else:
        return RemoteObject(ap_obj)
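The HTML fallback in `lookup` hinges on mf2py exposing the page's `rel=alternate` links under an `alternates` key, from which the first ActivityPub representation is selected. A stubbed sketch of that scan (the parsed dict is hand-written, standing in for `mf2py.parse` output, and the URLs are made up):

```python
# Hand-written stand-in for mf2py.parse(...) output on an HTML page that
# advertises an ActivityPub alternate representation
parsed = {
    "alternates": [
        {"type": "text/html", "url": "https://example.com/note/1"},
        {"type": "application/activity+json", "url": "https://example.com/note/1.json"},
    ]
}

ap_url = None
for alternate in parsed.get("alternates", []):
    if alternate.get("type") == "application/activity+json":
        ap_url = alternate["url"]

print(ap_url)  # https://example.com/note/1.json
```

In the real function, `ap.fetch` is then retried against that alternate URL instead of the HTML page.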
558
app/main.py
Normal file

@@ -0,0 +1,558 @@
import base64
import os
import sys
import time
from datetime import datetime
from typing import Any
from typing import Type

import httpx
from dateutil.parser import isoparse
from fastapi import Depends
from fastapi import FastAPI
from fastapi import Request
from fastapi import Response
from fastapi.exceptions import HTTPException
from fastapi.responses import PlainTextResponse
from fastapi.responses import StreamingResponse
from fastapi.staticfiles import StaticFiles
from loguru import logger
from sqlalchemy.orm import Session
from sqlalchemy.orm import joinedload
from starlette.background import BackgroundTask
from starlette.responses import JSONResponse

from app import activitypub as ap
from app import admin
from app import config
from app import httpsig
from app import models
from app import templates
from app.actor import LOCAL_ACTOR
from app.actor import get_actors_metadata
from app.boxes import public_outbox_objects_count
from app.boxes import save_to_inbox
from app.config import BASE_URL
from app.config import DEBUG
from app.config import DOMAIN
from app.config import ID
from app.config import USER_AGENT
from app.config import USERNAME
from app.config import is_activitypub_requested
from app.database import get_db
from app.templates import is_current_user_admin

# TODO(ts):
#
# Next:
# - show likes/announces counter for outbox activities
# - update actor support
# - replies support
# - file upload + place/exif extraction (or not) support
# - custom emoji support
# - hash config/profile to detect when to send Update actor
#
# - [ ] block support
# - [ ] make the media proxy authenticated
# - [ ] prevent SSRF (urlutils from little-boxes)
# - [ ] Dockerization
# - [ ] Webmentions
# - [ ] custom emoji
# - [ ] poll/questions support
# - [ ] cleanup tasks
# - notifs:
# - MENTIONED
# - LIKED
# - ANNOUNCED
# - FOLLOWED
# - UNFOLLOWED
# - POLL_ENDED

app = FastAPI(docs_url=None, redoc_url=None)
app.mount("/static", StaticFiles(directory="app/static"), name="static")
app.include_router(admin.router, prefix="/admin")
app.include_router(admin.unauthenticated_router, prefix="/admin")

logger.configure(extra={"request_id": "no_req_id"})
logger.remove()
logger_format = (
    "<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | "
    "<level>{level: <8}</level> | "
    "<cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> | "
    "{extra[request_id]} - <level>{message}</level>"
)
logger.add(sys.stdout, format=logger_format)


@app.middleware("http")
async def request_middleware(request, call_next):
    start_time = time.perf_counter()
    request_id = os.urandom(8).hex()
    with logger.contextualize(request_id=request_id):
        logger.info(
            f"{request.client.host}:{request.client.port} - "
            f"{request.method} {request.url}"
        )
        try:
            response = await call_next(request)
            response.headers["X-Request-ID"] = request_id
            response.headers["Server"] = "microblogpub"
            elapsed_time = time.perf_counter() - start_time
            logger.info(f"status_code={response.status_code} {elapsed_time=:.2f}s")
            return response
        except Exception:
            logger.exception("Request failed")
            raise


@app.middleware("http")
async def add_security_headers(request: Request, call_next):
    response = await call_next(request)
    response.headers["referrer-policy"] = "no-referrer, strict-origin-when-cross-origin"
    response.headers["x-content-type-options"] = "nosniff"
    response.headers["x-xss-protection"] = "1; mode=block"
    response.headers["x-frame-options"] = "SAMEORIGIN"
    # TODO(ts): disallow inline CSS?
    response.headers["content-security-policy"] = (
        "default-src 'self'; " + "style-src 'self' 'unsafe-inline';"
    )
    if not DEBUG:
        response.headers[
            "strict-transport-security"
        ] = "max-age=63072000; includeSubdomains"
    return response


DEFAULT_CTX = COLLECTION_CTX = [
    "https://www.w3.org/ns/activitystreams",
    "https://w3id.org/security/v1",
    {
        # AS ext
        "Hashtag": "as:Hashtag",
        "sensitive": "as:sensitive",
        "manuallyApprovesFollowers": "as:manuallyApprovesFollowers",
        # toot
        "toot": "http://joinmastodon.org/ns#",
        # "featured": "toot:featured",
        # schema
        "schema": "http://schema.org#",
        "PropertyValue": "schema:PropertyValue",
        "value": "schema:value",
    },
]


class ActivityPubResponse(JSONResponse):
    media_type = "application/activity+json"


@app.get("/")
def index(
    request: Request,
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> templates.TemplateResponse | ActivityPubResponse:
    if is_activitypub_requested(request):
        return ActivityPubResponse(LOCAL_ACTOR.ap_actor)

    outbox_objects = (
        db.query(models.OutboxObject)
        .filter(
            models.OutboxObject.visibility == ap.VisibilityEnum.PUBLIC,
            models.OutboxObject.is_deleted.is_(False),
            models.OutboxObject.is_hidden_from_homepage.is_(False),
        )
        .order_by(models.OutboxObject.ap_published_at.desc())
        .limit(20)
        .all()
    )

    return templates.render_template(
        db,
        request,
        "index.html",
        {"request": request, "objects": outbox_objects},
    )
def _build_followx_collection(
    db: Session,
    model_cls: Type[models.Following | models.Follower],
    path: str,
    page: bool | None,
    next_cursor: str | None,
) -> ap.RawObject:
    total_items = db.query(model_cls).count()

    if not page and not next_cursor:
        return {
            "@context": ap.AS_CTX,
            "id": ID + path,
            "first": ID + path + "?page=true",
            "type": "OrderedCollection",
            "totalItems": total_items,
        }

    q = db.query(model_cls).order_by(model_cls.created_at.desc())  # type: ignore
    if next_cursor:
        q = q.filter(model_cls.created_at < _decode_cursor(next_cursor))  # type: ignore
    q = q.limit(20)

    items = [followx for followx in q.all()]
    next_cursor = None
    if (
        items
        and db.query(model_cls)
        .filter(model_cls.created_at < items[-1].created_at)
        .count()
        > 0
    ):
        next_cursor = _encode_cursor(items[-1].created_at)

    collection_page = {
        "@context": ap.AS_CTX,
        "id": (
            ID + path + "?page=true"
            if not next_cursor
            else ID + path + f"?next_cursor={next_cursor}"
        ),
        "partOf": ID + path,
        "type": "OrderedCollectionPage",
        "orderedItems": [item.ap_actor_id for item in items],
    }
    if next_cursor:
        collection_page["next"] = ID + path + f"?next_cursor={next_cursor}"

    return collection_page


def _encode_cursor(val: datetime) -> str:
    return base64.urlsafe_b64encode(val.isoformat().encode()).decode()


def _decode_cursor(cursor: str) -> datetime:
    return isoparse(base64.urlsafe_b64decode(cursor).decode())
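The cursor helpers encode a `created_at` timestamp as an opaque, URL-safe token that `_build_followx_collection` can embed in `next` links. A standalone round-trip sketch, substituting stdlib `datetime.fromisoformat` for `dateutil.isoparse` (which is sufficient for round-tripping `isoformat()` output):

```python
import base64
from datetime import datetime, timezone


def encode_cursor(val: datetime) -> str:
    # Same scheme as _encode_cursor: URL-safe base64 of the ISO timestamp
    return base64.urlsafe_b64encode(val.isoformat().encode()).decode()


def decode_cursor(cursor: str) -> datetime:
    # The app uses dateutil.isoparse; stdlib fromisoformat is enough to
    # round-trip datetime.isoformat() output
    return datetime.fromisoformat(base64.urlsafe_b64decode(cursor).decode())


created_at = datetime(2022, 5, 1, 12, 30, tzinfo=timezone.utc)
cursor = encode_cursor(created_at)
print(cursor)
assert "+" not in cursor and "/" not in cursor  # safe to embed in a query string
assert decode_cursor(cursor) == created_at
```

Using the URL-safe alphabet matters here because the cursor is passed verbatim as the `next_cursor` query parameter.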
@app.get("/followers")
def followers(
    request: Request,
    page: bool | None = None,
    next_cursor: str | None = None,
    prev_cursor: str | None = None,
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse | templates.TemplateResponse:
    if is_activitypub_requested(request):
        return ActivityPubResponse(
            _build_followx_collection(
                db=db,
                model_cls=models.Follower,
                path="/followers",
                page=page,
                next_cursor=next_cursor,
            )
        )

    followers = (
        db.query(models.Follower)
        .options(joinedload(models.Follower.actor))
        .order_by(models.Follower.created_at.desc())
        .limit(20)
        .all()
    )

    # TODO: support next_cursor/prev_cursor
    actors_metadata = {}
    if is_current_user_admin(request):
        actors_metadata = get_actors_metadata(
            db,
            [f.actor for f in followers],
        )

    return templates.render_template(
        db,
        request,
        "followers.html",
        {
            "followers": followers,
            "actors_metadata": actors_metadata,
        },
    )


@app.get("/following")
def following(
    request: Request,
    page: bool | None = None,
    next_cursor: str | None = None,
    prev_cursor: str | None = None,
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse | templates.TemplateResponse:
    if is_activitypub_requested(request):
        return ActivityPubResponse(
            _build_followx_collection(
                db=db,
                model_cls=models.Following,
                path="/following",
                page=page,
                next_cursor=next_cursor,
            )
        )

    q = (
        db.query(models.Following)
        .options(joinedload(models.Following.actor))
        .order_by(models.Following.created_at.desc())
        .limit(20)
    )
    following = q.all()

    # TODO: support next_cursor/prev_cursor
    actors_metadata = {}
    if is_current_user_admin(request):
        actors_metadata = get_actors_metadata(
            db,
            [f.actor for f in following],
        )

    return templates.render_template(
        db,
        request,
        "following.html",
        {
            "following": following,
            "actors_metadata": actors_metadata,
        },
    )


@app.get("/outbox")
def outbox(
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse:
    outbox_objects = (
        db.query(models.OutboxObject)
        .filter(
            models.OutboxObject.visibility == ap.VisibilityEnum.PUBLIC,
            models.OutboxObject.is_deleted.is_(False),
        )
        .order_by(models.OutboxObject.ap_published_at.desc())
        .limit(20)
        .all()
    )
    return ActivityPubResponse(
        {
            "@context": DEFAULT_CTX,
            "id": f"{ID}/outbox",
            "type": "OrderedCollection",
            "totalItems": len(outbox_objects),
            "orderedItems": [
                ap.remove_context(ap.wrap_object_if_needed(a.ap_object))
                for a in outbox_objects
            ],
        }
    )
@app.get("/o/{public_id}")
def outbox_by_public_id(
    public_id: str,
    request: Request,
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse | templates.TemplateResponse:
    # TODO: ACL?
    maybe_object = (
        db.query(models.OutboxObject)
        .filter(
            models.OutboxObject.public_id == public_id,
            # models.OutboxObject.is_deleted.is_(False),
        )
        .one_or_none()
    )
    if not maybe_object:
        raise HTTPException(status_code=404)

    if is_activitypub_requested(request):
        return ActivityPubResponse(maybe_object.ap_object)

    return templates.render_template(
        db,
        request,
        "object.html",
        {
            "outbox_object": maybe_object,
        },
    )


@app.get("/o/{public_id}/activity")
def outbox_activity_by_public_id(
    public_id: str,
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse:
    # TODO: ACL?
    maybe_object = (
        db.query(models.OutboxObject)
        .filter(models.OutboxObject.public_id == public_id)
        .one_or_none()
    )
    if not maybe_object:
        raise HTTPException(status_code=404)

    return ActivityPubResponse(ap.wrap_object(maybe_object.ap_object))


@app.get("/t/{tag}")
def tag_by_name(
    tag: str,
    request: Request,
    db: Session = Depends(get_db),
    _: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse | templates.TemplateResponse:
    # TODO(ts): implement HTML version
    # if is_activitypub_requested(request):
    return ActivityPubResponse(
        {
            "@context": ap.AS_CTX,
            "id": BASE_URL + f"/t/{tag}",
            "type": "OrderedCollection",
            "totalItems": 0,
            "orderedItems": [],
        }
    )


@app.post("/inbox")
async def inbox(
    request: Request,
    db: Session = Depends(get_db),
    httpsig_info: httpsig.HTTPSigInfo = Depends(httpsig.enforce_httpsig),
) -> Response:
    logger.info(f"headers={request.headers}")
    payload = await request.json()
    logger.info(f"{payload=}")
    save_to_inbox(db, payload)
    return Response(status_code=204)
@app.get("/.well-known/webfinger")
def wellknown_webfinger(resource: str) -> JSONResponse:
    """Exposes/serves WebFinger data."""
    if resource not in [f"acct:{USERNAME}@{DOMAIN}", ID]:
        raise HTTPException(status_code=404)

    out = {
        "subject": f"acct:{USERNAME}@{DOMAIN}",
        "aliases": [ID],
        "links": [
            {
                "rel": "http://webfinger.net/rel/profile-page",
                "type": "text/html",
                "href": ID,
            },
            {"rel": "self", "type": "application/activity+json", "href": ID},
            {
                "rel": "http://ostatus.org/schema/1.0/subscribe",
                "template": DOMAIN + "/authorize_interaction?uri={uri}",
            },
        ],
    }

    return JSONResponse(out, media_type="application/jrd+json; charset=utf-8")
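The 404 guard in `wellknown_webfinger` means only the single local actor resolves, either by `acct:` URI or by profile URL. A standalone sketch of that matching, with made-up values standing in for the real `USERNAME`/`DOMAIN`/`ID` config:

```python
# Made-up values standing in for the USERNAME / DOMAIN / ID config
USERNAME, DOMAIN = "alice", "example.com"
ID = f"https://{DOMAIN}"


def webfinger_subject(resource):
    # Mirrors the guard above: only the local actor resolves,
    # either by acct: URI or by profile URL; anything else is a 404
    if resource not in [f"acct:{USERNAME}@{DOMAIN}", ID]:
        return None
    return f"acct:{USERNAME}@{DOMAIN}"


print(webfinger_subject("acct:alice@example.com"))  # acct:alice@example.com
print(webfinger_subject(ID))  # acct:alice@example.com
print(webfinger_subject("acct:bob@example.com"))  # None
```

Accepting the profile URL as a `resource` lets remote servers reverse-resolve the actor IRI back to an `acct:` handle.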
@app.get("/.well-known/nodeinfo")
async def well_known_nodeinfo() -> dict[str, Any]:
    return {
        "links": [
            {
                "rel": "http://nodeinfo.diaspora.software/ns/schema/2.1",
                "href": f"{BASE_URL}/nodeinfo",
            }
        ]
    }


@app.get("/nodeinfo")
def nodeinfo(
    db: Session = Depends(get_db),
):
    local_posts = public_outbox_objects_count(db)
    return JSONResponse(
        {
            "version": "2.1",
            "software": {
                "name": "microblogpub",
                "version": config.VERSION,
                "repository": "https://github.com/tsileo/microblog.pub",
            },
            "protocols": ["activitypub"],
            "services": {"inbound": [], "outbound": []},
            "openRegistrations": False,
            "usage": {"users": {"total": 1}, "localPosts": local_posts},
            "metadata": {
                "nodeName": LOCAL_ACTOR.handle,
            },
        },
        media_type=(
            "application/json; "
            "profile=http://nodeinfo.diaspora.software/ns/schema/2.1#"
        ),
    )
proxy_client = httpx.AsyncClient()


@app.get("/proxy/media/{encoded_url}")
async def serve_proxy_media(request: Request, encoded_url: str) -> StreamingResponse:
    # Decode the base64-encoded URL
    url = base64.urlsafe_b64decode(encoded_url).decode()
    # Request the URL (and filter request headers)
    proxy_req = proxy_client.build_request(
        request.method,
        url,
        headers=[
            (k, v)
            for (k, v) in request.headers.raw
            if k.lower()
            not in [b"host", b"cookie", b"x-forwarded-for", b"x-real-ip", b"user-agent"]
        ]
        + [(b"user-agent", USER_AGENT.encode())],
    )
    proxy_resp = await proxy_client.send(proxy_req, stream=True)
    # Filter the headers
    proxy_resp_headers = [
        (k, v)
        for (k, v) in proxy_resp.headers.items()
        if k.lower()
        in [
            "content-length",
            "content-type",
            "content-range",
            "accept-ranges",
            "etag",
            "cache-control",
            "expires",
            "date",
            "last-modified",
        ]
    ]
    return StreamingResponse(
        proxy_resp.aiter_raw(),
        status_code=proxy_resp.status_code,
        headers=dict(proxy_resp_headers),
        background=BackgroundTask(proxy_resp.aclose),
    )
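Clients address the media proxy by URL-safe-base64-encoding the remote media URL into the path segment, which `serve_proxy_media` reverses before fetching. A sketch of building such a path (the URL is made up):

```python
import base64

# Made-up remote attachment URL
url = "https://remote.example/media/photo.png"
encoded = base64.urlsafe_b64encode(url.encode()).decode()
proxied_path = f"/proxy/media/{encoded}"
print(proxied_path)

# serve_proxy_media reverses the encoding before fetching
assert base64.urlsafe_b64decode(encoded).decode() == url
```

The URL-safe alphabet keeps the encoded URL free of `/`, so the whole thing fits in a single path parameter.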
@app.get("/robots.txt", response_class=PlainTextResponse)
async def robots_file():
    return """User-agent: *
Disallow: /followers
Disallow: /following
Disallow: /admin"""
288
app/models.py
Normal file

@@ -0,0 +1,288 @@
import enum
from typing import Any
from typing import Optional

from sqlalchemy import JSON
from sqlalchemy import Boolean
from sqlalchemy import Column
from sqlalchemy import DateTime
from sqlalchemy import Enum
from sqlalchemy import ForeignKey
from sqlalchemy import Integer
from sqlalchemy import String
from sqlalchemy import UniqueConstraint
from sqlalchemy.orm import Mapped
from sqlalchemy.orm import relationship

from app import activitypub as ap
from app.actor import LOCAL_ACTOR
from app.actor import Actor as BaseActor
from app.ap_object import Object as BaseObject
from app.database import Base
from app.database import now


class Actor(Base, BaseActor):
    __tablename__ = "actors"

    id = Column(Integer, primary_key=True, index=True)
    created_at = Column(DateTime(timezone=True), nullable=False, default=now)
    updated_at = Column(DateTime(timezone=True), nullable=False, default=now)

    ap_id = Column(String, unique=True, nullable=False, index=True)
    ap_actor: Mapped[ap.RawObject] = Column(JSON, nullable=False)
    ap_type = Column(String, nullable=False)

    handle = Column(String, nullable=True, index=True)

    @property
    def is_from_db(self) -> bool:
        return True


class InboxObject(Base, BaseObject):
    __tablename__ = "inbox"

    id = Column(Integer, primary_key=True, index=True)
    created_at = Column(DateTime(timezone=True), nullable=False, default=now)
    updated_at = Column(DateTime(timezone=True), nullable=False, default=now)

    actor_id = Column(Integer, ForeignKey("actors.id"), nullable=False)
    actor: Mapped[Actor] = relationship(Actor, uselist=False)

    server = Column(String, nullable=False)

    is_hidden_from_stream = Column(Boolean, nullable=False, default=False)

    ap_actor_id = Column(String, nullable=False)
    ap_type = Column(String, nullable=False)
    ap_id = Column(String, nullable=False, unique=True, index=True)
    ap_context = Column(String, nullable=True)
    ap_published_at = Column(DateTime(timezone=True), nullable=False)
    ap_object: Mapped[ap.RawObject] = Column(JSON, nullable=False)

    activity_object_ap_id = Column(String, nullable=True)

    visibility = Column(Enum(ap.VisibilityEnum), nullable=False)

    # Used for Like, Announce and Undo activities
    relates_to_inbox_object_id = Column(
        Integer,
        ForeignKey("inbox.id"),
        nullable=True,
    )
    relates_to_inbox_object: Mapped[Optional["InboxObject"]] = relationship(
        "InboxObject",
        foreign_keys=relates_to_inbox_object_id,
        remote_side=id,
        uselist=False,
    )
    relates_to_outbox_object_id = Column(
        Integer,
        ForeignKey("outbox.id"),
        nullable=True,
    )
    relates_to_outbox_object: Mapped[Optional["OutboxObject"]] = relationship(
        "OutboxObject",
        foreign_keys=[relates_to_outbox_object_id],
        uselist=False,
    )

    undone_by_inbox_object_id = Column(Integer, ForeignKey("inbox.id"), nullable=True)

    # Link the outbox AP ID to allow undo without any extra query
    liked_via_outbox_object_ap_id = Column(String, nullable=True)
    announced_via_outbox_object_ap_id = Column(String, nullable=True)

    is_bookmarked = Column(Boolean, nullable=False, default=False)

    # FIXME(ts): do we need this?
    has_replies = Column(Boolean, nullable=False, default=False)

    og_meta: Mapped[list[dict[str, Any]] | None] = Column(JSON, nullable=True)


class OutboxObject(Base, BaseObject):
    __tablename__ = "outbox"

    id = Column(Integer, primary_key=True, index=True)
    created_at = Column(DateTime(timezone=True), nullable=False, default=now)
    updated_at = Column(DateTime(timezone=True), nullable=False, default=now)

    is_hidden_from_homepage = Column(Boolean, nullable=False, default=False)

    public_id = Column(String, nullable=False, index=True)

    ap_type = Column(String, nullable=False)
    ap_id = Column(String, nullable=False, unique=True, index=True)
    ap_context = Column(String, nullable=True)
    ap_object: Mapped[ap.RawObject] = Column(JSON, nullable=False)

    activity_object_ap_id = Column(String, nullable=True)
|
||||||
|
|
||||||
|
# Source content for activities (like Notes)
|
||||||
|
source = Column(String, nullable=True)
|
||||||
|
|
||||||
|
ap_published_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
visibility = Column(Enum(ap.VisibilityEnum), nullable=False)
|
||||||
|
|
||||||
|
likes_count = Column(Integer, nullable=False, default=0)
|
||||||
|
announces_count = Column(Integer, nullable=False, default=0)
|
||||||
|
replies_count = Column(Integer, nullable=False, default=0)
|
||||||
|
|
||||||
|
webmentions = Column(JSON, nullable=True)
|
||||||
|
|
||||||
|
og_meta: Mapped[list[dict[str, Any]] | None] = Column(JSON, nullable=True)
|
||||||
|
|
||||||
|
# Never actually delete from the outbox
|
||||||
|
is_deleted = Column(Boolean, nullable=False, default=False)
|
||||||
|
|
||||||
|
# Used for Like, Announce and Undo activities
|
||||||
|
relates_to_inbox_object_id = Column(
|
||||||
|
Integer,
|
||||||
|
ForeignKey("inbox.id"),
|
||||||
|
nullable=True,
|
||||||
|
)
|
||||||
|
relates_to_inbox_object: Mapped[Optional["InboxObject"]] = relationship(
|
||||||
|
"InboxObject",
|
||||||
|
foreign_keys=[relates_to_inbox_object_id],
|
||||||
|
uselist=False,
|
||||||
|
)
|
||||||
|
relates_to_outbox_object_id = Column(
|
||||||
|
Integer,
|
||||||
|
ForeignKey("outbox.id"),
|
||||||
|
nullable=True,
|
||||||
|
)
|
||||||
|
relates_to_outbox_object: Mapped[Optional["OutboxObject"]] = relationship(
|
||||||
|
"OutboxObject",
|
||||||
|
foreign_keys=[relates_to_outbox_object_id],
|
||||||
|
remote_side=id,
|
||||||
|
uselist=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
undone_by_outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=True)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def actor(self) -> BaseActor:
|
||||||
|
return LOCAL_ACTOR
|
||||||
|
|
||||||
|
|
||||||
|
class Follower(Base):
|
||||||
|
__tablename__ = "followers"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True, index=True)
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
updated_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
|
||||||
|
actor_id = Column(Integer, ForeignKey("actors.id"), nullable=False, unique=True)
|
||||||
|
actor = relationship(Actor, uselist=False)
|
||||||
|
|
||||||
|
inbox_object_id = Column(Integer, ForeignKey("inbox.id"), nullable=False)
|
||||||
|
inbox_object = relationship(InboxObject, uselist=False)
|
||||||
|
|
||||||
|
ap_actor_id = Column(String, nullable=False, unique=True)
|
||||||
|
|
||||||
|
|
||||||
|
class Following(Base):
|
||||||
|
__tablename__ = "following"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True, index=True)
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
updated_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
|
||||||
|
actor_id = Column(Integer, ForeignKey("actors.id"), nullable=False, unique=True)
|
||||||
|
actor = relationship(Actor, uselist=False)
|
||||||
|
|
||||||
|
outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=False)
|
||||||
|
outbox_object = relationship(OutboxObject, uselist=False)
|
||||||
|
|
||||||
|
ap_actor_id = Column(String, nullable=False, unique=True)
|
||||||
|
|
||||||
|
|
||||||
|
@enum.unique
|
||||||
|
class NotificationType(str, enum.Enum):
|
||||||
|
NEW_FOLLOWER = "new_follower"
|
||||||
|
UNFOLLOW = "unfollow"
|
||||||
|
LIKE = "like"
|
||||||
|
UNDO_LIKE = "undo_like"
|
||||||
|
ANNOUNCE = "announce"
|
||||||
|
UNDO_ANNOUNCE = "undo_announce"
|
||||||
|
|
||||||
|
# TODO:
|
||||||
|
MENTION = "mention"
|
||||||
|
|
||||||
|
|
||||||
|
class Notification(Base):
|
||||||
|
__tablename__ = "notifications"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True, index=True)
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
notification_type = Column(Enum(NotificationType), nullable=True)
|
||||||
|
is_new = Column(Boolean, nullable=False, default=True)
|
||||||
|
|
||||||
|
actor_id = Column(Integer, ForeignKey("actors.id"), nullable=True)
|
||||||
|
actor = relationship(Actor, uselist=False)
|
||||||
|
|
||||||
|
outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=True)
|
||||||
|
outbox_object = relationship(OutboxObject, uselist=False)
|
||||||
|
|
||||||
|
inbox_object_id = Column(Integer, ForeignKey("inbox.id"), nullable=True)
|
||||||
|
inbox_object = relationship(InboxObject, uselist=False)
|
||||||
|
|
||||||
|
|
||||||
|
class OutgoingActivity(Base):
|
||||||
|
__tablename__ = "outgoing_activities"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True, index=True)
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False, default=now)
|
||||||
|
|
||||||
|
recipient = Column(String, nullable=False)
|
||||||
|
outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=False)
|
||||||
|
outbox_object = relationship(OutboxObject, uselist=False)
|
||||||
|
|
||||||
|
tries = Column(Integer, nullable=False, default=0)
|
||||||
|
next_try = Column(DateTime(timezone=True), nullable=True, default=now)
|
||||||
|
|
||||||
|
last_try = Column(DateTime(timezone=True), nullable=True)
|
||||||
|
last_status_code = Column(Integer, nullable=True)
|
||||||
|
last_response = Column(String, nullable=True)
|
||||||
|
|
||||||
|
is_sent = Column(Boolean, nullable=False, default=False)
|
||||||
|
is_errored = Column(Boolean, nullable=False, default=False)
|
||||||
|
error = Column(String, nullable=True)
|
||||||
|
|
||||||
|
|
||||||
|
class TaggedOutboxObject(Base):
|
||||||
|
__tablename__ = "tagged_outbox_objects"
|
||||||
|
__table_args__ = (
|
||||||
|
UniqueConstraint("outbox_object_id", "tag", name="uix_tagged_object"),
|
||||||
|
)
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True, index=True)
|
||||||
|
|
||||||
|
outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=False)
|
||||||
|
outbox_object = relationship(OutboxObject, uselist=False)
|
||||||
|
|
||||||
|
tag = Column(String, nullable=False, index=True)
|
||||||
|
|
||||||
|
|
||||||
|
"""
|
||||||
|
class Upload(Base):
|
||||||
|
__tablename__ = "upload"
|
||||||
|
|
||||||
|
filename = Column(String, nullable=False)
|
||||||
|
filehash = Column(String, nullable=False)
|
||||||
|
filesize = Column(Integer, nullable=False)
|
||||||
|
|
||||||
|
|
||||||
|
class OutboxObjectAttachment(Base):
|
||||||
|
__tablename__ = "outbox_object_attachment"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True, index=True)
|
||||||
|
|
||||||
|
outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=False)
|
||||||
|
outbox_object = relationship(OutboxObject, uselist=False)
|
||||||
|
|
||||||
|
upload_id = Column(Integer, ForeignKey("upload.id"))
|
||||||
|
upload = relationship(Upload, uselist=False)
|
||||||
|
"""
|
90
app/opengraph.py
Normal file
@@ -0,0 +1,90 @@
import mimetypes
import re
from urllib.parse import urlparse

import httpx
from bs4 import BeautifulSoup  # type: ignore
from pydantic import BaseModel

from app import activitypub as ap
from app import config
from app.urlutils import is_url_valid


class OpenGraphMeta(BaseModel):
    url: str
    title: str
    image: str
    description: str
    site_name: str


def _scrap_og_meta(html: str) -> OpenGraphMeta | None:
    soup = BeautifulSoup(html, "html5lib")
    ogs = {
        og.attrs["property"]: og.attrs.get("content")
        for og in soup.html.head.findAll(property=re.compile(r"^og"))
    }
    raw = {}
    for field in OpenGraphMeta.__fields__.keys():
        og_field = f"og:{field}"
        if not ogs.get(og_field):
            return None

        raw[field] = ogs[og_field]

    return OpenGraphMeta.parse_obj(raw)


def _urls_from_note(note: ap.RawObject) -> set[str]:
    note_host = urlparse(ap.get_id(note["id"]) or "").netloc

    urls = set()
    if "content" in note:
        soup = BeautifulSoup(note["content"], "html5lib")
        for link in soup.find_all("a"):
            h = link.get("href")
            ph = urlparse(h)
            mimetype, _ = mimetypes.guess_type(h)
            if (
                ph.scheme in {"http", "https"}
                and ph.netloc != note_host
                and is_url_valid(h)
                and (
                    not mimetype
                    or mimetype.split("/")[0] in ["image", "video", "audio"]
                )
            ):
                urls.add(h)

    return urls


def _og_meta_from_url(url: str) -> OpenGraphMeta | None:
    resp = httpx.get(
        url,
        headers={
            "User-Agent": config.USER_AGENT,
        },
        follow_redirects=True,
    )
    resp.raise_for_status()

    if not (ct := resp.headers.get("content-type")) or not ct.startswith("text/html"):
        return None

    return _scrap_og_meta(resp.text)


def og_meta_from_note(note: ap.RawObject) -> list[OpenGraphMeta]:
    og_meta = []
    urls = _urls_from_note(note)
    for url in urls:
        try:
            maybe_og_meta = _og_meta_from_url(url)
            if maybe_og_meta:
                og_meta.append(maybe_og_meta)
        except httpx.HTTPError:
            pass

    return og_meta
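The `_scrap_og_meta` helper above depends on BeautifulSoup and pydantic; the same property-scraping idea can be sketched with only the standard library. The `OGMetaParser` name and the sample HTML are made up for this illustration:

```python
from html.parser import HTMLParser


class OGMetaParser(HTMLParser):
    """Collect <meta property="og:*" content="..."> pairs from an HTML page."""

    def __init__(self) -> None:
        super().__init__()
        self.ogs: dict[str, str | None] = {}

    def handle_starttag(self, tag, attrs) -> None:
        if tag != "meta":
            return
        d = dict(attrs)
        prop = d.get("property") or ""
        if prop.startswith("og:"):
            self.ogs[prop] = d.get("content")


html = (
    '<html><head>'
    '<meta property="og:title" content="Example post">'
    '<meta property="og:url" content="https://example.com/post">'
    '</head></html>'
)
parser = OGMetaParser()
parser.feed(html)
# parser.ogs now maps "og:title" and "og:url" to their content values
```

As in `_scrap_og_meta`, a real implementation would then validate that every required field is present before building the result object.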
138
app/process_outgoing_activities.py
Normal file
@@ -0,0 +1,138 @@
import email
import time
import traceback
from datetime import datetime
from datetime import timedelta

import httpx
from loguru import logger
from sqlalchemy.orm import Session

from app import activitypub as ap
from app import models
from app.database import SessionLocal
from app.database import now

_MAX_RETRIES = 16


def new_outgoing_activity(
    db: Session,
    recipient: str,
    outbox_object_id: int,
) -> models.OutgoingActivity:
    outgoing_activity = models.OutgoingActivity(
        recipient=recipient,
        outbox_object_id=outbox_object_id,
    )

    db.add(outgoing_activity)
    db.commit()
    db.refresh(outgoing_activity)
    return outgoing_activity


def _parse_retry_after(retry_after: str) -> datetime | None:
    try:
        # Retry-After: 120
        seconds = int(retry_after)
    except ValueError:
        # Retry-After: Wed, 21 Oct 2015 07:28:00 GMT
        dt_tuple = email.utils.parsedate_tz(retry_after)
        if dt_tuple is None:
            return None

        seconds = int(email.utils.mktime_tz(dt_tuple) - time.time())

    return now() + timedelta(seconds=seconds)


def _exp_backoff(tries: int) -> datetime:
    seconds = 2 * (2 ** (tries - 1))
    return now() + timedelta(seconds=seconds)


def _set_next_try(
    outgoing_activity: models.OutgoingActivity,
    next_try: datetime | None = None,
) -> None:
    if not outgoing_activity.tries:
        raise ValueError("Should never happen")

    if outgoing_activity.tries == _MAX_RETRIES:
        outgoing_activity.is_errored = True
        outgoing_activity.next_try = None
    else:
        outgoing_activity.next_try = next_try or _exp_backoff(outgoing_activity.tries)


def process_next_outgoing_activity(db: Session) -> bool:
    q = (
        db.query(models.OutgoingActivity)
        .filter(
            models.OutgoingActivity.next_try <= now(),
            models.OutgoingActivity.is_errored.is_(False),
            models.OutgoingActivity.is_sent.is_(False),
        )
        .order_by(models.OutgoingActivity.next_try)
    )
    q_count = q.count()
    logger.info(f"{q_count} outgoing activities ready to process")
    if not q_count:
        logger.info("No activities to process")
        return False

    next_activity = q.limit(1).one()

    next_activity.tries = next_activity.tries + 1
    next_activity.last_try = now()

    payload = ap.wrap_object_if_needed(next_activity.outbox_object.ap_object)
    logger.info(f"{payload=}")
    try:
        resp = ap.post(next_activity.recipient, payload)
    except httpx.HTTPStatusError as http_error:
        logger.exception("Failed")
        next_activity.last_status_code = http_error.response.status_code
        next_activity.last_response = http_error.response.text
        next_activity.error = traceback.format_exc()

        if http_error.response.status_code in [429, 503]:
            retry_after: datetime | None = None
            if retry_after_value := http_error.response.headers.get("Retry-After"):
                retry_after = _parse_retry_after(retry_after_value)
            _set_next_try(next_activity, retry_after)
        elif 400 <= http_error.response.status_code < 500:
            logger.info(f"status_code={http_error.response.status_code} not retrying")
            next_activity.is_errored = True
            next_activity.next_try = None
        else:
            _set_next_try(next_activity)
    except Exception:
        logger.exception("Failed")
        next_activity.error = traceback.format_exc()
        _set_next_try(next_activity)
    else:
        logger.info("Success")
        next_activity.is_sent = True
        next_activity.last_status_code = resp.status_code
        next_activity.last_response = resp.text

    db.commit()
    return True


def loop() -> None:
    db = SessionLocal()
    while 1:
        try:
            process_next_outgoing_activity(db)
        except Exception:
            logger.exception("Failed to process next outgoing activity")
            raise

        time.sleep(1)


if __name__ == "__main__":
    loop()
81
app/source.py
Normal file
@@ -0,0 +1,81 @@
import re

from markdown import markdown
from sqlalchemy.orm import Session

from app import models
from app import webfinger
from app.actor import fetch_actor
from app.config import BASE_URL


def _set_a_attrs(attrs, new=False):
    attrs[(None, "target")] = "_blank"
    attrs[(None, "class")] = "external"
    attrs[(None, "rel")] = "noopener"
    attrs[(None, "title")] = attrs[(None, "href")]
    return attrs


_HASHTAG_REGEX = re.compile(r"(#[\d\w]+)")
_MENTION_REGEX = re.compile(r"@[\d\w_.+-]+@[\d\w-]+\.[\d\w\-.]+")


def _hashtagify(db: Session, content: str) -> tuple[str, list[dict[str, str]]]:
    tags = []
    hashtags = re.findall(_HASHTAG_REGEX, content)
    hashtags = sorted(set(hashtags), reverse=True)  # unique tags, longest first
    for hashtag in hashtags:
        tag = hashtag[1:]
        link = f'<a href="{BASE_URL}/t/{tag}" class="mention hashtag" rel="tag">#<span>{tag}</span></a>'  # noqa: E501
        tags.append(dict(href=f"{BASE_URL}/t/{tag}", name=hashtag, type="Hashtag"))
        content = content.replace(hashtag, link)
    return content, tags


def _mentionify(
    db: Session, content: str, hide_domain: bool = False
) -> tuple[str, list[dict[str, str]]]:
    tags = []
    for mention in re.findall(_MENTION_REGEX, content):
        _, username, domain = mention.split("@")
        actor = (
            db.query(models.Actor).filter(models.Actor.handle == mention).one_or_none()
        )
        if not actor:
            actor_url = webfinger.get_actor_url(mention)
            if not actor_url:
                # FIXME(ts): raise an error?
                continue
            actor = fetch_actor(db, actor_url)

        tags.append(dict(type="Mention", href=actor.url, name=mention))

        d = f"@{domain}"
        if hide_domain:
            d = ""

        link = f'<span class="h-card"><a href="{actor.url}" class="u-url mention">@<span>{username}</span>{d}</a></span>'  # noqa: E501
        content = content.replace(mention, link)
    return content, tags


def markdownify(
    db: Session,
    content: str,
    mentionify: bool = True,
    hashtagify: bool = True,
) -> tuple[str, list[dict[str, str]]]:
    """
    >>> content, tags = markdownify("Hello")

    """
    tags = []
    if hashtagify:
        content, hashtag_tags = _hashtagify(db, content)
        tags.extend(hashtag_tags)
    if mentionify:
        content, mention_tags = _mentionify(db, content)
        tags.extend(mention_tags)
    content = markdown(content, extensions=["mdx_linkify"])
    return content, tags
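A quick illustration of how the two regexes in `app/source.py` behave on sample content (the sample string is made up; note that `sorted(..., reverse=True)` on the deduplicated hashtags gives each tag a deterministic replacement order):

```python
import re

# Same patterns as _HASHTAG_REGEX / _MENTION_REGEX above
_HASHTAG_REGEX = re.compile(r"(#[\d\w]+)")
_MENTION_REGEX = re.compile(r"@[\d\w_.+-]+@[\d\w-]+\.[\d\w\-.]+")

content = "Hello #fediverse from @dev@example.com #Python #fediverse"

# Duplicate tags collapse to one entry each
hashtags = sorted(set(re.findall(_HASHTAG_REGEX, content)), reverse=True)
# hashtags == ['#fediverse', '#Python']

mentions = re.findall(_MENTION_REGEX, content)
# mentions == ['@dev@example.com']

# The leading "@" makes split("@") yield an empty first element
_, username, domain = mentions[0].split("@")
# username == 'dev', domain == 'example.com'
```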
1
app/static/css/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
*.css
BIN
app/static/nopic.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 1.7 KiB
190
app/templates.py
Normal file
@@ -0,0 +1,190 @@
import base64
from datetime import datetime
from datetime import timezone
from functools import lru_cache
from typing import Any
from urllib.parse import urlparse

import bleach
import timeago  # type: ignore
from bs4 import BeautifulSoup  # type: ignore
from fastapi import Request
from fastapi.templating import Jinja2Templates
from sqlalchemy.orm import Session
from starlette.templating import _TemplateResponse as TemplateResponse

from app import models
from app.actor import LOCAL_ACTOR
from app.ap_object import Attachment
from app.boxes import public_outbox_objects_count
from app.config import DEBUG
from app.config import DOMAIN
from app.config import VERSION
from app.config import generate_csrf_token
from app.config import session_serializer
from app.database import now
from app.highlight import HIGHLIGHT_CSS
from app.highlight import highlight

_templates = Jinja2Templates(directory="app/templates")


def _filter_domain(text: str) -> str:
    hostname = urlparse(text).hostname
    if not hostname:
        raise ValueError(f"No hostname for {text}")
    return hostname


def _media_proxy_url(url: str | None) -> str:
    if not url:
        return "/static/nopic.png"

    if url.startswith(DOMAIN):
        return url

    encoded_url = base64.urlsafe_b64encode(url.encode()).decode()
    return f"/proxy/media/{encoded_url}"


def is_current_user_admin(request: Request) -> bool:
    is_admin = False
    session_cookie = request.cookies.get("session")
    if session_cookie:
        try:
            loaded_session = session_serializer.loads(
                session_cookie,
                max_age=3600 * 12,
            )
        except Exception:
            pass
        else:
            is_admin = loaded_session.get("is_logged_in")

    return is_admin


def render_template(
    db: Session,
    request: Request,
    template: str,
    template_args: dict[str, Any] = {},
) -> TemplateResponse:
    is_admin = is_current_user_admin(request)

    return _templates.TemplateResponse(
        template,
        {
            "request": request,
            "debug": DEBUG,
            "microblogpub_version": VERSION,
            "is_admin": is_admin,
            "csrf_token": generate_csrf_token() if is_admin else None,
            "highlight_css": HIGHLIGHT_CSS,
            "notifications_count": db.query(models.Notification)
            .filter(models.Notification.is_new.is_(True))
            .count()
            if is_admin
            else 0,
            "local_actor": LOCAL_ACTOR,
            "followers_count": db.query(models.Follower).count(),
            "following_count": db.query(models.Following).count(),
            "objects_count": public_outbox_objects_count(db),
            **template_args,
        },
    )


# HTML/templates helper
ALLOWED_TAGS = [
    "a",
    "abbr",
    "acronym",
    "b",
    "br",
    "blockquote",
    "code",
    "pre",
    "em",
    "i",
    "li",
    "ol",
    "strong",
    "sup",
    "sub",
    "del",
    "ul",
    "span",
    "div",
    "p",
    "h1",
    "h2",
    "h3",
    "h4",
    "h5",
    "h6",
    "table",
    "th",
    "tr",
    "td",
    "thead",
    "tbody",
    "tfoot",
    "colgroup",
    "caption",
    "img",
]

ALLOWED_ATTRIBUTES = {
    "a": ["href", "title"],
    "abbr": ["title"],
    "acronym": ["title"],
    "img": ["src", "alt", "title"],
}


@lru_cache(maxsize=256)
def _update_inline_imgs(content):
    soup = BeautifulSoup(content, "html5lib")
    imgs = soup.find_all("img")
    if not imgs:
        return content

    for img in imgs:
        if not img.attrs.get("src"):
            continue

        img.attrs["src"] = _media_proxy_url(img.attrs["src"])

    return soup.find("body").decode_contents()


def _clean_html(html: str) -> str:
    try:
        return bleach.clean(
            _update_inline_imgs(highlight(html)),
            tags=ALLOWED_TAGS,
            attributes=ALLOWED_ATTRIBUTES,
            strip=True,
        )
    except Exception:
        raise


def _timeago(original_dt: datetime) -> str:
    dt = original_dt
    if dt.tzinfo:
        dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
    return timeago.format(dt, now().replace(tzinfo=None))


def _has_media_type(attachment: Attachment, media_type_prefix: str) -> bool:
    return attachment.media_type.startswith(media_type_prefix)


_templates.env.filters["domain"] = _filter_domain
_templates.env.filters["media_proxy_url"] = _media_proxy_url
_templates.env.filters["clean_html"] = _clean_html
_templates.env.filters["timeago"] = _timeago
_templates.env.filters["has_media_type"] = _has_media_type
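`_media_proxy_url` wraps remote media URLs in URL-safe base64 so the `/proxy/media/` endpoint can decode the original URL straight back out of the path. A minimal round-trip sketch (the `media_proxy_path` helper and the sample URL are hypothetical):

```python
import base64


def media_proxy_path(url: str) -> str:
    # URL-safe base64 avoids "/" and "+" so the result is a single path segment
    encoded = base64.urlsafe_b64encode(url.encode()).decode()
    return f"/proxy/media/{encoded}"


path = media_proxy_path("https://example.com/image.png")
# The proxy endpoint can recover the original URL from the path segment:
original = base64.urlsafe_b64decode(path.removeprefix("/proxy/media/").encode()).decode()
# original == "https://example.com/image.png"
```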
13
app/templates/admin_new.html
Normal file
@@ -0,0 +1,13 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}

<form action="{{ request.url_for("admin_actions_new") }}" enctype="multipart/form-data" method="POST">
  {{ utils.embed_csrf_token() }}
  {{ utils.embed_redirect_url() }}
  <textarea name="content" rows="10" cols="50" autofocus="autofocus" designMode="on" style="font-size:1.2em;width:95%;"></textarea>
  <input name="files" type="file" multiple>
  <input type="submit" value="Publish">
</form>

{% endblock %}
25
app/templates/admin_stream.html
Normal file
@@ -0,0 +1,25 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}

{% for inbox_object in stream %}
  {% if inbox_object.ap_type == "Announce" %}
    {% if inbox_object.relates_to_inbox_object_id %}
      {{ utils.display_object(inbox_object.relates_to_inbox_object) }}
    {% else %}

    {% endif %}

  {% else %}
    {{ utils.display_object(inbox_object) }}
    {% if inbox_object.liked_via_outbox_object_ap_id %}
      {{ utils.admin_undo_button(inbox_object.liked_via_outbox_object_ap_id, "Unlike") }}
    {% else %}
      {{ utils.admin_like_button(inbox_object.ap_id) }}
    {% endif %}

    {{ utils.admin_announce_button(inbox_object.ap_id) }}
  {% endif %}
{% endfor %}

{% endblock %}
12
app/templates/followers.html
Normal file
@@ -0,0 +1,12 @@
{%- import "utils.html" as utils -%}
{% extends "layout.html" %}
{% block content %}
{% include "header.html" %}
<div id="followers">
  <ul>
    {% for follower in followers %}
      <li>{{ utils.display_actor(follower.actor, actors_metadata) }}</li>
    {% endfor %}
  </ul>
</div>
{% endblock %}
12
app/templates/following.html
Normal file
@@ -0,0 +1,12 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}
{% include "header.html" %}
<div id="following">
  <ul>
    {% for follow in following %}
      <li>{{ utils.display_actor(follow.actor, actors_metadata) }}</li>
    {% endfor %}
  </ul>
</div>
{% endblock %}
31
app/templates/header.html
Normal file
@@ -0,0 +1,31 @@
<header id="header">

  <div class="h-card p-author">
    <data class="u-photo" value="{{ local_actor.icon_url }}"></data>
    <a href="{{ local_actor.url }}" class="u-url u-uid no-hover title">
      <span style="font-size:1.1em;">{{ local_actor.name }}</span>
      <span style="font-size:0.85em;" class="subtitle-username">{{ local_actor.handle }}</span>
    </a>

    <div class="p-note summary">
      {{ local_actor.summary | safe }}
    </div>

  </div>

  {%- macro header_link(url, text) -%}
  {% set url_for = request.url_for(url) %}
  <a href="{{ url_for }}" {% if request.url == url_for %}class="active"{% endif %}>{{ text }}</a>
  {% endmacro %}

  <div style="margin:30px 0;">
    <nav class="flexbox">
      <ul>
        <li>{{ header_link("index", "Notes") }} <span>{{ objects_count }}</span></li>
        <li>{{ header_link("followers", "Followers") }} <span>{{ followers_count }}</span></li>
        <li>{{ header_link("following", "Following") }} <span>{{ following_count }}</span></li>
      </ul>
    </nav>
  </div>

</header>
14
app/templates/index.html
Normal file
@@ -0,0 +1,14 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}
{% include "header.html" %}

{% for outbox_object in objects %}
  {{ outbox_object.likes_count }}
  {{ outbox_object.announces_count }}
  {{ utils.display_object(outbox_object) }}
{% endfor %}

{% endblock %}
46
app/templates/layout.html
Normal file
@@ -0,0 +1,46 @@
<!DOCTYPE HTML>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta http-equiv="x-ua-compatible" content="ie=edge">
  <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
  <link rel="stylesheet" href="/static/css/main.css">
  <style>
    {{ highlight_css }}
  </style>
</head>
<body>
<div id="main">
  <main>
    {% if is_admin %}
    <div id="admin">
      {% macro admin_link(url, text) %}
      {% set url_for = request.url_for(url) %}
      <a href="{{ url_for }}" {% if request.url == url_for %}class="active"{% endif %}>{{ text }}</a>
      {% endmacro %}
      <div style="margin-bottom:30px;">
        <nav class="flexbox">
          <ul>
            <li>Admin</li>
            <li>{{ admin_link("index", "Public") }}</li>
            <li>{{ admin_link("admin_new", "New") }}</li>
            <li>{{ admin_link("stream", "Stream") }}</li>
            <li>{{ admin_link("get_notifications", "Notifications") }} {% if notifications_count %}({{ notifications_count }}){% endif %}</li>
            <li>{{ admin_link("get_lookup", "Lookup") }}</li>
            <li><a href="">Bookmarks</a></li>
            <li><a href="{{ request.url_for("logout") }}">Logout</a></li>
          </ul>
        </nav>
      </div>

    </div>
    {% endif %}
    {% block content %}{% endblock %}
  </main>
</div>

<footer class="footer">
  Powered by <a href="https://microblog.pub">microblog.pub</a> <small class="microblogpub-version"><code>{{ microblogpub_version }}</code></small> (<a href="https://github.com/tsileo/microblog.pub">source code</a>) and the <a href="https://activitypub.rocks/">ActivityPub</a> protocol
</footer>
</body>
</html>
13
app/templates/login.html
Normal file
@@ -0,0 +1,13 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}
<div style="display:grid;height:80%;">
<div style="margin:auto;">
<form action="/admin/login" method="POST">
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
<input type="password" name="password">
<input type="submit" value="Login">
</form>
</div>
</div>
{% endblock %}
14
app/templates/lookup.html
Normal file
@@ -0,0 +1,14 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}
<form action="{{ url_for("get_lookup") }}" method="GET">
<input type="text" name="query" value="{{ query if query else "" }}">
<input type="submit" value="Lookup">
</form>
{{ actors_metadata }}
{% if ap_object and ap_object.ap_type == "Person" %}
{{ utils.display_actor(ap_object, actors_metadata) }}
{% elif ap_object %}
{{ utils.display_object(ap_object) }}
{% endif %}
{% endblock %}
45
app/templates/notifications.html
Normal file
@@ -0,0 +1,45 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}
<h2>Notifications</h2>
<div id="notifications">
{%- for notif in notifications %}
<div>
{%- if notif.notification_type.value == "new_follower" %}
<div title="{{ notif.created_at.isoformat() }}">
<a style="font-weight:bold;" href="{{ notif.actor.url }}">{{ notif.actor.name or notif.actor.preferred_username }}</a> followed you
</div>
{{ utils.display_actor(notif.actor, actors_metadata) }}
{% elif notif.notification_type.value == "unfollow" %}
<div title="{{ notif.created_at.isoformat() }}">
<a style="font-weight:bold;" href="{{ notif.actor.url }}">{{ notif.actor.name or notif.actor.preferred_username }}</a> unfollowed you
</div>
{{ utils.display_actor(notif.actor, actors_metadata) }}
{% elif notif.notification_type.value == "like" %}
<div title="{{ notif.created_at.isoformat() }}">
<a style="font-weight:bold;" href="{{ notif.actor.url }}">{{ notif.actor.name or notif.actor.preferred_username }}</a> liked a post
</div>
{{ utils.display_object(notif.outbox_object) }}
{% elif notif.notification_type.value == "undo_like" %}
<div title="{{ notif.created_at.isoformat() }}">
<a style="font-weight:bold;" href="{{ notif.actor.url }}">{{ notif.actor.name or notif.actor.preferred_username }}</a> un-liked a post
</div>
{{ utils.display_object(notif.outbox_object) }}
{% elif notif.notification_type.value == "announce" %}
<div title="{{ notif.created_at.isoformat() }}">
<a style="font-weight:bold;" href="{{ notif.actor.url }}">{{ notif.actor.name or notif.actor.preferred_username }}</a> boosted a post
</div>
{{ utils.display_object(notif.outbox_object) }}
{% elif notif.notification_type.value == "undo_announce" %}
<div title="{{ notif.created_at.isoformat() }}">
<a style="font-weight:bold;" href="{{ notif.actor.url }}">{{ notif.actor.name or notif.actor.preferred_username }}</a> un-boosted a post
</div>
{{ utils.display_object(notif.outbox_object) }}
{% else %}
{{ notif }}
{%- endif %}
</div>
{%- endfor %}
</div>
{% endblock %}
8
app/templates/object.html
Normal file
@@ -0,0 +1,8 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block content %}
{% include "header.html" %}

{{ utils.display_object(outbox_object) }}

{% endblock %}
143
app/templates/utils.html
Normal file
@@ -0,0 +1,143 @@
{% macro embed_csrf_token() %}
<input type="hidden" name="csrf_token" value="{{ csrf_token }}">
{% endmacro %}

{% macro embed_redirect_url() %}
<input type="hidden" name="redirect_url" value="{{ request.url }}">
{% endmacro %}

{% macro admin_follow_button(actor) %}
<form action="{{ request.url_for("admin_actions_follow") }}" method="POST">
{{ embed_csrf_token() }}
{{ embed_redirect_url() }}
<input type="hidden" name="ap_actor_id" value="{{ actor.ap_id }}">
<input type="submit" value="Follow">
</form>
{% endmacro %}

{% macro admin_like_button(ap_object_id) %}
<form action="{{ request.url_for("admin_actions_like") }}" method="POST">
{{ embed_csrf_token() }}
{{ embed_redirect_url() }}
<input type="hidden" name="ap_object_id" value="{{ ap_object_id }}">
<input type="submit" value="Like">
</form>
{% endmacro %}

{% macro admin_announce_button(ap_object_id) %}
<form action="{{ request.url_for("admin_actions_announce") }}" method="POST">
{{ embed_csrf_token() }}
{{ embed_redirect_url() }}
<input type="hidden" name="ap_object_id" value="{{ ap_object_id }}">
<input type="submit" value="Announce">
</form>
{% endmacro %}

{% macro admin_undo_button(ap_object_id, action="Undo") %}
<form action="{{ request.url_for("admin_actions_undo") }}" method="POST">
{{ embed_csrf_token() }}
{{ embed_redirect_url() }}
<input type="hidden" name="ap_object_id" value="{{ ap_object_id }}">
<input type="submit" value="{{ action }}">
</form>
{% endmacro %}

{% macro sensitive_button(permalink_id) %}
<form action="" method="GET">
<input type="hidden" name="show_sensitive" value="{{ permalink_id }}">
{% for k, v in request.query_params.items() %}
<input type="hidden" name="{{ k }}" value="{{ v }}">
{% endfor %}
<button type="submit">display sensitive content</button>
</form>
{% endmacro %}

{% macro display_actor(actor, actors_metadata) %}
{{ actors_metadata }}
{% set metadata = actors_metadata.get(actor.ap_id) %}
<div style="display: flex;column-gap: 20px;margin:20px 0 10px 0;" class="actor-box">
<div style="flex: 0 0 48px;">
<img src="{{ actor.icon_url | media_proxy_url }}" style="max-width:45px;">
</div>
<a href="{{ actor.url }}" style="">
<div><strong>{{ actor.name or actor.preferred_username }}</strong></div>
<div>{{ actor.handle }}</div>
</a>
</div>
{% if metadata %}
<div>
<nav class="flexbox">
<ul>
<li>
{% if metadata.is_following %}already following {{ admin_undo_button(metadata.outbox_follow_ap_id, "Unfollow") }}
{% elif metadata.is_follow_request_sent %}follow request sent
{% else %}
{{ admin_follow_button(actor) }}
{% endif %}
</li>
<li>
{% if metadata.is_follower %}follows you{% else %}
{% endif %}
</li>
</ul>
</nav>
</div>
{% endif %}

{% endmacro %}

{% macro display_object(object) %}
{% if object.ap_type in ["Note", "Article", "Video"] %}
<div class="activity-wrap" id="{{ object.permalink_id }}">
<div class="activity-content">
<img src="{% if object.actor.icon_url %}{{ object.actor.icon_url | media_proxy_url }}{% else %}/static/nopic.png{% endif %}" alt="" class="actor-icon">
<div class="activity-header">
<strong>{{ object.actor.name or object.actor.preferred_username }}</strong>
<span>{{ object.actor.handle }}</span>
<span class="activity-date" title="{{ object.ap_published_at.isoformat() }}">
<a href="{{ object.url }}">{{ object.ap_published_at | timeago }}</a>
</span>
<div class="activity-main">
{{ object.content | clean_html | safe }}
</div>
</div>
</div>
{% if object.attachments and object.sensitive and not request.query_params["show_sensitive"] == object.permalink_id %}
<div class="activity-attachment">
{{ sensitive_button(object.permalink_id) }}
</div>
{% endif %}
{% if object.attachments and (not object.sensitive or (object.sensitive and request.query_params["show_sensitive"] == object.permalink_id)) %}
<div class="activity-attachment">
{% for attachment in object.attachments %}
{% if attachment.type == "Image" or (attachment | has_media_type("image")) %}
<img src="{{ attachment.url | media_proxy_url }}"{% if attachment.name %} alt="{{ attachment.name }}"{% endif %} class="attachment">
{% elif attachment.type == "Video" or (attachment | has_media_type("video")) %}
<video controls preload="metadata" src="{{ attachment.url | media_proxy_url }}"{% if attachment.name %} title="{{ attachment.name }}"{% endif %} class="attachment"></video>
{% elif attachment.type == "Audio" or (attachment | has_media_type("audio")) %}
<audio controls preload="metadata" src="{{ attachment.url | media_proxy_url }}"{% if attachment.name %} title="{{ attachment.name }}"{% endif %} style="width:480px;" class="attachment"></audio>
{% else %}
<a href="{{ attachment.url | media_proxy_url }}"{% if attachment.name %} title="{{ attachment.name }}"{% endif %} class="attachment">{{ attachment.url }}</a>
{% endif %}
{% endfor %}
</div>
{% endif %}
<div class="activity-bar">
<div class="bar-item">
<div class="comment-count">33</div>
</div>
<div class="bar-item">
<div class="retweet-count">397</div>
</div>
<div class="bar-item">
<div class="likes-count">
2.6k
</div>
</div>
</div>
</div>
{% endif %}
{% endmacro %}
61
app/urlutils.py
Normal file
@@ -0,0 +1,61 @@
import functools
import ipaddress
import socket
from urllib.parse import urlparse

from loguru import logger

from app.config import DEBUG


class InvalidURLError(Exception):
    pass


@functools.lru_cache
def _getaddrinfo(hostname: str, port: int) -> str:
    try:
        ip_address = str(ipaddress.ip_address(hostname))
    except ValueError:
        try:
            ip_address = socket.getaddrinfo(hostname, port)[0][4][0]
            logger.debug(f"DNS lookup: {hostname} -> {ip_address}")
        except socket.gaierror:
            logger.exception(f"failed to lookup addr info for {hostname}")
            raise

    return ip_address


def is_url_valid(url: str) -> bool:
    """Implements basic SSRF protection."""
    parsed = urlparse(url)
    if parsed.scheme not in ["http", "https"]:
        return False

    # XXX in debug mode, we want to allow requests to localhost to test the
    # federation with local instances
    if DEBUG:  # pragma: no cover
        return True

    if not parsed.hostname or parsed.hostname.lower() in ["localhost"]:
        return False

    ip_address = _getaddrinfo(
        parsed.hostname, parsed.port or (80 if parsed.scheme == "http" else 443)
    )
    logger.debug(f"{ip_address=}")

    if ipaddress.ip_address(ip_address).is_private:
        logger.info(f"rejecting private URL {url} -> {ip_address}")
        return False

    return True


def check_url(url: str, debug: bool = False) -> None:
    logger.debug(f"check_url {url=}")
    if not is_url_valid(url):
        raise InvalidURLError(f'"{url}" is invalid')

    return None
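The heart of the SSRF guard above is resolving the hostname first and only then testing the resulting address: `ipaddress.ip_address(...).is_private` already covers loopback, RFC 1918, and link-local ranges. A minimal, standalone sketch of that check (`is_private_address` is a name introduced here for illustration; `is_url_valid` inlines this logic):

```python
import ipaddress


def is_private_address(ip: str) -> bool:
    # is_private is True for loopback (127.0.0.0/8), RFC 1918 ranges
    # (10/8, 172.16/12, 192.168/16), and link-local (169.254/16) --
    # exactly the destinations an SSRF guard wants to reject.
    return ipaddress.ip_address(ip).is_private
```

This is also why the module resolves DNS itself instead of trusting the hostname: a public-looking name can resolve to a private address.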
79
app/webfinger.py
Normal file
@@ -0,0 +1,79 @@
from typing import Any
from urllib.parse import urlparse

import httpx
from loguru import logger

from app import activitypub as ap


def webfinger(
    resource: str,
) -> dict[str, Any] | None:  # noqa: C901
    """Mastodon-like WebFinger resolution to retrieve the activity stream Actor URL."""
    logger.info(f"performing webfinger resolution for {resource}")
    protos = ["https", "http"]
    if resource.startswith("http://"):
        protos.reverse()
        host = urlparse(resource).netloc
    elif resource.startswith("https://"):
        host = urlparse(resource).netloc
    else:
        if resource.startswith("acct:"):
            resource = resource[5:]
        if resource.startswith("@"):
            resource = resource[1:]
        _, host = resource.split("@", 1)
        resource = "acct:" + resource

    is_404 = False

    for i, proto in enumerate(protos):
        try:
            url = f"{proto}://{host}/.well-known/webfinger"
            resp = ap.get(url, params={"resource": resource})
            break
        except httpx.HTTPStatusError as http_error:
            logger.exception("HTTP error")
            if http_error.response.status_code in [403, 404, 410]:
                is_404 = True
                continue
            raise
        except httpx.HTTPError:
            logger.exception("req failed")
            # If we tried https first and the domain is "http only"
            if i == 0:
                continue
            break
    if is_404:
        return None

    return resp


def get_remote_follow_template(resource: str) -> str | None:
    data = webfinger(resource)
    if data is None:
        return None
    for link in data["links"]:
        if link.get("rel") == "http://ostatus.org/schema/1.0/subscribe":
            return link.get("template")
    return None


def get_actor_url(resource: str) -> str | None:
    """Mastodon-like WebFinger resolution to retrieve the activity stream Actor URL.

    Returns:
        the Actor URL or None if the resolution failed.
    """
    data = webfinger(resource)
    if data is None:
        return None
    for link in data["links"]:
        if (
            link.get("rel") == "self"
            and link.get("type") == "application/activity+json"
        ):
            return link.get("href")
    return None
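The top of `webfinger()` normalizes the many spellings a resource can arrive in: bare handles like `@user@example.com` become `acct:user@example.com`, while full URLs keep their form, and in every case the host part is where the `/.well-known/webfinger` request goes. A standalone sketch of that normalization (`normalize_resource` is a hypothetical helper name; the module inlines this logic):

```python
from urllib.parse import urlparse


def normalize_resource(resource: str) -> tuple[str, str]:
    """Return (resource-to-query, host-to-query-it-on), mirroring webfinger()."""
    if resource.startswith(("http://", "https://")):
        # URL resources are passed through as-is; only the host is extracted.
        return resource, urlparse(resource).netloc
    # Strip an optional "acct:" prefix and leading "@" to get user@host...
    if resource.startswith("acct:"):
        resource = resource[5:]
    if resource.startswith("@"):
        resource = resource[1:]
    _, host = resource.split("@", 1)
    # ...then re-add the canonical "acct:" scheme expected by WebFinger servers.
    return "acct:" + resource, host
```

Trying `https` before `http` (and reversing the order for explicit `http://` resources) then makes the lookup work against both TLS-only and plain-HTTP test instances.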
8
boussole.json
Normal file
@@ -0,0 +1,8 @@
{
    "SOURCES_PATH": "scss",
    "TARGET_PATH": "app/static/css",
    "LIBRARY_PATHS": [],
    "OUTPUT_STYLES": "nested",
    "SOURCE_COMMENTS": false,
    "EXCLUDES": []
}
3
data/.gitignore
vendored
Normal file
@@ -0,0 +1,3 @@
*
!uploads/
!.gitignore
2
data/uploads/.gitignore
vendored
Normal file
@@ -0,0 +1,2 @@
*
!.gitignore
1697
poetry.lock
generated
Normal file
File diff suppressed because it is too large
68
pyproject.toml
Normal file
@@ -0,0 +1,68 @@
[tool.poetry]
name = "microblogpub"
version = "2.0.0"
description = ""
authors = ["Thomas Sileo <t@a4.io>"]
license = "AGPL-3.0"

[tool.poetry.dependencies]
python = "^3.10"
Jinja2 = "^3.1.2"
fastapi = "^0.78.0"
uvicorn = "^0.17.6"
pycryptodome = "^3.14.1"
bcrypt = "^3.2.2"
itsdangerous = "^2.1.2"
python-multipart = "^0.0.5"
tomli = "^2.0.1"
httpx = "^0.23.0"
timeago = "^1.0.15"
SQLAlchemy = {extras = ["mypy"], version = "^1.4.37"}
alembic = "^1.8.0"
bleach = "^5.0.0"
requests = "^2.27.1"
Markdown = "^3.3.7"
prompt-toolkit = "^3.0.29"
tomli-w = "^1.0.0"
python-dateutil = "^2.8.2"
bs4 = "^0.0.1"
html5lib = "^1.1"
mf2py = "^1.1.2"
Pygments = "^2.12.0"
types-python-dateutil = "^2.8.17"
loguru = "^0.6.0"
mdx-linkify = "^2.1"

[tool.poetry.dev-dependencies]
black = "^22.3.0"
flake8 = "^4.0.1"
mypy = "^0.960"
isort = "^5.10.1"
types-requests = "^2.27.29"
invoke = "^1.7.1"
libsass = "^0.21.0"
pytest = "^7.1.2"
respx = "^0.19.2"
boussole = "^2.0.0"
types-bleach = "^5.0.2"
types-Markdown = "^3.3.28"
factory-boy = "^3.2.1"
pytest-asyncio = "^0.18.3"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

[tool.isort]
profile = "black"

[tool.mypy]
exclude = ["alembic/versions/"]
plugins = ["sqlalchemy.ext.mypy.plugin", "pydantic.mypy"]

[tool.black]
extend-exclude = '''
/(
  | alembic/versions
)/
'''
85
scripts/config_wizard.py
Normal file
@@ -0,0 +1,85 @@
"""Basic wizard for setting up microblog.pub configuration files."""
import os
import sys
from pathlib import Path
from typing import Any

import bcrypt
import tomli_w
from markdown import markdown  # type: ignore
from prompt_toolkit import prompt

from app.key import generate_key
from app.key import key_exists


def main() -> None:
    print("Welcome to microblog.pub setup wizard\n")
    print("Generating key...")
    if key_exists():
        yn = ""
        while yn not in ["y", "n"]:
            yn = prompt(
                "WARNING, a key already exists, overwrite it? (y/n): ", default="n"
            ).lower()
        if yn == "y":
            generate_key()
    else:
        generate_key()

    config_file = Path("data/me.toml")

    if config_file.exists():
        # Spit out the relative path for the "config artifacts"
        rconfig_file = "data/me.toml"
        print(
            f"Existing setup detected, please delete {rconfig_file} "
            "before restarting the wizard"
        )
        sys.exit(2)

    dat: dict[str, Any] = {}
    print("Your identity will be @{username}@{domain}")
    dat["domain"] = prompt("domain: ")
    dat["username"] = prompt("username: ")
    dat["admin_password"] = bcrypt.hashpw(
        prompt("admin password: ", is_password=True).encode(), bcrypt.gensalt()
    ).decode()
    dat["name"] = prompt("name (e.g. John Doe): ", default=dat["username"])
    dat["summary"] = markdown(
        prompt(
            (
                "summary (short description, in markdown, "
                "press [ESC] then [ENTER] to submit):\n"
            ),
            multiline=True,
        )
    )
    dat["https"] = True
    proto = "https"
    yn = ""
    while yn not in ["y", "n"]:
        yn = prompt("will the site be served via https? (y/n): ", default="y").lower()
    if yn == "n":
        dat["https"] = False
        proto = "http"

    print("Note that you can put your icon/avatar in the static/ directory")
    dat["icon_url"] = prompt(
        "icon URL: ", default=f'{proto}://{dat["domain"]}/static/nopic.png'
    )
    dat["secret"] = os.urandom(16).hex()

    with config_file.open("w") as f:
        f.write(tomli_w.dumps(dat))

    print("Done")
    sys.exit(0)


if __name__ == "__main__":
    try:
        main()
    except KeyboardInterrupt:
        print("Aborted")
        sys.exit(1)
135
scss/main.scss
Normal file
@@ -0,0 +1,135 @@
body {
  margin: 0;
  padding: 0;
  display: flex;
  min-height: 100vh;
  flex-direction: column;
}
#main {
  flex: 1;
}
main {
  max-width: 800px;
  margin: 20px auto;
}
footer {
  max-width: 800px;
  margin: 20px auto;
}
#notifications, #followers, #following {
  ul {
    list-style-type: none;
    margin: 0;
    padding: 0;
  }
  li {
    display: inline-block;
  }
}
.actor-box {
  a {
    text-decoration: none;
  }
}

#admin {
  .navbar {
    display: grid;
    grid-template-rows: auto;
    grid-template-columns: 1fr;
    grid-auto-flow: dense;
    justify-items: stretch;
    align-items: stretch;
    column-gap: 20px;
  }

  .logo {
    grid-column: -3;
    padding: 5px;
  }

  .menus {
    display: flex;
    flex-direction: row;
    justify-content: start;
    grid-column: 1;
  }

  .menus * {
    padding: 5px;
  }
}

nav.flexbox {
  display: flex;
  justify-content: space-between;
  align-items: center;

  ul {
    display: flex;
    align-items: center;
    list-style-type: none;
    margin: 0;
    padding: 0;
  }

  ul li {
    margin-right: 20px;

    &:last-child {
      margin-right: 0px;
    }
  }
}
#admin {
  a.active {
    font-weight: bold;
    text-decoration: none;
  }
}

.activity-wrap {
  margin: 0 auto;
  padding: 30px 0;
  .actor-icon {
    width: 48px;
    margin-right: 15px;
    img {
      max-width: 48px;
    }
  }
  .activity-content {
    display: flex;
    align-items: flex-start;
    .activity-header {
      width: 100%;
      strong {
        font-weight: bold;
      }
      span {
        font-weight: normal;
        margin-left: 5px;
      }
      .activity-date { float: right; }
    }
  }
  .activity-attachment {
    padding-left: 60px;
    img, audio, video {
      width: 100%;
      max-width: 740px;
      margin: 30px 0;
    }
  }
  .activity-bar {
    display: flex;
    margin-left: 60px;
    margin-top: 10px;
    .bar-item {
      display: flex;
      margin-right: 20px;
    }
  }
}
67
tasks.py
Normal file
@@ -0,0 +1,67 @@
from typing import Optional

from invoke import Context  # type: ignore
from invoke import run  # type: ignore
from invoke import task  # type: ignore


@task
def generate_db_migration(ctx, message):
    # type: (Context, str) -> None
    run(f'poetry run alembic revision --autogenerate -m "{message}"', echo=True)


@task
def migrate_db(ctx):
    # type: (Context) -> None
    run("poetry run alembic upgrade head", echo=True)


@task
def autoformat(ctx):
    # type: (Context) -> None
    run("black .", echo=True)
    run("isort -sl .", echo=True)


@task
def lint(ctx):
    # type: (Context) -> None
    run("black --check .", echo=True)
    run("isort -sl --check-only .", echo=True)
    run("flake8 .", echo=True)
    run("mypy .", echo=True)


@task
def compile_scss(ctx, watch=False):
    # type: (Context, bool) -> None
    if watch:
        run("poetry run boussole watch", echo=True)
    else:
        run("poetry run boussole compile", echo=True)


@task
def uvicorn(ctx):
    # type: (Context) -> None
    run("poetry run uvicorn app.main:app --no-server-header", pty=True, echo=True)


@task
def process_outgoing_activities(ctx):
    # type: (Context) -> None
    run("poetry run python app/process_outgoing_activities.py", pty=True, echo=True)


@task
def tests(ctx, k=None):
    # type: (Context, Optional[str]) -> None
    pytest_args = " -vvv"
    if k:
        pytest_args += f" -k {k}"
    run(
        f"MICROBLOGPUB_CONFIG_FILE=tests.toml pytest tests{pytest_args}",
        pty=True,
        echo=True,
    )
0
tests/__init__.py
Normal file
49
tests/conftest.py
Normal file
@@ -0,0 +1,49 @@
from typing import Generator

import pytest
from fastapi.testclient import TestClient
from sqlalchemy import orm

from app.database import Base
from app.database import engine
from app.database import get_db
from app.main import app

_Session = orm.sessionmaker(bind=engine, autocommit=False, autoflush=False)


def _get_db_for_testing() -> Generator[orm.Session, None, None]:
    session = _Session()
    try:
        yield session
    finally:
        session.close()


@pytest.fixture
def db() -> Generator:
    Base.metadata.create_all(bind=engine)
    yield orm.scoped_session(orm.sessionmaker(bind=engine))
    try:
        Base.metadata.drop_all(bind=engine)
    except Exception:
        # XXX: for some reason, the teardown occasionally fails because of this
        pass


@pytest.fixture
def exclude_fastapi_middleware():
    """Workaround for https://github.com/encode/starlette/issues/472"""
    user_middleware = app.user_middleware.copy()
    app.user_middleware = []
    app.middleware_stack = app.build_middleware_stack()
    yield
    app.user_middleware = user_middleware
    app.middleware_stack = app.build_middleware_stack()


@pytest.fixture
def client(db, exclude_fastapi_middleware) -> Generator:
    app.dependency_overrides[get_db] = _get_db_for_testing
    with TestClient(app) as c:
        yield c
140
tests/factories.py
Normal file
140
tests/factories.py
Normal file
|
@ -0,0 +1,140 @@
|
||||||
|
from uuid import uuid4

import factory  # type: ignore
from Crypto.PublicKey import RSA
from sqlalchemy import orm

from app import activitypub as ap
from app import actor
from app import models
from app.actor import RemoteActor
from app.ap_object import RemoteObject
from app.database import engine

_Session = orm.scoped_session(orm.sessionmaker(bind=engine))


def generate_key() -> tuple[str, str]:
    k = RSA.generate(1024)
    return k.exportKey("PEM").decode(), k.publickey().exportKey("PEM").decode()


def build_follow_activity(
    from_remote_actor: actor.RemoteActor,
    for_remote_actor: actor.RemoteActor,
    outbox_public_id: str | None = None,
) -> ap.RawObject:
    return {
        "@context": ap.AS_CTX,
        "type": "Follow",
        "id": from_remote_actor.ap_id + "/follow/" + (outbox_public_id or uuid4().hex),
        "actor": from_remote_actor.ap_id,
        "object": for_remote_actor.ap_id,
    }


def build_accept_activity(
    from_remote_actor: actor.RemoteActor,
    for_remote_object: RemoteObject,
    outbox_public_id: str | None = None,
) -> ap.RawObject:
    return {
        "@context": ap.AS_CTX,
        "type": "Accept",
        "id": from_remote_actor.ap_id + "/accept/" + (outbox_public_id or uuid4().hex),
        "actor": from_remote_actor.ap_id,
        "object": for_remote_object.ap_id,
    }


class BaseModelMeta:
    sqlalchemy_session = _Session
    sqlalchemy_session_persistence = "commit"


class RemoteActorFactory(factory.Factory):
    class Meta:
        model = RemoteActor
        exclude = (
            "base_url",
            "username",
            "public_key",
        )

    class Params:
        icon_url = None
        summary = "I like unit tests"

    ap_actor = factory.LazyAttribute(
        lambda o: {
            "@context": ap.AS_CTX,
            "type": "Person",
            "id": o.base_url,
            "following": o.base_url + "/following",
            "followers": o.base_url + "/followers",
            # "featured": ID + "/featured",
            "inbox": o.base_url + "/inbox",
            "outbox": o.base_url + "/outbox",
            "preferredUsername": o.username,
            "name": o.username,
            "summary": o.summary,
            "endpoints": {},
            "url": o.base_url,
            "manuallyApprovesFollowers": False,
            "attachment": [],
            "icon": {},
            "publicKey": {
                "id": f"{o.base_url}#main-key",
                "owner": o.base_url,
                "publicKeyPem": o.public_key,
            },
        }
    )


class ActorFactory(factory.alchemy.SQLAlchemyModelFactory):
    class Meta(BaseModelMeta):
        model = models.Actor

    # ap_actor
    # ap_id
    ap_type = "Person"

    @classmethod
    def from_remote_actor(cls, ra):
        return cls(
            ap_type=ra.ap_type,
            ap_actor=ra.ap_actor,
            ap_id=ra.ap_id,
        )


class OutboxObjectFactory(factory.alchemy.SQLAlchemyModelFactory):
    class Meta(BaseModelMeta):
        model = models.OutboxObject

    # public_id
    # relates_to_inbox_object_id
    # relates_to_outbox_object_id

    @classmethod
    def from_remote_object(cls, public_id, ro):
        return cls(
            public_id=public_id,
            ap_type=ro.ap_type,
            ap_id=ro.ap_id,
            ap_context=ro.context,
            ap_object=ro.ap_object,
            visibility=ro.visibility,
            og_meta=ro.og_meta,
            activity_object_ap_id=ro.activity_object_ap_id,
            is_hidden_from_homepage=True if ro.in_reply_to else False,
        )


class OutgoingActivityFactory(factory.alchemy.SQLAlchemyModelFactory):
    class Meta(BaseModelMeta):
        model = models.OutgoingActivity

    # recipient
    # outbox_object_id
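`build_follow_activity` and `build_accept_activity` pair up through the activity id: the Accept's `object` field must point back at the Follow's `id`, which is how the receiving side correlates the two. A standalone sketch of that pairing, using a hard-coded `AS_CTX` value and hypothetical actor URLs (the real helpers take `RemoteActor`/`RemoteObject` instances):

```python
from uuid import uuid4

AS_CTX = "https://www.w3.org/ns/activitystreams"  # assumed value of ap.AS_CTX


def build_follow(actor_id: str, target_id: str) -> dict:
    # Mirrors build_follow_activity: unique id minted under the sender's namespace
    return {
        "@context": AS_CTX,
        "type": "Follow",
        "id": f"{actor_id}/follow/{uuid4().hex}",
        "actor": actor_id,
        "object": target_id,
    }


def build_accept(actor_id: str, follow: dict) -> dict:
    # Mirrors build_accept_activity: the Accept references the Follow by its id
    return {
        "@context": AS_CTX,
        "type": "Accept",
        "id": f"{actor_id}/accept/{uuid4().hex}",
        "actor": actor_id,
        "object": follow["id"],
    }


follow = build_follow("https://example.com/alice", "https://example.com/bob")
accept = build_accept("https://example.com/bob", follow)
```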
27
tests/test.key
Normal file
@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEowIBAAKCAQEAvYhynEC0l2WVpXoPutfhhZHEeQyyoHiMszOfl1EHM50V0xOC
XCoXd/i5Hsa6dWswyjftOtSmdknY5Whr6LatwNu+i/tlsjmHSGgdhUxLhbj4Xc5T
LQWxDbS1cg49IwSZFYSIrBw2yfPI3dpMNzYvBt8CKAk0zodypHzdfSKPbSRIyBAy
SuG+mJsxsg9tx9CgWNrizauj/zVSWa/cRvNTvIwlxs1J516QJ0px3NygKqPMP2I4
zNkhKFzaNDLzuv4zMsW8UNoM+Mlpf6+NbHQycUC9gIqywrP21E7YFmdljyr5cAfr
qn+KgDsQTpDSINFE1oUanY0iadKvFXjD9uQLfwIDAQABAoIBAAtqK1TjxLyVfqS/
rDDZjZiIxedwb1WgzQCB7GulkqR2Inla5G/+jPlJvoRu/Y3SzdZv9dakNf5LxkdS
uaUDU4WY9mnh0ycftdkThCuiA65jDHpB0dqVTCuCJadf2ijAvyN/nueWr2oMR52s
5wgwODbWuX+Fxmtl1u63InPF4BN3kEQcGP4pgXMiQ2QEwjxMubG7fZTuHFChsZMZ
0QyHy0atmauK8+1FeseoZv7LefgjE+UhAKnIz5z/Ij4erGRaWJUKe5YS7i8nTT6M
W+SJ/gs/l6vOUmrqHZaXsp29pvseY23akgGnZciHJfuj/vxMJjGfZVM2ls+MUkh4
tdEZ0NECgYEAxRGcRxhQyOdiohcsH4efG03mB7u+JBuvt33oFXWOCpW7lenAr9qg
3hm30lZq95ST3XilqGldgIW2zpHCkSLXk/lsJteNC9EEk8HuTDJ7Gd4SBiXisELd
IY147SJu5KXN/kaGoDMgMCGcR7Qkr6hzsRT3308A6nMNZG0viyUMzicCgYEA9jXx
WaLe0PC8pT/yAyPJnYerSOofv+vz+3KNlopBTSRsREsCpdbyOnGCXa4bechj29Lv
0QCbQMkga2pXUPNszdUz7L0LnAi8DZhKumPxyz82kcZSxSCGsvwp9kZju/LPCIHo
j1wKW92/w47QXdzCVjgkKbDAGsSwzphEJOuMhukCgYBUKl9KZfIqu9f+TlND7BJi
APUbnG1q0oBLp/R1Jc3Sa3zAXCM1d/R4pxdBODNbJhO45QwrT0Tl3TXkJ5Cnl+/m
fQJZ3Hma8Fw6FvuFg5HbzGJ6Sbf1e7kh2WAqNyiRctb1oH1i8jLvG4u5fBCnDRTM
Lp5mu0Ey4Ix5tcA2d05uxQKBgQDDBiePIPvt9UL4gpZo9kgViAmdUBamJ3izjCGr
RQhE2r0Hu4L1ajWlJZRmMCuDY7/1uDhODXTs9GPBshJIBQoCYQcoVvaDOkf7XM6U
peY5YHERN08I5qLL1AJJGaiWj9Z+nqhgJj/uVNA5Tz6tmtg1A3Nhsqf4jCShAOu5
cvt1QQKBgH2Lg/o9KpFLeZLVXQzW3GFB7RzDetSDbpdhBBE3o/HAtrX0foEqYfKx
JuPrlGR2L6Q8jSw7AvFErkx5g5kCgdN8mOYjCe/EsL3ctIatqaoGDrjfvgWAeanW
XxMcVRlcMFzp5XB0VQhG0nP9uvHm/eIw/izN2JN7gz3ZZp84lq3S
-----END RSA PRIVATE KEY-----
46
tests/test_actor.py
Normal file
@@ -0,0 +1,46 @@
import httpx
import respx

from app import models
from app.actor import fetch_actor
from app.database import Session
from tests import factories


def test_fetch_actor(db: Session, respx_mock) -> None:
    # Given a remote actor
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key="pk",
    )
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra.ap_actor))

    # When fetching this actor for the first time
    saved_actor = fetch_actor(db, ra.ap_id)

    # Then it has been fetched and saved in DB
    assert respx.calls.call_count == 1
    assert db.query(models.Actor).one().ap_id == saved_actor.ap_id

    # When fetching it a second time
    actor_from_db = fetch_actor(db, ra.ap_id)

    # Then it's read from the DB
    assert actor_from_db.ap_id == ra.ap_id
    assert db.query(models.Actor).count() == 1
    assert respx.calls.call_count == 1


def test_sqlalchemy_factory(db: Session) -> None:
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key="pk",
    )
    actor_in_db = factories.ActorFactory(
        ap_type=ra.ap_type,
        ap_actor=ra.ap_actor,
        ap_id=ra.ap_id,
    )
    assert actor_in_db.id == db.query(models.Actor).one().id
21
tests/test_admin.py
Normal file
@@ -0,0 +1,21 @@
from fastapi.testclient import TestClient

from app.main import app


def test_admin_endpoints_are_authenticated(client: TestClient):
    routes_tested = []

    for route in app.routes:
        if not route.path.startswith("/admin") or route.path == "/admin/login":
            continue

        for method in route.methods:
            resp = client.request(method, route.path)

            # Admin routes should redirect to the login page
            assert resp.status_code == 302, f"{method} {route.path} is unauthenticated"
            assert resp.headers.get("Location") == "http://testserver/admin/login"
            routes_tested.append((method, route.path))

    assert len(routes_tested) > 0
177
tests/test_httpsig.py
Normal file
@@ -0,0 +1,177 @@
from typing import Any

import fastapi
import httpx
import pytest
import respx
from fastapi.testclient import TestClient

from app import activitypub as ap
from app import httpsig
from app.httpsig import HTTPSigInfo
from app.key import Key
from tests import factories

_test_app = fastapi.FastAPI()


def _httpsig_info_to_dict(httpsig_info: HTTPSigInfo) -> dict[str, Any]:
    return {
        "has_valid_signature": httpsig_info.has_valid_signature,
        "signed_by_ap_actor_id": httpsig_info.signed_by_ap_actor_id,
    }


@_test_app.get("/httpsig_checker")
def get_httpsig_checker(
    httpsig_info: httpsig.HTTPSigInfo = fastapi.Depends(httpsig.httpsig_checker),
):
    return _httpsig_info_to_dict(httpsig_info)


@_test_app.post("/enforce_httpsig")
async def post_enforce_httpsig(
    request: fastapi.Request,
    httpsig_info: httpsig.HTTPSigInfo = fastapi.Depends(httpsig.enforce_httpsig),
):
    await request.json()
    return _httpsig_info_to_dict(httpsig_info)


def test_enforce_httpsig__no_signature(
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    with TestClient(_test_app) as client:
        response = client.post(
            "/enforce_httpsig",
            headers={"Content-Type": ap.AS_CTX},
            json={"enforce_httpsig": True},
        )

    assert response.status_code == 401
    assert response.json()["detail"] == "Invalid HTTP sig"


@pytest.mark.asyncio
async def test_enforce_httpsig__with_valid_signature(
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    # Given a remote actor
    privkey, pubkey = factories.generate_key()
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key=pubkey,
    )
    k = Key(ra.ap_id, f"{ra.ap_id}#main-key")
    k.load(privkey)
    auth = httpsig.HTTPXSigAuth(k)
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra.ap_actor))

    httpsig._get_public_key.cache_clear()

    async with httpx.AsyncClient(app=_test_app, base_url="http://test") as client:
        response = await client.post(
            "/enforce_httpsig",
            headers={"Content-Type": ap.AS_CTX},
            json={"enforce_httpsig": True},
            auth=auth,  # type: ignore
        )
    assert response.status_code == 200

    json_response = response.json()

    assert json_response["has_valid_signature"] is True
    assert json_response["signed_by_ap_actor_id"] == ra.ap_id


def test_httpsig_checker__no_signature(
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    with TestClient(_test_app) as client:
        response = client.get(
            "/httpsig_checker",
            headers={"Accept": ap.AS_CTX},
        )

    assert response.status_code == 200
    json_response = response.json()
    assert json_response["has_valid_signature"] is False
    assert json_response["signed_by_ap_actor_id"] is None


@pytest.mark.asyncio
async def test_httpsig_checker__with_valid_signature(
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    # Given a remote actor
    privkey, pubkey = factories.generate_key()
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key=pubkey,
    )
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra.ap_actor))
    k = Key(ra.ap_id, f"{ra.ap_id}#main-key")
    k.load(privkey)
    auth = httpsig.HTTPXSigAuth(k)

    httpsig._get_public_key.cache_clear()

    async with httpx.AsyncClient(app=_test_app, base_url="http://test") as client:
        response = await client.get(
            "/httpsig_checker",
            headers={"Accept": ap.AS_CTX},
            auth=auth,  # type: ignore
        )

    assert response.status_code == 200
    json_response = response.json()

    assert json_response["has_valid_signature"] is True
    assert json_response["signed_by_ap_actor_id"] == ra.ap_id


@pytest.mark.asyncio
async def test_httpsig_checker__with_invalid_signature(
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    # Given a remote actor
    privkey, pubkey = factories.generate_key()
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key=pubkey,
    )
    k = Key(ra.ap_id, f"{ra.ap_id}#main-key")
    k.load(privkey)
    auth = httpsig.HTTPXSigAuth(k)

    ra2_privkey, ra2_pubkey = factories.generate_key()
    ra2 = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key=ra2_pubkey,
    )
    assert ra.ap_id == ra2.ap_id
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra2.ap_actor))

    httpsig._get_public_key.cache_clear()

    async with httpx.AsyncClient(app=_test_app, base_url="http://test") as client:
        response = await client.get(
            "/httpsig_checker",
            headers={"Accept": ap.AS_CTX},
            auth=auth,  # type: ignore
        )

    assert response.status_code == 200
    json_response = response.json()

    assert json_response["has_valid_signature"] is False
    assert json_response["signed_by_ap_actor_id"] == ra.ap_id
134
tests/test_inbox.py
Normal file
@@ -0,0 +1,134 @@
from uuid import uuid4

import httpx
import respx
from fastapi.testclient import TestClient

from app import activitypub as ap
from app import models
from app.actor import LOCAL_ACTOR
from app.ap_object import RemoteObject
from app.database import Session
from tests import factories
from tests.utils import mock_httpsig_checker


def test_inbox_requires_httpsig(
    client: TestClient,
):
    response = client.post(
        "/inbox",
        headers={"Content-Type": ap.AS_CTX},
        json={},
    )
    assert response.status_code == 401
    assert response.json()["detail"] == "Invalid HTTP sig"


def test_inbox_follow_request(
    db: Session,
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    # Given a remote actor
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key="pk",
    )
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra.ap_actor))

    # When sending a Follow activity
    follow_activity = RemoteObject(
        factories.build_follow_activity(
            from_remote_actor=ra,
            for_remote_actor=LOCAL_ACTOR,
        )
    )
    with mock_httpsig_checker(ra):
        response = client.post(
            "/inbox",
            headers={"Content-Type": ap.AS_CTX},
            json=follow_activity.ap_object,
        )

    # Then the server returns a 204
    assert response.status_code == 204

    # And the actor was saved in DB
    saved_actor = db.query(models.Actor).one()
    assert saved_actor.ap_id == ra.ap_id

    # And the Follow activity was saved in the inbox
    inbox_object = db.query(models.InboxObject).one()
    assert inbox_object.ap_object == follow_activity.ap_object

    # And a follower was internally created
    follower = db.query(models.Follower).one()
    assert follower.ap_actor_id == ra.ap_id
    assert follower.actor_id == saved_actor.id
    assert follower.inbox_object_id == inbox_object.id

    # And an Accept activity was created in the outbox
    outbox_object = db.query(models.OutboxObject).one()
    assert outbox_object.ap_type == "Accept"
    assert outbox_object.activity_object_ap_id == follow_activity.ap_id

    # And an outgoing activity was created to track the Accept activity delivery
    outgoing_activity = db.query(models.OutgoingActivity).one()
    assert outgoing_activity.outbox_object_id == outbox_object.id


def test_inbox_accept_follow_request(
    db: Session,
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    # Given a remote actor
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key="pk",
    )
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra.ap_actor))
    actor_in_db = factories.ActorFactory.from_remote_actor(ra)

    # And a Follow activity in the outbox
    follow_id = uuid4().hex
    follow_from_outbox = RemoteObject(
        factories.build_follow_activity(
            from_remote_actor=LOCAL_ACTOR,
            for_remote_actor=ra,
            outbox_public_id=follow_id,
        )
    )
    outbox_object = factories.OutboxObjectFactory.from_remote_object(
        follow_id, follow_from_outbox
    )

    # When sending an Accept activity
    accept_activity = RemoteObject(
        factories.build_accept_activity(
            from_remote_actor=ra,
            for_remote_object=follow_from_outbox,
        )
    )
    with mock_httpsig_checker(ra):
        response = client.post(
            "/inbox",
            headers={"Content-Type": ap.AS_CTX},
            json=accept_activity.ap_object,
        )

    # Then the server returns a 204
    assert response.status_code == 204

    # And the Accept activity was saved in the inbox
    inbox_activity = db.query(models.InboxObject).one()
    assert inbox_activity.ap_type == "Accept"
    assert inbox_activity.relates_to_outbox_object_id == outbox_object.id
    assert inbox_activity.actor_id == actor_in_db.id

    # And a following entry was created internally
    following = db.query(models.Following).one()
    assert following.ap_actor_id == actor_in_db.ap_id
46
tests/test_outbox.py
Normal file
@@ -0,0 +1,46 @@
import httpx
import respx
from fastapi.testclient import TestClient

from app import models
from app.config import generate_csrf_token
from app.database import Session
from tests import factories
from tests.utils import generate_admin_session_cookies


def test_send_follow_request(
    db: Session,
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    # Given a remote actor
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key="pk",
    )
    respx_mock.get(ra.ap_id).mock(return_value=httpx.Response(200, json=ra.ap_actor))

    response = client.post(
        "/admin/actions/follow",
        data={
            "redirect_url": "http://testserver/",
            "ap_actor_id": ra.ap_id,
            "csrf_token": generate_csrf_token(),
        },
        cookies=generate_admin_session_cookies(),
    )

    # Then the server returns a 302
    assert response.status_code == 302
    assert response.headers.get("Location") == "http://testserver/"

    # And the Follow activity was created in the outbox
    outbox_object = db.query(models.OutboxObject).one()
    assert outbox_object.ap_type == "Follow"
    assert outbox_object.activity_object_ap_id == ra.ap_id

    # And an outgoing activity was queued
    outgoing_activity = db.query(models.OutgoingActivity).one()
    assert outgoing_activity.outbox_object_id == outbox_object.id
180
tests/test_process_outgoing_activities.py
Normal file
@@ -0,0 +1,180 @@
from uuid import uuid4

import httpx
import respx
from fastapi.testclient import TestClient

from app import models
from app.actor import LOCAL_ACTOR
from app.ap_object import RemoteObject
from app.database import Session
from app.process_outgoing_activities import _MAX_RETRIES
from app.process_outgoing_activities import new_outgoing_activity
from app.process_outgoing_activities import process_next_outgoing_activity
from tests import factories


def _setup_outbox_object() -> models.OutboxObject:
    ra = factories.RemoteActorFactory(
        base_url="https://example.com",
        username="toto",
        public_key="pk",
    )

    # And a Follow activity in the outbox
    follow_id = uuid4().hex
    follow_from_outbox = RemoteObject(
        factories.build_follow_activity(
            from_remote_actor=LOCAL_ACTOR,
            for_remote_actor=ra,
            outbox_public_id=follow_id,
        )
    )
    outbox_object = factories.OutboxObjectFactory.from_remote_object(
        follow_id, follow_from_outbox
    )
    return outbox_object


def test_new_outgoing_activity(
    db: Session,
    client: TestClient,
    respx_mock: respx.MockRouter,
) -> None:
    outbox_object = _setup_outbox_object()
    inbox_url = "https://example.com/inbox"

    # When queuing the activity
    outgoing_activity = new_outgoing_activity(db, inbox_url, outbox_object.id)

    assert db.query(models.OutgoingActivity).one() == outgoing_activity
    assert outgoing_activity.outbox_object_id == outbox_object.id
    assert outgoing_activity.recipient == inbox_url


def test_process_next_outgoing_activity__no_next_activity(
    db: Session,
    respx_mock: respx.MockRouter,
) -> None:
    assert process_next_outgoing_activity(db) is False


def test_process_next_outgoing_activity__server_200(
    db: Session,
    respx_mock: respx.MockRouter,
) -> None:
    # And an outgoing activity
    outbox_object = _setup_outbox_object()

    recipient_inbox_url = "https://example.com/users/toto/inbox"
    respx_mock.post(recipient_inbox_url).mock(return_value=httpx.Response(204))

    outgoing_activity = factories.OutgoingActivityFactory(
        recipient=recipient_inbox_url,
        outbox_object_id=outbox_object.id,
    )

    # When processing the next outgoing activity
    # Then it is processed
    assert process_next_outgoing_activity(db) is True

    assert respx_mock.calls.call_count == 1

    outgoing_activity = db.query(models.OutgoingActivity).one()
    assert outgoing_activity.is_sent is True
    assert outgoing_activity.last_status_code == 204
    assert outgoing_activity.error is None
    assert outgoing_activity.is_errored is False


def test_process_next_outgoing_activity__error_500(
    db: Session,
    respx_mock: respx.MockRouter,
) -> None:
    outbox_object = _setup_outbox_object()
    recipient_inbox_url = "https://example.com/inbox"
    respx_mock.post(recipient_inbox_url).mock(
        return_value=httpx.Response(500, text="oops")
    )

    # And an outgoing activity
    outgoing_activity = factories.OutgoingActivityFactory(
        recipient=recipient_inbox_url,
        outbox_object_id=outbox_object.id,
    )

    # When processing the next outgoing activity
    # Then it is processed
    assert process_next_outgoing_activity(db) is True

    assert respx_mock.calls.call_count == 1

    outgoing_activity = db.query(models.OutgoingActivity).one()
    assert outgoing_activity.is_sent is False
    assert outgoing_activity.last_status_code == 500
    assert outgoing_activity.last_response == "oops"
    assert outgoing_activity.is_errored is False
    assert outgoing_activity.tries == 1


def test_process_next_outgoing_activity__connect_error(
    db: Session,
    respx_mock: respx.MockRouter,
) -> None:
    outbox_object = _setup_outbox_object()
    recipient_inbox_url = "https://example.com/inbox"
    respx_mock.post(recipient_inbox_url).mock(side_effect=httpx.ConnectError)

    # And an outgoing activity
    outgoing_activity = factories.OutgoingActivityFactory(
        recipient=recipient_inbox_url,
        outbox_object_id=outbox_object.id,
    )

    # When processing the next outgoing activity
    # Then it is processed
    assert process_next_outgoing_activity(db) is True

    assert respx_mock.calls.call_count == 1

    outgoing_activity = db.query(models.OutgoingActivity).one()
    assert outgoing_activity.is_sent is False
    assert outgoing_activity.error is not None
    assert outgoing_activity.tries == 1


def test_process_next_outgoing_activity__errored(
    db: Session,
    respx_mock: respx.MockRouter,
) -> None:
    outbox_object = _setup_outbox_object()
    recipient_inbox_url = "https://example.com/inbox"
    respx_mock.post(recipient_inbox_url).mock(
        return_value=httpx.Response(500, text="oops")
    )

    # And an outgoing activity
    outgoing_activity = factories.OutgoingActivityFactory(
        recipient=recipient_inbox_url,
        outbox_object_id=outbox_object.id,
        tries=_MAX_RETRIES - 1,
    )

    # When processing the next outgoing activity
    # Then it is processed
    assert process_next_outgoing_activity(db) is True

    assert respx_mock.calls.call_count == 1

    outgoing_activity = db.query(models.OutgoingActivity).one()
    assert outgoing_activity.is_sent is False
    assert outgoing_activity.last_status_code == 500
    assert outgoing_activity.last_response == "oops"
    assert outgoing_activity.is_errored is True

    # And it is skipped from processing
    assert process_next_outgoing_activity(db) is False


# TODO(ts):
# - parse retry after
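The error-path tests above pin down the delivery state machine: each attempt bumps `tries` and records the last status, a 2xx marks the activity `is_sent`, and once `tries` reaches the retry cap the activity is flagged `is_errored` and skipped by the worker. A minimal sketch of that bookkeeping on a plain dict (field names taken from the assertions; the `MAX_RETRIES` value here is an assumption, the real constant lives in `app.process_outgoing_activities`):

```python
MAX_RETRIES = 16  # assumed cap; the real value is app.process_outgoing_activities._MAX_RETRIES


def record_attempt(activity: dict, status_code: int) -> dict:
    """Update delivery-tracking fields after one delivery attempt."""
    activity["tries"] += 1
    activity["last_status_code"] = status_code
    if 200 <= status_code < 300:
        activity["is_sent"] = True
    elif activity["tries"] >= MAX_RETRIES:
        # Too many failures: flag the activity so the worker stops picking it up
        activity["is_errored"] = True
    return activity


activity = {"tries": 0, "is_sent": False, "is_errored": False, "last_status_code": None}
record_attempt(activity, 500)  # first failure: eligible for retry, not errored yet
```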
30
tests/test_public.py
Normal file
@@ -0,0 +1,30 @@
import pytest
from fastapi.testclient import TestClient

from app.database import Session

_ACCEPTED_AP_HEADERS = [
    "application/activity+json",
    "application/activity+json; charset=utf-8",
    "application/ld+json",
    'application/ld+json; profile="https://www.w3.org/ns/activitystreams"',
]


@pytest.mark.anyio
def test_index(db: Session, client: TestClient):
    response = client.get("/")
    assert response.status_code == 200


@pytest.mark.parametrize("accept", _ACCEPTED_AP_HEADERS)
def test__ap_version(client, db, accept: str) -> None:
    response = client.get("/followers", headers={"Accept": accept})
    assert response.status_code == 200
    assert response.headers["content-type"] == "application/activity+json"
    assert response.json()["id"].endswith("/followers")


def test__html(client, db) -> None:
    response = client.get("/followers", headers={"Accept": "text/html"})
    assert response.status_code == 200
29
tests/utils.py
Normal file
@@ -0,0 +1,29 @@
from contextlib import contextmanager

import fastapi

from app import actor
from app import httpsig
from app.config import session_serializer
from app.main import app


@contextmanager
def mock_httpsig_checker(ra: actor.RemoteActor):
    async def httpsig_checker(
        request: fastapi.Request,
    ) -> httpsig.HTTPSigInfo:
        return httpsig.HTTPSigInfo(
            has_valid_signature=True,
            signed_by_ap_actor_id=ra.ap_id,
        )

    app.dependency_overrides[httpsig.httpsig_checker] = httpsig_checker
    try:
        yield
    finally:
        del app.dependency_overrides[httpsig.httpsig_checker]


def generate_admin_session_cookies() -> dict[str, str]:
    return {"session": session_serializer.dumps({"is_logged_in": True})}
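`mock_httpsig_checker` follows FastAPI's dependency-override pattern: install the fake dependency, yield to the test body, and always remove the override in a `finally` block so one test's fake signature checker never leaks into the next. The shape of that pattern can be sketched with a plain dict standing in for `app.dependency_overrides` (the names here are illustrative only):

```python
from contextlib import contextmanager

# Stand-in for app.dependency_overrides (a plain dict keyed by dependency)
overrides: dict = {}


@contextmanager
def override(dependency, replacement):
    """Install an override for the duration of the with-block, then clean up."""
    overrides[dependency] = replacement
    try:
        yield
    finally:
        # Runs even if the test body raises, so overrides never leak
        del overrides[dependency]


with override("httpsig_checker", lambda: "fake"):
    inside = "httpsig_checker" in overrides
after = "httpsig_checker" in overrides
```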