Initial commit on new monorepo
commit 433e92b289

.gitignore (vendored, new file, 51 lines)
@@ -0,0 +1,51 @@
# See http://help.github.com/ignore-files/ for more about ignoring files.

# Python stuff
venv/
__pycache__/

# Compiled output
dist
tmp
out-tsc
bazel-out

# Node
node_modules
npm-debug.log
yarn-error.log

# IDEs and editors
.idea/
.project
.classpath
.c9/
*.launch
.settings/
*.sublime-workspace
*.iml

# Visual Studio Code
.vscode/*
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json
.history/*

# Miscellaneous
.angular
.sass-cache/
connect.lock
coverage
libpeerconnection.log
testem.log
typings
google.json
*.versionsBackup
dbs/
target/

# System files
.DS_Store
Thumbs.db
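A quick way to confirm which of these ignore rules matches a given path is `git check-ignore -v`, which prints the source file, line number, and pattern responsible for the decision. A minimal sketch in a throwaway repository (the sample paths such as `node_modules/react/index.js` are hypothetical, not files from this commit):

```shell
# Create a scratch repo with a small subset of the ignore rules above.
cd "$(mktemp -d)"
git init -q .
printf 'node_modules\nvenv/\n!.vscode/settings.json\n' > .gitignore

# -v reports the matching source file, line number, and pattern for each path.
git check-ignore -v node_modules/react/index.js
git check-ignore -v venv/bin/python

# Negated patterns (!...) re-include a path; check-ignore reports them too.
git check-ignore -v .vscode/settings.json || true
```

Note that `check-ignore` works on paths that do not exist yet, so it can be used to vet a `.gitignore` before any files are committed.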
LICENSE (new file, 674 lines)
@@ -0,0 +1,674 @@
                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

  The GNU General Public License is a free, copyleft license for
software and other kinds of works.

  The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.

  When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

  To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.

  For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

  Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

  For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

  Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.

  Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.

  The precise terms and conditions for copying, distribution and
modification follow.

                       TERMS AND CONDITIONS

  0. Definitions.

  "This License" refers to version 3 of the GNU General Public License.

  "Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

  "The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

  To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

  A "covered work" means either the unmodified Program or a work based
on the Program.

  To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

  To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

  An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

  1. Source Code.

  The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

  A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

  The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

  The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

  The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

  The Corresponding Source for a work in source code form is that
same work.

  2. Basic Permissions.

  All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

  You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

  Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

  3. Protecting Users' Legal Rights From Anti-Circumvention Law.

  No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

  When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

  4. Conveying Verbatim Copies.

  You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

  You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

  5. Conveying Modified Source Versions.

  You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

    a) The work must carry prominent notices stating that you modified
    it, and giving a relevant date.

    b) The work must carry prominent notices stating that it is
    released under this License and any conditions added under section
    7. This requirement modifies the requirement in section 4 to
    "keep intact all notices".

    c) You must license the entire work, as a whole, under this
    License to anyone who comes into possession of a copy. This
    License will therefore apply, along with any applicable section 7
    additional terms, to the whole of the work, and all its parts,
    regardless of how they are packaged. This License gives no
    permission to license the work in any other way, but it does not
    invalidate such permission if you have separately received it.

    d) If the work has interactive user interfaces, each must display
    Appropriate Legal Notices; however, if the Program has interactive
    interfaces that do not display Appropriate Legal Notices, your
    work need not make them do so.

  A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

  6. Conveying Non-Source Forms.

  You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

    a) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by the
    Corresponding Source fixed on a durable physical medium
    customarily used for software interchange.

    b) Convey the object code in, or embodied in, a physical product
    (including a physical distribution medium), accompanied by a
    written offer, valid for at least three years and valid for as
    long as you offer spare parts or customer support for that product
    model, to give anyone who possesses the object code either (1) a
    copy of the Corresponding Source for all the software in the
    product that is covered by this License, on a durable physical
    medium customarily used for software interchange, for a price no
    more than your reasonable cost of physically performing this
    conveying of source, or (2) access to copy the
    Corresponding Source from a network server at no charge.

    c) Convey individual copies of the object code with a copy of the
    written offer to provide the Corresponding Source. This
    alternative is allowed only occasionally and noncommercially, and
    only if you received the object code with such an offer, in accord
    with subsection 6b.

    d) Convey the object code by offering access from a designated
    place (gratis or for a charge), and offer equivalent access to the
    Corresponding Source in the same way through the same place at no
    further charge. You need not require recipients to copy the
    Corresponding Source along with the object code. If the place to
    copy the object code is a network server, the Corresponding Source
    may be on a different server (operated by you or a third party)
    that supports equivalent copying facilities, provided you maintain
    clear directions next to the object code saying where to find the
    Corresponding Source. Regardless of what server hosts the
    Corresponding Source, you remain obligated to ensure that it is
    available for as long as needed to satisfy these requirements.

    e) Convey the object code using peer-to-peer transmission, provided
    you inform other peers where the object code and Corresponding
    Source of the work are being offered to the general public at no
    charge under subsection 6d.

  A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

  A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

  "Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

  If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

  The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

  Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

  7. Additional Terms.

  "Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

  When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

  Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

    a) Disclaiming warranty or limiting liability differently from the
    terms of sections 15 and 16 of this License; or

    b) Requiring preservation of specified reasonable legal notices or
    author attributions in that material or in the Appropriate Legal
    Notices displayed by works containing it; or

    c) Prohibiting misrepresentation of the origin of that material, or
    requiring that modified versions of such material be marked in
    reasonable ways as different from the original version; or

    d) Limiting the use for publicity purposes of names of licensors or
    authors of the material; or

    e) Declining to grant rights under trademark law for use of some
    trade names, trademarks, or service marks; or

    f) Requiring indemnification of licensors and authors of that
    material by anyone who conveys the material (or modified versions of
    it) with contractual assumptions of liability to the recipient, for
    any liability that these contractual assumptions directly impose on
    those licensors and authors.

  All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

  If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

  Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

  8. Termination.

  You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

  However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

  Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

  Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

  9. Acceptance Not Required for Having Copies.

  You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

  10. Automatic Licensing of Downstream Recipients.

  Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

  An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

  You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

  11. Patents.

  A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

  A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

  Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

  In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

  If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

  If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

  A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

  Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

  12. No Surrender of Others' Freedom.

  If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

  13. Use with the GNU Affero General Public License.

  Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

  14. Revised Versions of this License.

  The Free Software Foundation may publish revised and/or new versions of
|
||||
the GNU General Public License from time to time. Such new versions will
|
||||
be similar in spirit to the present version, but may differ in detail to
|
||||
address new problems or concerns.
|
||||
|
||||
Each version is given a distinguishing version number. If the
|
||||
Program specifies that a certain numbered version of the GNU General
|
||||
Public License "or any later version" applies to it, you have the
|
||||
option of following the terms and conditions either of that numbered
|
||||
version or of any later version published by the Free Software
|
||||
Foundation. If the Program does not specify a version number of the
|
||||
GNU General Public License, you may choose any version ever published
|
||||
by the Free Software Foundation.
|
||||
|
||||
If the Program specifies that a proxy can decide which future
|
||||
versions of the GNU General Public License can be used, that proxy's
|
||||
public statement of acceptance of a version permanently authorizes you
|
||||
to choose that version for the Program.
|
||||
|
||||
Later license versions may give you additional or different
|
||||
permissions. However, no additional obligations are imposed on any
|
||||
author or copyright holder as a result of your choosing to follow a
|
||||
later version.
|
||||
|
||||
15. Disclaimer of Warranty.
|
||||
|
||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
||||
|
||||
16. Limitation of Liability.
|
||||
|
||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
||||
SUCH DAMAGES.
|
||||
|
||||
17. Interpretation of Sections 15 and 16.
|
||||
|
||||
If the disclaimer of warranty and limitation of liability provided
|
||||
above cannot be given local legal effect according to their terms,
|
||||
reviewing courts shall apply local law that most closely approximates
|
||||
an absolute waiver of all civil liability in connection with the
|
||||
Program, unless a warranty or assumption of liability accompanies a
|
||||
copy of the Program in return for a fee.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest
|
||||
to attach them to the start of each source file to most effectively
|
||||
state the exclusion of warranty; and each file should have at least
|
||||
the "copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) <year> <name of author>
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
<program> Copyright (C) <year> <name of author>
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
||||
parts of the General Public License. Of course, your program's commands
|
||||
might be different; for a GUI interface, you would use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or school,
|
||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
||||
For more information on this, and how to apply and follow the GNU GPL, see
|
||||
<https://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your program
|
||||
into proprietary programs. If your program is a subroutine library, you
|
||||
may consider it more useful to permit linking proprietary applications with
|
||||
the library. If this is what you want to do, use the GNU Lesser General
|
||||
Public License instead of this License. But first, please read
|
||||
<https://www.gnu.org/licenses/why-not-lgpl.html>.
|
||||
17
deployment/.env
Normal file
@ -0,0 +1,17 @@
; Email for Let's Encrypt SSL certificate expiration notifications
EMAIL=

; Postgres database connection
DB_HOST=
DB_USER=
DB_PASS=
DB_NAME=

; Discord integration
WARNINGS_WEBHOOK_URL=
SCORES_WEBHOOK_URL=

; osu!api
OSU_API_KEY=
OSU_CLIENT_ID=
OSU_CLIENT_SECRET=
107
deployment/docker-compose.yml
Normal file
@ -0,0 +1,107 @@
version: '3'

services:

  traefik:
    image: "traefik:v2.9"
    container_name: "traefik"
    restart: always
    environment:
      EMAIL: ${EMAIL}
    command:
      - "--log.level=ERROR"
      - "--api.insecure=false"
      - "--providers.docker=true"
      - "--providers.docker.exposedbydefault=false"
      - "--entrypoints.web.address=:80"
      - "--entrypoints.web.http.redirections.entrypoint.to=websecure"
      - "--entrypoints.web.http.redirections.entrypoint.scheme=https"
      - "--entrypoints.websecure.address=:443"
      - "--certificatesresolvers.myresolver.acme.tlschallenge=true"
      - "--certificatesresolvers.myresolver.acme.email=${EMAIL}"
      - "--certificatesresolvers.myresolver.acme.storage=/letsencrypt/acme.json"
    ports:
      - "443:443"
      - "80:80"
    volumes:
      - "./letsencrypt:/letsencrypt"
      - "/var/run/docker.sock:/var/run/docker.sock:ro"

  postgres:
    image: groonga/pgroonga:3.1.6-alpine-15
    container_name: postgres
    restart: always
    environment:
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASS}
    volumes:
      - postgres-data:/var/lib/postgresql/data
    command: >
      -c work_mem=4MB
    shm_size: '128mb'

  redis:
    image: redis:alpine
    container_name: redis
    restart: always

  nise-backend:
    image: git.gengo.tech/gengotech/nise-backend:latest
    container_name: nise-backend
    environment:
      SPRING_PROFILES_ACTIVE: postgres,discord,updater
      # App configuration
      OLD_SCORES_PAGE_SIZE: 5000
      # Origin
      ORIGIN: "https://nise.moe"
      # Postgres
      POSTGRES_HOST: ${DB_HOST}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASS: ${DB_PASS}
      POSTGRES_DB: ${DB_NAME}
      # redis
      REDIS_DB: 4
      # Discord
      WEBHOOK_URL: ${WARNINGS_WEBHOOK_URL}
      SCORES_WEBHOOK_URL: ${SCORES_WEBHOOK_URL}
      # osu!api
      OSU_API_KEY: ${OSU_API_KEY}
      OSU_CLIENT_ID: ${OSU_CLIENT_ID}
      OSU_CLIENT_SECRET: ${OSU_CLIENT_SECRET}
      # Internal API
      CIRCLEGUARD_API_URL: http://nise-circleguard:5000
    restart: always
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.nise.rule=Host(`nise.moe`) && PathPrefix(`/api/`)"
      - "traefik.http.routers.nise.middlewares=nise-stripprefix"
      - "traefik.http.middlewares.nise-stripprefix.stripprefix.prefixes=/api/"
      - "traefik.http.services.nise.loadbalancer.server.port=8080"
      - "traefik.http.routers.nise.entrypoints=websecure"
      - "traefik.http.routers.nise.tls.certresolver=myresolver"
    depends_on:
      - postgres
      - redis

  nise-circleguard:
    image: git.gengo.tech/nuff/nise-circleguard:latest
    container_name: nise-circleguard
    environment:
      OSU_API_KEY: ${OSU_API_KEY}
    restart: always
    volumes:
      - ./nise-data:/app/dbs

  nise-frontend:
    image: git.gengo.tech/gengotech/nise-frontend:latest
    container_name: nise-frontend
    restart: always
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.nise-frontend.rule=Host(`nise.moe`) && PathPrefix(`/`)"
      - "traefik.http.services.nise-frontend.loadbalancer.server.port=80"
      - "traefik.http.routers.nise-frontend.entrypoints=websecure"
      - "traefik.http.routers.nise-frontend.tls.certresolver=myresolver"

volumes:
  postgres-data:
BIN
keisatsu-chan.png
Executable file
Binary file not shown.
After Width: | Height: | Size: 76 KiB
108
konata/pom.xml
Normal file
@ -0,0 +1,108 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.nisemoe</groupId>
    <artifactId>konata</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <java.version>21</java.version>
        <kotlin.version>1.9.22</kotlin.version>
    </properties>

    <build>
        <sourceDirectory>src/main/kotlin</sourceDirectory>
        <testSourceDirectory>src/test/kotlin</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>org.jetbrains.kotlin</groupId>
                <artifactId>kotlin-maven-plugin</artifactId>
                <version>${kotlin.version}</version>
                <executions>
                    <execution>
                        <id>compile</id>
                        <phase>compile</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <phase>test-compile</phase>
                        <goals>
                            <goal>test-compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.6.0</version>
                <configuration>
                    <mainClass>MainKt</mainClass>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <!-- For LZMA decompression of replay data -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-compress</artifactId>
            <version>1.25.0</version>
        </dependency>
        <dependency>
            <groupId>org.tukaani</groupId>
            <artifactId>xz</artifactId>
            <version>1.9</version>
        </dependency>

        <!-- Vector computations -->
        <dependency>
            <groupId>org.jetbrains.bio</groupId>
            <artifactId>viktor</artifactId>
            <version>1.2.0</version>
        </dependency>

        <!-- Math computations -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-math3</artifactId>
            <version>3.6.1</version>
        </dependency>

        <!-- Coroutines -->
        <dependency>
            <groupId>org.jetbrains.kotlinx</groupId>
            <artifactId>kotlinx-coroutines-core</artifactId>
            <version>1.7.3</version>
        </dependency>

        <dependency>
            <groupId>org.jetbrains.kotlin</groupId>
            <artifactId>kotlin-test-junit5</artifactId>
            <version>${kotlin.version}</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <version>5.10.0</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.jetbrains.kotlin</groupId>
            <artifactId>kotlin-stdlib</artifactId>
            <version>${kotlin.version}</version>
        </dependency>
    </dependencies>

</project>
111
konata/readme.md
Normal file
@ -0,0 +1,111 @@
# konata

> osu! utility lib in kotlin for fast replay comparison with multithreading support

This module has the specific purpose of **high-throughput** replay comparison, and only works with replay data as supplied by the osu!api; it does not work with .osr files.

[circleguard](https://github.com/circleguard/circleguard) is a better tool if you are looking for a more complete solution, as it has a GUI and supports .osr files.

This module was built with a narrow task in mind, and I do not have plans to implement more features (especially if circleguard already covers them).

# Usage

### Replay data class

`Replay` is the main data class you'll be throwing around. The only required field is the replay data (verbatim as fetched by the osu!api) in string format.

You can also pass additional parameters:

| parameter | type | required?                    | notes                                                                                                              |
|-----------|------|------------------------------|--------------------------------------------------------------------------------------------------------------------|
| id        | Long | not for pairs, yes for sets* | Used to find the replay in the output. It does NOT have to match the osu!api; it can be any identifier you'd like. |
| mods      | Int  | no (defaults to NoMod)       | Exact value as fetched by the osu!api; it's used to flip the replay y-axis when HR is enabled.                     |

\*You must set the id when using the replay in a set comparison, as it is the identifier that will allow you to match the input to the results.

Example:

```kotlin
// Simplest replay
val replay: Replay = Replay(replayString)

// A NoMod replay with id 1
val replay: Replay = Replay(replayString, id = 1, mods = 0)

// A HDHR (24) replay with id 2
val replay: Replay = Replay(replayString, id = 2, mods = 24)
```
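Since `mods` is the raw osu!api bitmask, the HR-based y-flip mentioned above comes down to a single bit test: HR occupies bit 4 (value 16) of the bitmask. A minimal sketch of that check, mirroring the `hasHR()` helper this module uses internally:

```kotlin
// HR is bit 4 (value 16) of the osu! mods bitmask,
// so HDHR (24 = 8 + 16) has it set while HD alone (8) does not.
fun hasHR(mods: Int): Boolean = (mods and (1 shl 4)) != 0

fun main() {
    println(hasHR(24)) // true
    println(hasHR(8))  // false
}
```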
### Replay pairs (2 replays)

The replay strings must be exactly as provided by the osu!api replay endpoint.

The following code calculates the similarity ratio and correlation ratio between two replays, without specifying any mods.

```kotlin
// Compare using objects
val replay1: Replay = Replay(replay1String)
val replay2: Replay = Replay(replay2String)

val result: ReplayPairComparison = compareReplayPair(replay1, replay2)
println(result.similarity) // 20.365197244184895
println(result.correlation) // 0.9770151700235653

// You can also pass the replay data directly as strings
val result: ReplayPairComparison = compareReplayPair(replay1String, replay2String)
println(result.similarity) // 20.365197244184895
println(result.correlation) // 0.9770151700235653
```

### Replay sets (n replays)

If you pass a list of replays, optimizations such as multi-threading are applied, which can speed up the calculations.

When comparing sets, you *must* set the replay id (it does not have to match the osu! replay id), as it is the identifier that will allow you to match the input to the results.

```kotlin
// Compare using objects
val replays: Array<Replay> = arrayOf(
    Replay("...", id = 1),
    Replay("...", id = 2)
)

val result: List<ReplaySetComparison> = compareReplaySet(replays)
println(result[0].replay1Id) // 1
println(result[0].replay2Id) // 2
println(result[0].similarity) // 155.20954003316618
println(result[0].correlation) // 0.9859198745055805
```
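A set comparison runs every unordered pair, so n replays produce n·(n−1)/2 pairwise comparisons. A minimal sketch of that pairing, mirroring the `combinations()` helper this module uses internally:

```kotlin
// Every unordered pair (i, j) with i < j: n elements -> n * (n - 1) / 2 pairs.
fun <T> Array<T>.combinations(): Sequence<Pair<T, T>> = sequence {
    for (i in indices)
        for (j in i + 1 until size)
            yield(this@combinations[i] to this@combinations[j])
}

fun main() {
    val ids = arrayOf(1L, 2L, 3L)
    println(ids.combinations().toList()) // [(1, 2), (1, 3), (2, 3)]
}
```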
By default, `compareReplaySet` uses as many threads as there are cores on your system.
You can change this behaviour by passing the number of threads to use:

```kotlin
compareReplaySet(replays, numThreads = 4)
```

# Benchmarks

### Performance

On my development machine (5900X), I obtained the following benchmarks.

I processed 10 batches of 100 replays each. The min/max/avg times refer to single batches.

|             | version     | min  | max  | avg  | total | pairs/second |
|-------------|-------------|------|------|------|-------|--------------|
|             | v20240211   | 3.1s | 4.2s | 3.3s | 32.7s | 1501/s       |
|             | v20240211v2 | 2.5s | 3.7s | 2.7s | 26.7s | 1843/s       |
| **current** | v20240211v3 | 1.1s | 2.1s | 1.3s | 13.0s | 3789/s       |

### Accuracy (compared to Circleguard)

> As of the latest version, konata and circleguard give the same results, with a negligible margin of error.

After selecting a random dataset of ~50,000 osu!std replays for different beatmaps, I compared the results from konata to circleguard, using the latter as the ground truth.

| metric        | avg. delta | std. dev. | median    | min       | max      |
|---------------|------------|-----------|-----------|-----------|----------|
| `SIMILARITY`  | 0          | 0.000033  | 0         | -0.005373 | 0.007381 |
| `CORRELATION` | -0.000643  | 0.001342  | -0.000433 | -0.041833 | 0.026300 |
115
konata/src/main/kotlin/com/nisemoe/konata/CompareReplayPair.kt
Normal file
@ -0,0 +1,115 @@
package com.nisemoe.konata

import com.nisemoe.konata.algorithms.calculateCorrelation
import com.nisemoe.konata.algorithms.calculateDistance
import org.jetbrains.bio.viktor.F64Array


fun compareReplayPair(replay1: String, replay2: String): ReplayPairComparison {
    return compareReplayPair(Replay(replay1), Replay(replay2))
}

fun isValidCoordinate(x: Double, y: Double): Boolean =
    x in 0.0..512.0 && y in 0.0..384.0

/**
 * This function creates a local vector for each replay, removing invalid coordinates and flipping the Y axis if necessary.
 * - We use copies of the original vectors to avoid modifying the original data and maintain integrity.
 * - The y-axis is flipped if the replay has HR enabled and the other doesn't, or vice versa.
 */
fun createLocalVectorPair(
    originalVectorA: F64Array,
    flipVectorA: Boolean,
    originalVectorB: F64Array,
    flipVectorB: Boolean
): Pair<F64Array, F64Array> {
    require(originalVectorA.shape[0] == originalVectorB.shape[0]) { "Vectors must have the same shape." }

    val size = originalVectorA.shape[0]
    val validIndexes = IntArray(size)
    var validCount = 0

    for (i in 0 until size) {
        val xA = originalVectorA[i, 0]
        val yA = originalVectorA[i, 1]

        val xB = originalVectorB[i, 0]
        val yB = originalVectorB[i, 1]

        if (isValidCoordinate(xA, yA) && isValidCoordinate(xB, yB)) {
            validIndexes[validCount++] = i
        }
    }

    val vectorAv2 = F64Array(validCount, 2)
    val vectorBv2 = F64Array(validCount, 2)

    for (i in 0 until validCount) {
        val index = validIndexes[i]

        vectorAv2[i, 0] = originalVectorA[index, 0]
        vectorAv2[i, 1] = if (flipVectorA) 384.0 - originalVectorA[index, 1] else originalVectorA[index, 1]

        vectorBv2[i, 0] = originalVectorB[index, 0]
        vectorBv2[i, 1] = if (flipVectorB) 384.0 - originalVectorB[index, 1] else originalVectorB[index, 1]
    }

    return Pair(vectorAv2, vectorBv2)
}

fun compareReplayPair(replay1: Replay, replay2: Replay): ReplayPairComparison {
    val (longer, shorter) = arrangeReplaysByLength(replay1, replay2)
    val interpolatedShorterData = linearInterpolate(shorter, longer)

    val (localVectorA, localVectorB) = createLocalVectorPair(
        interpolatedShorterData,
        shorter.hasHR() && !longer.hasHR(),
        longer.vector,
        !shorter.hasHR() && longer.hasHR()
    )

    require(localVectorA.shape[0] == localVectorB.shape[0]) { "Datasets must have the same size." }
    return ReplayPairComparison(
        similarity = calculateDistance(localVectorA, localVectorB),
        correlation = calculateCorrelation(localVectorA, localVectorB)
    )
}

private fun arrangeReplaysByLength(replay1: Replay, replay2: Replay): Pair<Replay, Replay> =
    if (replay1.vector.shape[0] > replay2.vector.shape[0]) replay1 to replay2 else replay2 to replay1

/**
 * Performs linear interpolation between two vectors of different sizes.
 * We assume that vectorA is the smaller one, and vectorB is the larger one.
 *
 * The (x,y) data in vectorA will be "stretched" to match the time points in vectorB.
 *
 * The length of the returned vector will be the same as vectorB.
 *
 * @param replayA The replay with fewer elements, used as the base for interpolation.
 * @param replayB The replay with more elements, where interpolation targets are found.
 * @return A vector with the same shape as vectorB, with the interpolated data.
 */
private fun linearInterpolate(replayA: Replay, replayB: Replay): F64Array {
    val returnVector = F64Array(replayB.vector.shape[0], 2)
    val lIndex = replayA.axis.lastIndex
    var maxLower = 0

    for (indexB in 0..<replayB.vector.shape[0]) {
        val xi = replayB.vector[indexB, 2]
        val index = replayA.axis.binarySearch(xi, fromIndex = maxLower)
        val insertionPoint = if (index >= 0) index else -(index + 1)
        val lower = if (index >= 0) index else insertionPoint.coerceIn(1, lIndex + 1) - 1
        val upper = if (index >= 0) index else insertionPoint.coerceIn(0, lIndex)
        maxLower = lower
        if (lower == upper) {
            returnVector[indexB, 0] = replayA.vector[lower, 0]
            returnVector[indexB, 1] = replayA.vector[lower, 1]
        } else {
            val t = (xi - replayA.vector[lower, 2]) / (replayA.vector[upper, 2] - replayA.vector[lower, 2])
            returnVector[indexB, 0] = replayA.vector[lower, 0] + t * (replayA.vector[upper, 0] - replayA.vector[lower, 0])
            returnVector[indexB, 1] = replayA.vector[lower, 1] + t * (replayA.vector[upper, 1] - replayA.vector[lower, 1])
        }
    }
    return returnVector
}
@ -0,0 +1,58 @@

package com.nisemoe.konata

import kotlinx.coroutines.asCoroutineDispatcher
import kotlinx.coroutines.coroutineScope
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking
import java.util.Collections
import java.util.concurrent.Executors

fun <T> Array<T>.combinations(): Sequence<Pair<T, T>> = sequence {
    for (i in indices) {
        for (j in i + 1 until this@combinations.size) {
            yield(this@combinations[i] to this@combinations[j])
        }
    }
}

fun removeDuplicateReplays(replaySet: Array<Replay>): Array<Replay> {
    val uniqueReplays = replaySet.toList().distinctBy { it.id }
    return uniqueReplays.toTypedArray()
}

fun compareReplaySet(
    replaySet: Array<Replay>,
    numThreads: Int = Runtime.getRuntime().availableProcessors(),
): List<ReplaySetComparison> = runBlocking {
    if (replaySet.any { it.id == null })
        throw IllegalArgumentException("All replays must have an ID when calling compareReplaySet!")

    val replaySetUnique = removeDuplicateReplays(replaySet)

    val dispatcher = Executors
        .newFixedThreadPool(numThreads)
        .asCoroutineDispatcher()

    // Synchronized list: results are appended concurrently from multiple worker threads.
    val result = Collections.synchronizedList(mutableListOf<ReplaySetComparison>())

    coroutineScope {
        replaySetUnique.combinations().forEach { (replay1, replay2) ->
            launch(dispatcher) {
                val comparisonResult = compareReplayPair(replay1, replay2)
                result.add(
                    ReplaySetComparison(
                        replay1Id = replay1.id!!,
                        replay1Mods = replay1.mods,

                        replay2Id = replay2.id!!,
                        replay2Mods = replay2.mods,

                        similarity = comparisonResult.similarity,
                        correlation = comparisonResult.correlation
                    )
                )
            }
        }
    }

    dispatcher.close()
    return@runBlocking result
}
30
konata/src/main/kotlin/com/nisemoe/konata/Replay.kt
Normal file
@ -0,0 +1,30 @@
package com.nisemoe.konata

import com.nisemoe.konata.tools.getEvents
import com.nisemoe.konata.tools.processReplayData
import org.jetbrains.bio.viktor.F64Array

class Replay(string: String, id: Long? = null, mods: Int = 0) {

    private var events: ArrayList<ReplayEvent>

    var vector: F64Array
    var axis: DoubleArray
    var id: Long? = null
    var mods: Int = 0

    init {
        this.id = id
        this.mods = mods

        this.events = getEvents(string)
        this.vector = processReplayData(this.events)
        this.axis = this.vector.view(2, axis = 1).toDoubleArray()
    }

    fun hasHR(): Boolean {
        return (mods and (1 shl 4)) == 1 shl 4
    }

}
23
konata/src/main/kotlin/com/nisemoe/konata/ReplayDto.kt
Normal file
@ -0,0 +1,23 @@
package com.nisemoe.konata

data class ReplayPairComparison(
    val similarity: Double,
    val correlation: Double
)

data class ReplaySetComparison(
    val replay1Id: Long,
    val replay1Mods: Int,

    val replay2Id: Long,
    val replay2Mods: Int,

    val similarity: Double,
    val correlation: Double
)

data class ReplayEvent(
    val timeDelta: Int,
    val x: Double,
    val y: Double
)
@ -0,0 +1,85 @@
|
||||
package com.nisemoe.konata.algorithms
|
import kotlinx.coroutines.*
import org.apache.commons.math3.stat.descriptive.rank.Median
import org.apache.commons.math3.transform.DftNormalization
import org.apache.commons.math3.transform.FastFourierTransformer.transformInPlace
import org.apache.commons.math3.transform.TransformType
import org.jetbrains.bio.viktor.F64Array
import kotlin.math.ceil
import kotlin.math.log2
import kotlin.math.pow

/**
 * Calculates the discrete linear cross-correlation of two complex-valued arrays.
 * Parameters:
 * - arrayA: The first input array, shaped (n, 2) as (real, imaginary) columns.
 * - arrayB: The second input array, shaped (m, 2) as (real, imaginary) columns.
 * Returns:
 * - Discrete linear cross-correlation of arrayA and arrayB.
 */
fun getCrossCorrelation(arrayA: F64Array, arrayB: F64Array): F64Array {
    val n = arrayA.shape[0] + arrayB.shape[0] - 1
    // Round up to the next power of two so the FFT can be applied.
    val size = 2.0.pow(ceil(log2(n.toDouble()))).toInt()

    val transformArrayA = Array(2) { DoubleArray(size) }
    val transformArrayB = Array(2) { DoubleArray(size) }

    for (i in 0 until size) {
        if (i < arrayA.shape[0]) {
            transformArrayA[0][i] = arrayA[i, 0]
            transformArrayA[1][i] = arrayA[i, 1]
        }
        if (i < arrayB.shape[0]) {
            // Reverse and conjugate arrayB: cross-correlation is convolution
            // with the time-reversed, conjugated signal.
            transformArrayB[0][i] = arrayB[arrayB.shape[0] - i - 1, 0]
            transformArrayB[1][i] = -arrayB[arrayB.shape[0] - i - 1, 1]
        }
    }

    transformInPlace(transformArrayA, DftNormalization.STANDARD, TransformType.FORWARD)
    transformInPlace(transformArrayB, DftNormalization.STANDARD, TransformType.FORWARD)

    // Complex multiplication in the frequency domain.
    for (i in 0 until size) {
        val a = transformArrayA[0][i]
        val b = transformArrayA[1][i]
        val c = transformArrayB[0][i]
        val d = transformArrayB[1][i]

        transformArrayA[0][i] = a * c - b * d
        transformArrayA[1][i] = a * d + b * c
    }

    transformInPlace(transformArrayA, DftNormalization.STANDARD, TransformType.INVERSE)

    return F64Array(n) { index ->
        if (index < size) transformArrayA[0][index] else 0.0
    }
}

fun calculateCorrelation(vectorA: F64Array, vectorB: F64Array): Double = runBlocking {
    val correlations = mutableListOf<Deferred<Double>>()
    val chunkSize = vectorA.shape[0] / 5

    // Split the input into five chunks and correlate them in parallel.
    for (i in 0 until 5) {
        val startIdx = i * chunkSize
        val endIdx = if (i < 4) startIdx + chunkSize else vectorA.shape[0]

        correlations.add(async(Dispatchers.Default) {
            val vectorChunkA = vectorA.slice(startIdx, endIdx).copy()
            val vectorChunkB = vectorB.slice(startIdx, endIdx).copy()

            vectorChunkA -= vectorChunkA.mean()
            vectorChunkB -= vectorChunkB.mean()

            val norm = vectorChunkA.flatten().sd() * vectorChunkB.flatten().sd() * (vectorChunkA.shape[0] * 2)
            val correlation = getCrossCorrelation(
                vectorChunkA,
                vectorChunkB
            )
            correlation /= norm
            correlation.max()
        })
    }

    // The median of the per-chunk peak correlations is the final score.
    Median().evaluate(correlations.awaitAll().toDoubleArray())
}
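For reference, the same quantity can be computed directly without the FFT. This is a dependency-free sketch I wrote for illustration (function name and real-valued simplification are mine, not part of the commit): the FFT path above produces the same result for complex inputs in O(n log n) instead of O(n·m).

```kotlin
// Direct O(n * m) cross-correlation of two real sequences: convolve `a`
// with the time-reversed `b`, yielding a.size + b.size - 1 lags.
fun crossCorrelateNaive(a: DoubleArray, b: DoubleArray): DoubleArray {
    val bRev = DoubleArray(b.size) { b[b.size - 1 - it] }
    val out = DoubleArray(a.size + b.size - 1)
    for (k in out.indices) {
        for (i in a.indices) {
            val j = k - i
            if (j in bRev.indices) out[k] += a[i] * bRev[j]
        }
    }
    return out
}
```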
@@ -0,0 +1,29 @@
package com.nisemoe.konata.algorithms

import org.jetbrains.bio.viktor.F64Array
import org.jetbrains.bio.viktor._I

/**
 * Calculates the mean pointwise Euclidean distance between two (n, 2) arrays.
 * The arrays *have* to be the same shape.
 *
 * The implementation can look odd because the F64Array library (viktor) doesn't
 * support operations such as .pow(), so the square root is computed as
 * exp(0.5 * ln(x)) using the element-wise log and exp that viktor does provide.
 */
fun calculateDistance(vectorA: F64Array, vectorB: F64Array): Double {
    require(vectorA.shape[0] == vectorB.shape[0]) { "Vectors must have the same shape." }

    // Element-wise squared differences.
    val difference = vectorA - vectorB
    difference.timesAssign(difference)

    val intermediateResults = F64Array(vectorB.shape[0])

    // Per-point sum of squared coordinate differences.
    val sumOfSquaresX = difference.V[_I, 0]
    val sumOfSquaresY = difference.V[_I, 1]
    intermediateResults += (sumOfSquaresX + sumOfSquaresY)

    // sqrt(x) == exp(0.5 * ln(x))
    intermediateResults.logInPlace()
    intermediateResults *= 0.5
    intermediateResults.expInPlace()

    return intermediateResults.mean()
}
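The log/exp pipeline in calculateDistance leans on the identity sqrt(x) = exp(0.5 · ln(x)) for x > 0. A minimal scalar sketch of that identity (the function name is illustrative, not part of the commit):

```kotlin
import kotlin.math.exp
import kotlin.math.ln

// sqrt(x) == exp(0.5 * ln(x)) for x > 0: this is what the element-wise
// logInPlace(); *= 0.5; expInPlace() sequence computes above.
fun sqrtViaLogExp(x: Double): Double = exp(0.5 * ln(x))
```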
@@ -0,0 +1,38 @@
package com.nisemoe.konata.tools

import com.nisemoe.konata.ReplayEvent
import org.apache.commons.compress.compressors.lzma.LZMACompressorInputStream
import java.util.*
import kotlin.collections.ArrayList

fun getEvents(replayString: String): ArrayList<ReplayEvent> {
    val decompressedData = decompressData(replayString)
    val replayDataStr = String(decompressedData, Charsets.UTF_8).trimEnd(',')
    return processEvents(replayDataStr)
}

private fun decompressData(replayString: String): ByteArray =
    Base64.getDecoder().decode(replayString).inputStream().use { byteStream ->
        LZMACompressorInputStream(byteStream).readBytes()
    }

internal fun processEvents(replayDataStr: String): ArrayList<ReplayEvent> {
    val eventStrings = replayDataStr.split(",")
    val playData = ArrayList<ReplayEvent>(eventStrings.size)
    eventStrings.forEachIndexed { index, eventStr ->
        val event = createReplayEvent(index, eventStr.split('|'), eventStrings.size)
        event?.let { playData.add(it) }
    }
    return playData
}

private fun createReplayEvent(index: Int, event: List<String>, totalEvents: Int): ReplayEvent? {
    val timeDelta = event[0].toInt()
    val x = event[1].toDouble()
    val y = event[2].toDouble()

    // Skip the sentinel last frame (time delta -12345).
    if (timeDelta == -12345 && index == totalEvents - 1) return null
    // if (index < 2 && x == 256.0 && y == -500.0) return null

    return ReplayEvent(timeDelta, x, y)
}
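To make the wire format concrete: processEvents consumes a comma-separated string of pipe-separated `timeDelta|x|y` frames. A standalone sketch of that parsing (the sample values and the local `Event` type are invented for illustration; the real code uses `ReplayEvent`):

```kotlin
// Events are comma-separated; fields within an event are pipe-separated
// as timeDelta|x|y. Trailing commas are tolerated, mirroring getEvents.
data class Event(val timeDelta: Int, val x: Double, val y: Double)

fun parseEvents(data: String): List<Event> =
    data.trimEnd(',').split(",").map { s ->
        val parts = s.split('|')
        Event(parts[0].toInt(), parts[1].toDouble(), parts[2].toDouble())
    }
```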
@@ -0,0 +1,62 @@
package com.nisemoe.konata.tools

import com.nisemoe.konata.ReplayEvent
import org.jetbrains.bio.viktor.F64Array

fun processReplayData(events: ArrayList<ReplayEvent>): F64Array {
    if (events.isEmpty()) throw IllegalArgumentException("This replay's replay data was empty. It indicates a misbehaved replay.")

    if (events.first().timeDelta == 0 && events.size > 1) events.removeAt(0)

    val pEvents = ArrayList<Triple<Double, Double, Double>>()

    var cumulativeTimeDelta = events.first().timeDelta
    var highestTimeDelta = Double.NEGATIVE_INFINITY
    var lastPositiveFrame: ReplayEvent? = null

    var wasInNegativeSection = false
    val lastPositiveFrameData = mutableListOf<Pair<Double, Pair<Double, Double>>>()

    events.drop(1).forEachIndexed { index, currentFrame ->
        val previousCumulativeTime = cumulativeTimeDelta
        cumulativeTimeDelta += currentFrame.timeDelta

        highestTimeDelta = maxOf(highestTimeDelta, cumulativeTimeDelta.toDouble())

        // A frame is in a "negative section" when its cumulative time falls
        // below the highest time seen so far, i.e. the replay stepped backwards.
        val isInNegativeSection = cumulativeTimeDelta < highestTimeDelta
        if (isInNegativeSection) {
            if (!wasInNegativeSection) {
                lastPositiveFrame = if (index > 0) events[index - 1] else null
            }
        } else {
            if (wasInNegativeSection && lastPositiveFrame != null) {
                // Leaving a negative section: interpolate a frame between the
                // last positive frame and the current one.
                val lastPositiveTime = lastPositiveFrameData.lastOrNull()?.first ?: previousCumulativeTime.toDouble()
                val ratio = (lastPositiveTime - previousCumulativeTime) / (cumulativeTimeDelta - previousCumulativeTime).toDouble()

                val interpolatedX = lastPositiveFrame!!.x + ratio * (currentFrame.x - lastPositiveFrame!!.x)
                val interpolatedY = lastPositiveFrame!!.y + ratio * (currentFrame.y - lastPositiveFrame!!.y)

                pEvents.add(Triple(interpolatedX, interpolatedY, lastPositiveTime))
            }
            wasInNegativeSection = false
        }

        wasInNegativeSection = isInNegativeSection

        if (!isInNegativeSection) {
            pEvents.add(Triple(currentFrame.x, currentFrame.y, cumulativeTimeDelta.toDouble()))
            lastPositiveFrameData.add(Pair(cumulativeTimeDelta.toDouble(), Pair(currentFrame.x, currentFrame.y)))
        }
    }

    // Deduplicate frames that share the same timestamp.
    val pEventsUnique = pEvents.distinctBy { it.third }

    return F64Array(pEventsUnique.size, 3) { index, dim ->
        when (dim) {
            0 -> pEventsUnique[index].first
            1 -> pEventsUnique[index].second
            else -> pEventsUnique[index].third
        }
    }
}
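The rewind detection at the core of processReplayData can be isolated into a small sketch (a simplification written for illustration, not part of the commit): a frame is in a "negative section" when its cumulative time is below the highest cumulative time seen so far.

```kotlin
// For each frame's time delta, report whether the running cumulative time
// has fallen behind the maximum seen so far (i.e. the replay rewound).
fun negativeSectionFlags(timeDeltas: List<Int>): List<Boolean> {
    var cumulative = 0
    var highest = Int.MIN_VALUE
    return timeDeltas.map { dt ->
        cumulative += dt
        highest = maxOf(highest, cumulative)
        cumulative < highest
    }
}
```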
25 konata/src/test/kotlin/com/nisemoe/konata/CorrelationTest.kt Normal file
File diff suppressed because one or more lines are too long
88 konata/src/test/kotlin/com/nisemoe/konata/HardRockTest.kt Normal file
File diff suppressed because one or more lines are too long
133 konata/src/test/kotlin/com/nisemoe/konata/ReplayTest.kt Normal file
File diff suppressed because one or more lines are too long
30 konata/src/test/resources/replays.txt Normal file
File diff suppressed because one or more lines are too long
100 konata/src/test/resources/replays_1.txt Normal file
File diff suppressed because one or more lines are too long
69 nise-backend/Build.sh Executable file
@@ -0,0 +1,69 @@
#!/bin/bash

source /home/anon/.jabba/jabba.sh

# amazon-corretto
jabba use 21.0.2

# Set variables
IMAGE_NAME="nise-backend"
IMAGE_REGISTRY="git.gengo.tech/gengotech"
IMAGE_VERSION=$(grep -m2 "<version>" pom.xml | tail -n1 | sed 's/[[:space:]]*<version>//;s/<\/version>//')

# Commit any uncommitted changes before building
if [[ -n $(git status --porcelain) ]]; then
    git add .
    git commit -m "Build and push v$IMAGE_VERSION"
fi

# List variables
echo "Variables:"
echo "IMAGE_NAME=$IMAGE_NAME"
echo "IMAGE_REGISTRY=$IMAGE_REGISTRY"
echo "IMAGE_VERSION=$IMAGE_VERSION"

rm -rf target/

# Clean and build Maven project
mvn clean package
if [ $? -ne 0 ]; then
    echo "Maven build failed"
    exit 1
fi

# Build and push Docker image
docker build . -t "$IMAGE_NAME:$IMAGE_VERSION"
if [ $? -ne 0 ]; then
    echo "Docker build failed"
    exit 1
fi

docker tag "$IMAGE_NAME:$IMAGE_VERSION" "$IMAGE_REGISTRY/$IMAGE_NAME:$IMAGE_VERSION"
docker push "$IMAGE_REGISTRY/$IMAGE_NAME:$IMAGE_VERSION"
if [ $? -ne 0 ]; then
    echo "Error: Failed to push $IMAGE_REGISTRY/$IMAGE_NAME:$IMAGE_VERSION"
    exit 1
fi

docker tag "$IMAGE_NAME:$IMAGE_VERSION" "$IMAGE_REGISTRY/$IMAGE_NAME:latest"
docker push "$IMAGE_REGISTRY/$IMAGE_NAME:latest"
if [ $? -ne 0 ]; then
    echo "Error: Failed to push $IMAGE_REGISTRY/$IMAGE_NAME:latest"
    exit 1
fi

echo "Successfully built and pushed $IMAGE_REGISTRY/$IMAGE_NAME:$IMAGE_VERSION and $IMAGE_REGISTRY/$IMAGE_NAME:latest"

# Add and commit changes to git
git add .
git commit -m "Build and push v$IMAGE_VERSION"

# Create annotated tag with commit message
git tag -a "release-$IMAGE_VERSION" -m "Release v$IMAGE_VERSION"

# Push changes and tags to remote repository
git push && git push --tags

echo "Successfully pushed changes and tags to remote repository"
17 nise-backend/Dockerfile Normal file
@@ -0,0 +1,17 @@
FROM amazoncorretto:21.0.2 AS builder

WORKDIR /application

ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} /application/
RUN mv *.jar application.jar
RUN java -Djarmode=layertools -jar application.jar extract

FROM amazoncorretto:21.0.2

COPY --from=builder application/spring-boot-loader/ ./
COPY --from=builder application/dependencies/ ./
COPY --from=builder application/snapshot-dependencies/ ./
COPY --from=builder application/application/ ./

ENTRYPOINT ["java", "-Xms512m", "-Xmx8g", "org.springframework.boot.loader.launch.JarLauncher"]
219 nise-backend/pom.xml Normal file
@@ -0,0 +1,219 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.2</version>
        <relativePath/>
    </parent>

    <groupId>com.nisemoe</groupId>
    <artifactId>nise-backend</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>nise</name>
    <description>nise.moe api</description>

    <properties>
        <java.version>21</java.version>
        <testcontainers.version>1.17.6</testcontainers.version>
        <kotlin.version>1.9.22</kotlin.version>
    </properties>

    <dependencies>
        <!-- Test containers -->
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>testcontainers</artifactId>
            <version>${testcontainers.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>postgresql</artifactId>
            <version>${testcontainers.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>junit-jupiter</artifactId>
            <version>${testcontainers.version}</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.nisemoe</groupId>
            <artifactId>konata</artifactId>
            <version>0.0.1-SNAPSHOT</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.dataformat</groupId>
            <artifactId>jackson-dataformat-xml</artifactId>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.datatype</groupId>
            <artifactId>jackson-datatype-jsr310</artifactId>
        </dependency>
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-websocket</artifactId>
        </dependency>
        <dependency>
            <groupId>org.flywaydb</groupId>
            <artifactId>flyway-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jooq</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-redis</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.session</groupId>
            <artifactId>spring-session-data-redis</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-validation</artifactId>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-kotlin</artifactId>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.kotlinx</groupId>
            <artifactId>kotlinx-serialization-json</artifactId>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.kotlinx</groupId>
            <artifactId>kotlinx-coroutines-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.kotlin</groupId>
            <artifactId>kotlin-reflect</artifactId>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.kotlin</groupId>
            <artifactId>kotlin-stdlib</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <sourceDirectory>${project.basedir}/src/main/kotlin</sourceDirectory>
        <testSourceDirectory>${project.basedir}/src/test/kotlin</testSourceDirectory>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.jetbrains.kotlin</groupId>
                <artifactId>kotlin-maven-plugin</artifactId>
                <configuration>
                    <args>
                        <arg>-Xjsr305=strict</arg>
                    </args>
                    <compilerPlugins>
                        <plugin>spring</plugin>
                        <plugin>kotlinx-serialization</plugin>
                        <plugin>no-arg</plugin>
                    </compilerPlugins>
                    <pluginOptions>
                        <option>no-arg:annotation=com.nisemoe.nise.AllowCacheSerialization</option>
                    </pluginOptions>
                </configuration>
                <dependencies>
                    <dependency>
                        <groupId>org.jetbrains.kotlin</groupId>
                        <artifactId>kotlin-maven-noarg</artifactId>
                        <version>${kotlin.version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.jetbrains.kotlin</groupId>
                        <artifactId>kotlin-maven-allopen</artifactId>
                        <version>${kotlin.version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.jetbrains.kotlin</groupId>
                        <artifactId>kotlin-maven-serialization</artifactId>
                        <version>${kotlin.version}</version>
                    </dependency>
                </dependencies>
            </plugin>
            <!-- Jooq -->
            <plugin>
                <groupId>org.jooq</groupId>
                <artifactId>jooq-codegen-maven</artifactId>
                <executions>
                    <execution>
                        <id>generate-jooq-sources</id>
                        <phase>generate-sources</phase>
                        <goals>
                            <goal>generate</goal>
                        </goals>
                    </execution>
                </executions>
                <dependencies>
                    <dependency>
                        <groupId>org.postgresql</groupId>
                        <artifactId>postgresql</artifactId>
                        <version>42.5.4</version>
                    </dependency>
                </dependencies>
                <configuration>
                    <jdbc>
                        <driver>org.postgresql.Driver</driver>
                        <url>jdbc:postgresql://nise-postgres.local:5432/postgres</url>
                        <user>postgres</user>
                        <password>lol123</password>
                    </jdbc>
                    <generator>
                        <name>org.jooq.codegen.KotlinGenerator</name>
                        <generate>
                            <kotlinNotNullPojoAttributes>true</kotlinNotNullPojoAttributes>
                            <kotlinNotNullRecordAttributes>true</kotlinNotNullRecordAttributes>
                            <kotlinNotNullInterfaceAttributes>true</kotlinNotNullInterfaceAttributes>
                        </generate>
                        <database>
                            <name>org.jooq.meta.postgres.PostgresDatabase</name>
                            <schemata>
                                <schema>
                                    <inputSchema>public</inputSchema>
                                </schema>
                            </schemata>
                        </database>
                        <target>
                            <packageName>com.nisemoe.generated</packageName>
                            <directory>src/main/kotlin</directory>
                        </target>
                    </generator>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>
24 nise-backend/readme.md Normal file
@@ -0,0 +1,24 @@
# environment

Available Spring profiles (`SPRING_PROFILES_ACTIVE`):

- `postgres`: Connects to postgres (required)
- `old_scores`: Tries to fix old scores with a version of the algorithm that was used before the current one (optional)
- `updater`: Pulls new scores from the osu!api (optional)
- `discord`: Enables integration with Discord webhooks (optional)
- `debug`: Enables debug logs in the console (optional)

To run, you'll need a local postgres database and a redis database.

You can check out the configuration files in `src/main/resources` to see how to configure the application.

# development

Make sure you have the correct JDK (21) and Kotlin versions installed.

If you need to rebuild the JooQ entities, run:

```
mvn generate-sources
```
@@ -0,0 +1,43 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated


import kotlin.collections.List

import org.jooq.Constants
import org.jooq.Schema
import org.jooq.impl.CatalogImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class DefaultCatalog : CatalogImpl("") {
    companion object {

        /**
         * The reference instance of <code>DEFAULT_CATALOG</code>
         */
        public val DEFAULT_CATALOG: DefaultCatalog = DefaultCatalog()
    }

    /**
     * The schema <code>public</code>.
     */
    val PUBLIC: Public get(): Public = Public.PUBLIC

    override fun getSchemas(): List<Schema> = listOf(
        Public.PUBLIC
    )

    /**
     * A reference to the 3.18 minor release of the code generator. If this
     * doesn't compile, it's because the runtime library uses an older minor
     * release, namely: 3.18. You can turn off the generation of this reference
     * by specifying /configuration/generator/generate/jooqVersionReference
     */
    private val REQUIRE_RUNTIME_JOOQ_VERSION = Constants.VERSION_3_18
}
112 nise-backend/src/main/kotlin/com/nisemoe/generated/Public.kt Normal file
@@ -0,0 +1,112 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated


import com.nisemoe.generated.sequences.BEATMAPS_BEATMAP_ID_SEQ
import com.nisemoe.generated.sequences.BEATMAPS_BEATMAP_ID_SEQ1
import com.nisemoe.generated.sequences.SCORES_ID_SEQ
import com.nisemoe.generated.sequences.SCORES_ID_SEQ1
import com.nisemoe.generated.sequences.SCORES_JUDGEMENTS_ID_SEQ
import com.nisemoe.generated.sequences.SCORES_JUDGEMENTS_ID_SEQ1
import com.nisemoe.generated.sequences.SCORES_SIMILARITY_ID_SEQ
import com.nisemoe.generated.sequences.SCORES_SIMILARITY_ID_SEQ1
import com.nisemoe.generated.sequences.USERS_USER_ID_SEQ
import com.nisemoe.generated.sequences.USERS_USER_ID_SEQ1
import com.nisemoe.generated.tables.Beatmaps
import com.nisemoe.generated.tables.FlywaySchemaHistory
import com.nisemoe.generated.tables.RedditPost
import com.nisemoe.generated.tables.Scores
import com.nisemoe.generated.tables.ScoresJudgements
import com.nisemoe.generated.tables.ScoresSimilarity
import com.nisemoe.generated.tables.UpdateUserQueue
import com.nisemoe.generated.tables.Users

import kotlin.collections.List

import org.jooq.Catalog
import org.jooq.Sequence
import org.jooq.Table
import org.jooq.impl.SchemaImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class Public : SchemaImpl("public", DefaultCatalog.DEFAULT_CATALOG) {
    public companion object {

        /**
         * The reference instance of <code>public</code>
         */
        val PUBLIC: Public = Public()
    }

    /**
     * The table <code>public.beatmaps</code>.
     */
    val BEATMAPS: Beatmaps get() = Beatmaps.BEATMAPS

    /**
     * The table <code>public.flyway_schema_history</code>.
     */
    val FLYWAY_SCHEMA_HISTORY: FlywaySchemaHistory get() = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY

    /**
     * The table <code>public.reddit_post</code>.
     */
    val REDDIT_POST: RedditPost get() = RedditPost.REDDIT_POST

    /**
     * The table <code>public.scores</code>.
     */
    val SCORES: Scores get() = Scores.SCORES

    /**
     * The table <code>public.scores_judgements</code>.
     */
    val SCORES_JUDGEMENTS: ScoresJudgements get() = ScoresJudgements.SCORES_JUDGEMENTS

    /**
     * The table <code>public.scores_similarity</code>.
     */
    val SCORES_SIMILARITY: ScoresSimilarity get() = ScoresSimilarity.SCORES_SIMILARITY

    /**
     * The table <code>public.update_user_queue</code>.
     */
    val UPDATE_USER_QUEUE: UpdateUserQueue get() = UpdateUserQueue.UPDATE_USER_QUEUE

    /**
     * The table <code>public.users</code>.
     */
    val USERS: Users get() = Users.USERS

    override fun getCatalog(): Catalog = DefaultCatalog.DEFAULT_CATALOG

    override fun getSequences(): List<Sequence<*>> = listOf(
        BEATMAPS_BEATMAP_ID_SEQ,
        BEATMAPS_BEATMAP_ID_SEQ1,
        SCORES_ID_SEQ,
        SCORES_ID_SEQ1,
        SCORES_JUDGEMENTS_ID_SEQ,
        SCORES_JUDGEMENTS_ID_SEQ1,
        SCORES_SIMILARITY_ID_SEQ,
        SCORES_SIMILARITY_ID_SEQ1,
        USERS_USER_ID_SEQ,
        USERS_USER_ID_SEQ1
    )

    override fun getTables(): List<Table<*>> = listOf(
        Beatmaps.BEATMAPS,
        FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY,
        RedditPost.REDDIT_POST,
        Scores.SCORES,
        ScoresJudgements.SCORES_JUDGEMENTS,
        ScoresSimilarity.SCORES_SIMILARITY,
        UpdateUserQueue.UPDATE_USER_QUEUE,
        Users.USERS
    )
}
@@ -0,0 +1,27 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.enums


import com.nisemoe.generated.Public

import org.jooq.Catalog
import org.jooq.EnumType
import org.jooq.Schema


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
enum class JudgementType(@get:JvmName("literal") public val literal: String) : EnumType {
    `300`("300"),
    `100`("100"),
    `50`("50"),
    Miss("Miss");

    override fun getCatalog(): Catalog? = schema.catalog
    override fun getSchema(): Schema = Public.PUBLIC
    override fun getName(): String = "judgement_type"
    override fun getLiteral(): String = literal
}
@@ -0,0 +1,31 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.indexes


import com.nisemoe.generated.tables.FlywaySchemaHistory
import com.nisemoe.generated.tables.Scores
import com.nisemoe.generated.tables.ScoresJudgements
import com.nisemoe.generated.tables.ScoresSimilarity

import org.jooq.Index
import org.jooq.impl.DSL
import org.jooq.impl.Internal



// -------------------------------------------------------------------------
// INDEX definitions
// -------------------------------------------------------------------------

val FLYWAY_SCHEMA_HISTORY_S_IDX: Index = Internal.createIndex(DSL.name("flyway_schema_history_s_idx"), FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY, arrayOf(FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.SUCCESS), false)
val IDX_REPLAY_IDS: Index = Internal.createIndex(DSL.name("idx_replay_ids"), ScoresSimilarity.SCORES_SIMILARITY, arrayOf(ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_1, ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_2), false)
val IDX_REPLAY_IDS_PAIRS: Index = Internal.createIndex(DSL.name("idx_replay_ids_pairs"), ScoresSimilarity.SCORES_SIMILARITY, arrayOf(ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_1, ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_2), false)
val IDX_SCORES_BEATMAP_ID: Index = Internal.createIndex(DSL.name("idx_scores_beatmap_id"), Scores.SCORES, arrayOf(Scores.SCORES.BEATMAP_ID), false)
val IDX_SCORES_BEATMAP_ID_REPLAY_ID: Index = Internal.createIndex(DSL.name("idx_scores_beatmap_id_replay_id"), Scores.SCORES, arrayOf(Scores.SCORES.BEATMAP_ID, Scores.SCORES.REPLAY_ID), false)
val IDX_SCORES_BEATMAP_ID_REPLAY_ID_UR: Index = Internal.createIndex(DSL.name("idx_scores_beatmap_id_replay_id_ur"), Scores.SCORES, arrayOf(Scores.SCORES.BEATMAP_ID, Scores.SCORES.REPLAY_ID, Scores.SCORES.UR), false)
val IDX_SCORES_JUDGEMENTS_SCORE_ID: Index = Internal.createIndex(DSL.name("idx_scores_judgements_score_id"), ScoresJudgements.SCORES_JUDGEMENTS, arrayOf(ScoresJudgements.SCORES_JUDGEMENTS.SCORE_ID), false)
val IDX_SCORES_REPLAY_ID: Index = Internal.createIndex(DSL.name("idx_scores_replay_id"), Scores.SCORES, arrayOf(Scores.SCORES.REPLAY_ID), false)
val IDX_SCORES_UR: Index = Internal.createIndex(DSL.name("idx_scores_ur"), Scores.SCORES, arrayOf(Scores.SCORES.UR), false)
val IDX_SCORES_USER_ID: Index = Internal.createIndex(DSL.name("idx_scores_user_id"), Scores.SCORES, arrayOf(Scores.SCORES.USER_ID), false)
@@ -0,0 +1,50 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.keys


import com.nisemoe.generated.tables.Beatmaps
import com.nisemoe.generated.tables.FlywaySchemaHistory
import com.nisemoe.generated.tables.RedditPost
import com.nisemoe.generated.tables.Scores
import com.nisemoe.generated.tables.ScoresJudgements
import com.nisemoe.generated.tables.ScoresSimilarity
import com.nisemoe.generated.tables.UpdateUserQueue
import com.nisemoe.generated.tables.Users
import com.nisemoe.generated.tables.records.BeatmapsRecord
import com.nisemoe.generated.tables.records.FlywaySchemaHistoryRecord
import com.nisemoe.generated.tables.records.RedditPostRecord
import com.nisemoe.generated.tables.records.ScoresJudgementsRecord
import com.nisemoe.generated.tables.records.ScoresRecord
import com.nisemoe.generated.tables.records.ScoresSimilarityRecord
import com.nisemoe.generated.tables.records.UpdateUserQueueRecord
import com.nisemoe.generated.tables.records.UsersRecord

import org.jooq.ForeignKey
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal



// -------------------------------------------------------------------------
// UNIQUE and PRIMARY KEY definitions
// -------------------------------------------------------------------------

val BEATMAPS_PKEY: UniqueKey<BeatmapsRecord> = Internal.createUniqueKey(Beatmaps.BEATMAPS, DSL.name("beatmaps_pkey"), arrayOf(Beatmaps.BEATMAPS.BEATMAP_ID), true)
val FLYWAY_SCHEMA_HISTORY_PK: UniqueKey<FlywaySchemaHistoryRecord> = Internal.createUniqueKey(FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY, DSL.name("flyway_schema_history_pk"), arrayOf(FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.INSTALLED_RANK), true)
val REDDIT_POST_PKEY: UniqueKey<RedditPostRecord> = Internal.createUniqueKey(RedditPost.REDDIT_POST, DSL.name("reddit_post_pkey"), arrayOf(RedditPost.REDDIT_POST.POST_ID), true)
val REPLAY_ID_UNIQUE: UniqueKey<ScoresRecord> = Internal.createUniqueKey(Scores.SCORES, DSL.name("replay_id_unique"), arrayOf(Scores.SCORES.REPLAY_ID), true)
val SCORES_PKEY: UniqueKey<ScoresRecord> = Internal.createUniqueKey(Scores.SCORES, DSL.name("scores_pkey"), arrayOf(Scores.SCORES.ID), true)
val SCORES_JUDGEMENTS_PKEY: UniqueKey<ScoresJudgementsRecord> = Internal.createUniqueKey(ScoresJudgements.SCORES_JUDGEMENTS, DSL.name("scores_judgements_pkey"), arrayOf(ScoresJudgements.SCORES_JUDGEMENTS.ID), true)
val SCORES_SIMILARITY_PKEY: UniqueKey<ScoresSimilarityRecord> = Internal.createUniqueKey(ScoresSimilarity.SCORES_SIMILARITY, DSL.name("scores_similarity_pkey"), arrayOf(ScoresSimilarity.SCORES_SIMILARITY.ID), true)
val UNIQUE_BEATMAP_REPLAY_IDS: UniqueKey<ScoresSimilarityRecord> = Internal.createUniqueKey(ScoresSimilarity.SCORES_SIMILARITY, DSL.name("unique_beatmap_replay_ids"), arrayOf(ScoresSimilarity.SCORES_SIMILARITY.BEATMAP_ID, ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_1, ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_2), true)
val UPDATE_USER_QUEUE_PKEY: UniqueKey<UpdateUserQueueRecord> = Internal.createUniqueKey(UpdateUserQueue.UPDATE_USER_QUEUE, DSL.name("update_user_queue_pkey"), arrayOf(UpdateUserQueue.UPDATE_USER_QUEUE.ID), true)
val USERS_PKEY: UniqueKey<UsersRecord> = Internal.createUniqueKey(Users.USERS, DSL.name("users_pkey"), arrayOf(Users.USERS.USER_ID), true)

// -------------------------------------------------------------------------
// FOREIGN KEY definitions
// -------------------------------------------------------------------------

val SCORES_JUDGEMENTS__SCORES_JUDGEMENTS_SCORE_ID_FKEY: ForeignKey<ScoresJudgementsRecord, ScoresRecord> = Internal.createForeignKey(ScoresJudgements.SCORES_JUDGEMENTS, DSL.name("scores_judgements_score_id_fkey"), arrayOf(ScoresJudgements.SCORES_JUDGEMENTS.SCORE_ID), com.nisemoe.generated.keys.SCORES_PKEY, arrayOf(Scores.SCORES.ID), true)
@ -0,0 +1,63 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.sequences


import com.nisemoe.generated.Public

import org.jooq.Sequence
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType




/**
 * The sequence <code>public.beatmaps_beatmap_id_seq</code>
 */
val BEATMAPS_BEATMAP_ID_SEQ: Sequence<Long> = Internal.createSequence("beatmaps_beatmap_id_seq", Public.PUBLIC, SQLDataType.BIGINT.nullable(false), null, null, null, 2147483647, false, null)

/**
 * The sequence <code>public.beatmaps_beatmap_id_seq1</code>
 */
val BEATMAPS_BEATMAP_ID_SEQ1: Sequence<Int> = Internal.createSequence("beatmaps_beatmap_id_seq1", Public.PUBLIC, SQLDataType.INTEGER.nullable(false), null, null, null, null, false, null)

/**
 * The sequence <code>public.scores_id_seq</code>
 */
val SCORES_ID_SEQ: Sequence<Long> = Internal.createSequence("scores_id_seq", Public.PUBLIC, SQLDataType.BIGINT.nullable(false), null, null, null, 2147483647, false, null)

/**
 * The sequence <code>public.scores_id_seq1</code>
 */
val SCORES_ID_SEQ1: Sequence<Int> = Internal.createSequence("scores_id_seq1", Public.PUBLIC, SQLDataType.INTEGER.nullable(false), null, null, null, null, false, null)

/**
 * The sequence <code>public.scores_judgements_id_seq</code>
 */
val SCORES_JUDGEMENTS_ID_SEQ: Sequence<Long> = Internal.createSequence("scores_judgements_id_seq", Public.PUBLIC, SQLDataType.BIGINT.nullable(false), null, null, null, 2147483647, false, null)

/**
 * The sequence <code>public.scores_judgements_id_seq1</code>
 */
val SCORES_JUDGEMENTS_ID_SEQ1: Sequence<Int> = Internal.createSequence("scores_judgements_id_seq1", Public.PUBLIC, SQLDataType.INTEGER.nullable(false), null, null, null, null, false, null)

/**
 * The sequence <code>public.scores_similarity_id_seq</code>
 */
val SCORES_SIMILARITY_ID_SEQ: Sequence<Long> = Internal.createSequence("scores_similarity_id_seq", Public.PUBLIC, SQLDataType.BIGINT.nullable(false), null, null, null, 2147483647, false, null)

/**
 * The sequence <code>public.scores_similarity_id_seq1</code>
 */
val SCORES_SIMILARITY_ID_SEQ1: Sequence<Int> = Internal.createSequence("scores_similarity_id_seq1", Public.PUBLIC, SQLDataType.INTEGER.nullable(false), null, null, null, null, false, null)

/**
 * The sequence <code>public.users_user_id_seq</code>
 */
val USERS_USER_ID_SEQ: Sequence<Long> = Internal.createSequence("users_user_id_seq", Public.PUBLIC, SQLDataType.BIGINT.nullable(false), null, null, null, null, false, null)

/**
 * The sequence <code>public.users_user_id_seq1</code>
 */
val USERS_USER_ID_SEQ1: Sequence<Long> = Internal.createSequence("users_user_id_seq1", Public.PUBLIC, SQLDataType.BIGINT.nullable(false), null, null, null, null, false, null)
@ -0,0 +1,165 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.keys.BEATMAPS_PKEY
import com.nisemoe.generated.tables.records.BeatmapsRecord

import java.time.LocalDateTime
import java.util.function.Function

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Name
import org.jooq.Record
import org.jooq.Records
import org.jooq.Row9
import org.jooq.Schema
import org.jooq.SelectField
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class Beatmaps(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, BeatmapsRecord>?,
    aliased: Table<BeatmapsRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<BeatmapsRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.beatmaps</code>
         */
        val BEATMAPS: Beatmaps = Beatmaps()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<BeatmapsRecord> = BeatmapsRecord::class.java

    /**
     * The column <code>public.beatmaps.beatmap_id</code>.
     */
    val BEATMAP_ID: TableField<BeatmapsRecord, Int?> = createField(DSL.name("beatmap_id"), SQLDataType.INTEGER.nullable(false).defaultValue(DSL.field(DSL.raw("nextval('beatmaps_beatmap_id_seq1'::regclass)"), SQLDataType.INTEGER)), this, "")

    /**
     * The column <code>public.beatmaps.artist</code>.
     */
    val ARTIST: TableField<BeatmapsRecord, String?> = createField(DSL.name("artist"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.beatmaps.beatmapset_id</code>.
     */
    val BEATMAPSET_ID: TableField<BeatmapsRecord, Int?> = createField(DSL.name("beatmapset_id"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.beatmaps.creator</code>.
     */
    val CREATOR: TableField<BeatmapsRecord, String?> = createField(DSL.name("creator"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.beatmaps.source</code>.
     */
    val SOURCE: TableField<BeatmapsRecord, String?> = createField(DSL.name("source"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.beatmaps.star_rating</code>.
     */
    val STAR_RATING: TableField<BeatmapsRecord, Double?> = createField(DSL.name("star_rating"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.beatmaps.title</code>.
     */
    val TITLE: TableField<BeatmapsRecord, String?> = createField(DSL.name("title"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.beatmaps.version</code>.
     */
    val VERSION: TableField<BeatmapsRecord, String?> = createField(DSL.name("version"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.beatmaps.sys_last_update</code>.
     */
    val SYS_LAST_UPDATE: TableField<BeatmapsRecord, LocalDateTime?> = createField(DSL.name("sys_last_update"), SQLDataType.LOCALDATETIME(6), this, "")

    private constructor(alias: Name, aliased: Table<BeatmapsRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<BeatmapsRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.beatmaps</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.beatmaps</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.beatmaps</code> table reference
     */
    constructor(): this(DSL.name("beatmaps"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, BeatmapsRecord>): this(Internal.createPathAlias(child, key), child, key, BEATMAPS, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getPrimaryKey(): UniqueKey<BeatmapsRecord> = BEATMAPS_PKEY
    override fun `as`(alias: String): Beatmaps = Beatmaps(DSL.name(alias), this)
    override fun `as`(alias: Name): Beatmaps = Beatmaps(alias, this)
    override fun `as`(alias: Table<*>): Beatmaps = Beatmaps(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): Beatmaps = Beatmaps(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): Beatmaps = Beatmaps(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): Beatmaps = Beatmaps(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row9 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?> = super.fieldsRow() as Row9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,177 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.indexes.FLYWAY_SCHEMA_HISTORY_S_IDX
import com.nisemoe.generated.keys.FLYWAY_SCHEMA_HISTORY_PK
import com.nisemoe.generated.tables.records.FlywaySchemaHistoryRecord

import java.time.LocalDateTime
import java.util.function.Function

import kotlin.collections.List

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Index
import org.jooq.Name
import org.jooq.Record
import org.jooq.Records
import org.jooq.Row10
import org.jooq.Schema
import org.jooq.SelectField
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class FlywaySchemaHistory(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, FlywaySchemaHistoryRecord>?,
    aliased: Table<FlywaySchemaHistoryRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<FlywaySchemaHistoryRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.flyway_schema_history</code>
         */
        val FLYWAY_SCHEMA_HISTORY: FlywaySchemaHistory = FlywaySchemaHistory()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<FlywaySchemaHistoryRecord> = FlywaySchemaHistoryRecord::class.java

    /**
     * The column <code>public.flyway_schema_history.installed_rank</code>.
     */
    val INSTALLED_RANK: TableField<FlywaySchemaHistoryRecord, Int?> = createField(DSL.name("installed_rank"), SQLDataType.INTEGER.nullable(false), this, "")

    /**
     * The column <code>public.flyway_schema_history.version</code>.
     */
    val VERSION: TableField<FlywaySchemaHistoryRecord, String?> = createField(DSL.name("version"), SQLDataType.VARCHAR(50), this, "")

    /**
     * The column <code>public.flyway_schema_history.description</code>.
     */
    val DESCRIPTION: TableField<FlywaySchemaHistoryRecord, String?> = createField(DSL.name("description"), SQLDataType.VARCHAR(200).nullable(false), this, "")

    /**
     * The column <code>public.flyway_schema_history.type</code>.
     */
    val TYPE: TableField<FlywaySchemaHistoryRecord, String?> = createField(DSL.name("type"), SQLDataType.VARCHAR(20).nullable(false), this, "")

    /**
     * The column <code>public.flyway_schema_history.script</code>.
     */
    val SCRIPT: TableField<FlywaySchemaHistoryRecord, String?> = createField(DSL.name("script"), SQLDataType.VARCHAR(1000).nullable(false), this, "")

    /**
     * The column <code>public.flyway_schema_history.checksum</code>.
     */
    val CHECKSUM: TableField<FlywaySchemaHistoryRecord, Int?> = createField(DSL.name("checksum"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.flyway_schema_history.installed_by</code>.
     */
    val INSTALLED_BY: TableField<FlywaySchemaHistoryRecord, String?> = createField(DSL.name("installed_by"), SQLDataType.VARCHAR(100).nullable(false), this, "")

    /**
     * The column <code>public.flyway_schema_history.installed_on</code>.
     */
    val INSTALLED_ON: TableField<FlywaySchemaHistoryRecord, LocalDateTime?> = createField(DSL.name("installed_on"), SQLDataType.LOCALDATETIME(6).nullable(false).defaultValue(DSL.field(DSL.raw("now()"), SQLDataType.LOCALDATETIME)), this, "")

    /**
     * The column <code>public.flyway_schema_history.execution_time</code>.
     */
    val EXECUTION_TIME: TableField<FlywaySchemaHistoryRecord, Int?> = createField(DSL.name("execution_time"), SQLDataType.INTEGER.nullable(false), this, "")

    /**
     * The column <code>public.flyway_schema_history.success</code>.
     */
    val SUCCESS: TableField<FlywaySchemaHistoryRecord, Boolean?> = createField(DSL.name("success"), SQLDataType.BOOLEAN.nullable(false), this, "")

    private constructor(alias: Name, aliased: Table<FlywaySchemaHistoryRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<FlywaySchemaHistoryRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.flyway_schema_history</code> table
     * reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.flyway_schema_history</code> table
     * reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.flyway_schema_history</code> table reference
     */
    constructor(): this(DSL.name("flyway_schema_history"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, FlywaySchemaHistoryRecord>): this(Internal.createPathAlias(child, key), child, key, FLYWAY_SCHEMA_HISTORY, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getIndexes(): List<Index> = listOf(FLYWAY_SCHEMA_HISTORY_S_IDX)
    override fun getPrimaryKey(): UniqueKey<FlywaySchemaHistoryRecord> = FLYWAY_SCHEMA_HISTORY_PK
    override fun `as`(alias: String): FlywaySchemaHistory = FlywaySchemaHistory(DSL.name(alias), this)
    override fun `as`(alias: Name): FlywaySchemaHistory = FlywaySchemaHistory(alias, this)
    override fun `as`(alias: Table<*>): FlywaySchemaHistory = FlywaySchemaHistory(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): FlywaySchemaHistory = FlywaySchemaHistory(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): FlywaySchemaHistory = FlywaySchemaHistory(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): FlywaySchemaHistory = FlywaySchemaHistory(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row10 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?> = super.fieldsRow() as Row10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,144 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.keys.REDDIT_POST_PKEY
import com.nisemoe.generated.tables.records.RedditPostRecord

import java.util.function.Function

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Name
import org.jooq.Record
import org.jooq.Records
import org.jooq.Row5
import org.jooq.Schema
import org.jooq.SelectField
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class RedditPost(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, RedditPostRecord>?,
    aliased: Table<RedditPostRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<RedditPostRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.reddit_post</code>
         */
        val REDDIT_POST: RedditPost = RedditPost()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<RedditPostRecord> = RedditPostRecord::class.java

    /**
     * The column <code>public.reddit_post.post_id</code>.
     */
    val POST_ID: TableField<RedditPostRecord, String?> = createField(DSL.name("post_id"), SQLDataType.VARCHAR.nullable(false), this, "")

    /**
     * The column <code>public.reddit_post.title</code>.
     */
    val TITLE: TableField<RedditPostRecord, String?> = createField(DSL.name("title"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.reddit_post.created_utc</code>.
     */
    val CREATED_UTC: TableField<RedditPostRecord, Double?> = createField(DSL.name("created_utc"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.reddit_post.url</code>.
     */
    val URL: TableField<RedditPostRecord, String?> = createField(DSL.name("url"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.reddit_post.is_checked</code>.
     */
    val IS_CHECKED: TableField<RedditPostRecord, Boolean?> = createField(DSL.name("is_checked"), SQLDataType.BOOLEAN.defaultValue(DSL.field(DSL.raw("false"), SQLDataType.BOOLEAN)), this, "")

    private constructor(alias: Name, aliased: Table<RedditPostRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<RedditPostRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.reddit_post</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.reddit_post</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.reddit_post</code> table reference
     */
    constructor(): this(DSL.name("reddit_post"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, RedditPostRecord>): this(Internal.createPathAlias(child, key), child, key, REDDIT_POST, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getPrimaryKey(): UniqueKey<RedditPostRecord> = REDDIT_POST_PKEY
    override fun `as`(alias: String): RedditPost = RedditPost(DSL.name(alias), this)
    override fun `as`(alias: Name): RedditPost = RedditPost(alias, this)
    override fun `as`(alias: Table<*>): RedditPost = RedditPost(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): RedditPost = RedditPost(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): RedditPost = RedditPost(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): RedditPost = RedditPost(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row5 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row5<String?, String?, Double?, String?, Boolean?> = super.fieldsRow() as Row5<String?, String?, Double?, String?, Boolean?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (String?, String?, Double?, String?, Boolean?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (String?, String?, Double?, String?, Boolean?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,288 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.indexes.IDX_SCORES_BEATMAP_ID
import com.nisemoe.generated.indexes.IDX_SCORES_BEATMAP_ID_REPLAY_ID
import com.nisemoe.generated.indexes.IDX_SCORES_BEATMAP_ID_REPLAY_ID_UR
import com.nisemoe.generated.indexes.IDX_SCORES_REPLAY_ID
import com.nisemoe.generated.indexes.IDX_SCORES_UR
import com.nisemoe.generated.indexes.IDX_SCORES_USER_ID
import com.nisemoe.generated.keys.REPLAY_ID_UNIQUE
import com.nisemoe.generated.keys.SCORES_PKEY
import com.nisemoe.generated.tables.records.ScoresRecord

import java.time.LocalDateTime
import java.time.OffsetDateTime

import kotlin.collections.List

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Index
import org.jooq.Name
import org.jooq.Record
import org.jooq.Schema
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class Scores(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, ScoresRecord>?,
    aliased: Table<ScoresRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<ScoresRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.scores</code>
         */
        val SCORES: Scores = Scores()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<ScoresRecord> = ScoresRecord::class.java

    /**
     * The column <code>public.scores.id</code>.
     */
    val ID: TableField<ScoresRecord, Int?> = createField(DSL.name("id"), SQLDataType.INTEGER.nullable(false).defaultValue(DSL.field(DSL.raw("nextval('scores_id_seq1'::regclass)"), SQLDataType.INTEGER)), this, "")

    /**
     * The column <code>public.scores.beatmap_id</code>.
     */
    val BEATMAP_ID: TableField<ScoresRecord, Int?> = createField(DSL.name("beatmap_id"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.count_100</code>.
     */
    val COUNT_100: TableField<ScoresRecord, Int?> = createField(DSL.name("count_100"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.count_300</code>.
     */
    val COUNT_300: TableField<ScoresRecord, Int?> = createField(DSL.name("count_300"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.count_50</code>.
     */
    val COUNT_50: TableField<ScoresRecord, Int?> = createField(DSL.name("count_50"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.count_miss</code>.
     */
    val COUNT_MISS: TableField<ScoresRecord, Int?> = createField(DSL.name("count_miss"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.date</code>.
     */
    val DATE: TableField<ScoresRecord, LocalDateTime?> = createField(DSL.name("date"), SQLDataType.LOCALDATETIME(6), this, "")

    /**
     * The column <code>public.scores.max_combo</code>.
     */
    val MAX_COMBO: TableField<ScoresRecord, Int?> = createField(DSL.name("max_combo"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.mods</code>.
     */
    val MODS: TableField<ScoresRecord, Int?> = createField(DSL.name("mods"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.perfect</code>.
     */
    val PERFECT: TableField<ScoresRecord, Boolean?> = createField(DSL.name("perfect"), SQLDataType.BOOLEAN, this, "")

    /**
     * The column <code>public.scores.pp</code>.
     */
    val PP: TableField<ScoresRecord, Double?> = createField(DSL.name("pp"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.rank</code>.
     */
    val RANK: TableField<ScoresRecord, String?> = createField(DSL.name("rank"), SQLDataType.VARCHAR, this, "")

    /**
     * The column <code>public.scores.replay_available</code>.
     */
    val REPLAY_AVAILABLE: TableField<ScoresRecord, Boolean?> = createField(DSL.name("replay_available"), SQLDataType.BOOLEAN, this, "")

    /**
     * The column <code>public.scores.replay_id</code>.
     */
    val REPLAY_ID: TableField<ScoresRecord, Long?> = createField(DSL.name("replay_id"), SQLDataType.BIGINT, this, "")

    /**
     * The column <code>public.scores.score</code>.
     */
    val SCORE: TableField<ScoresRecord, Long?> = createField(DSL.name("score"), SQLDataType.BIGINT, this, "")

    /**
     * The column <code>public.scores.user_id</code>.
     */
    val USER_ID: TableField<ScoresRecord, Long?> = createField(DSL.name("user_id"), SQLDataType.BIGINT, this, "")

    /**
     * The column <code>public.scores.replay</code>.
     */
    val REPLAY: TableField<ScoresRecord, ByteArray?> = createField(DSL.name("replay"), SQLDataType.BLOB, this, "")

    /**
     * The column <code>public.scores.ur</code>.
     */
    val UR: TableField<ScoresRecord, Double?> = createField(DSL.name("ur"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.frametime</code>.
     */
    val FRAMETIME: TableField<ScoresRecord, Double?> = createField(DSL.name("frametime"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.edge_hits</code>.
     */
    val EDGE_HITS: TableField<ScoresRecord, Int?> = createField(DSL.name("edge_hits"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.snaps</code>.
     */
    val SNAPS: TableField<ScoresRecord, Int?> = createField(DSL.name("snaps"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores.is_banned</code>.
     */
    val IS_BANNED: TableField<ScoresRecord, Boolean?> = createField(DSL.name("is_banned"), SQLDataType.BOOLEAN.defaultValue(DSL.field(DSL.raw("false"), SQLDataType.BOOLEAN)), this, "")

    /**
     * The column <code>public.scores.adjusted_ur</code>.
     */
    val ADJUSTED_UR: TableField<ScoresRecord, Double?> = createField(DSL.name("adjusted_ur"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.mean_error</code>.
     */
    val MEAN_ERROR: TableField<ScoresRecord, Double?> = createField(DSL.name("mean_error"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.error_variance</code>.
     */
    val ERROR_VARIANCE: TableField<ScoresRecord, Double?> = createField(DSL.name("error_variance"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.error_standard_deviation</code>.
     */
    val ERROR_STANDARD_DEVIATION: TableField<ScoresRecord, Double?> = createField(DSL.name("error_standard_deviation"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.minimum_error</code>.
     */
    val MINIMUM_ERROR: TableField<ScoresRecord, Double?> = createField(DSL.name("minimum_error"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.maximum_error</code>.
     */
    val MAXIMUM_ERROR: TableField<ScoresRecord, Double?> = createField(DSL.name("maximum_error"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.error_range</code>.
     */
    val ERROR_RANGE: TableField<ScoresRecord, Double?> = createField(DSL.name("error_range"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.error_coefficient_of_variation</code>.
     */
    val ERROR_COEFFICIENT_OF_VARIATION: TableField<ScoresRecord, Double?> = createField(DSL.name("error_coefficient_of_variation"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.error_kurtosis</code>.
     */
    val ERROR_KURTOSIS: TableField<ScoresRecord, Double?> = createField(DSL.name("error_kurtosis"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.error_skewness</code>.
     */
    val ERROR_SKEWNESS: TableField<ScoresRecord, Double?> = createField(DSL.name("error_skewness"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores.sent_discord_notification</code>.
     */
    val SENT_DISCORD_NOTIFICATION: TableField<ScoresRecord, Boolean?> = createField(DSL.name("sent_discord_notification"), SQLDataType.BOOLEAN, this, "")

    /**
     * The column <code>public.scores.added_at</code>.
     */
    val ADDED_AT: TableField<ScoresRecord, OffsetDateTime?> = createField(DSL.name("added_at"), SQLDataType.TIMESTAMPWITHTIMEZONE(6).defaultValue(DSL.field(DSL.raw("CURRENT_TIMESTAMP"), SQLDataType.TIMESTAMPWITHTIMEZONE)), this, "")

    /**
     * The column <code>public.scores.version</code>.
     */
    val VERSION: TableField<ScoresRecord, Int?> = createField(DSL.name("version"), SQLDataType.INTEGER.defaultValue(DSL.field(DSL.raw("0"), SQLDataType.INTEGER)), this, "")

    private constructor(alias: Name, aliased: Table<ScoresRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<ScoresRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.scores</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.scores</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.scores</code> table reference
     */
    constructor(): this(DSL.name("scores"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, ScoresRecord>): this(Internal.createPathAlias(child, key), child, key, SCORES, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
|
||||
override fun getIndexes(): List<Index> = listOf(IDX_SCORES_BEATMAP_ID, IDX_SCORES_BEATMAP_ID_REPLAY_ID, IDX_SCORES_BEATMAP_ID_REPLAY_ID_UR, IDX_SCORES_REPLAY_ID, IDX_SCORES_UR, IDX_SCORES_USER_ID)
|
||||
override fun getPrimaryKey(): UniqueKey<ScoresRecord> = SCORES_PKEY
|
||||
override fun getUniqueKeys(): List<UniqueKey<ScoresRecord>> = listOf(REPLAY_ID_UNIQUE)
|
||||
override fun `as`(alias: String): Scores = Scores(DSL.name(alias), this)
|
||||
override fun `as`(alias: Name): Scores = Scores(alias, this)
|
||||
override fun `as`(alias: Table<*>): Scores = Scores(alias.getQualifiedName(), this)
|
||||
|
||||
/**
|
||||
* Rename this table
|
||||
*/
|
||||
override fun rename(name: String): Scores = Scores(DSL.name(name), null)
|
||||
|
||||
/**
|
||||
* Rename this table
|
||||
*/
|
||||
override fun rename(name: Name): Scores = Scores(name, null)
|
||||
|
||||
/**
|
||||
* Rename this table
|
||||
*/
|
||||
override fun rename(name: Table<*>): Scores = Scores(name.getQualifiedName(), null)
|
||||
}
|
||||
@ -0,0 +1,187 @@
|
||||
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.enums.JudgementType
import com.nisemoe.generated.indexes.IDX_SCORES_JUDGEMENTS_SCORE_ID
import com.nisemoe.generated.keys.SCORES_JUDGEMENTS_PKEY
import com.nisemoe.generated.keys.SCORES_JUDGEMENTS__SCORES_JUDGEMENTS_SCORE_ID_FKEY
import com.nisemoe.generated.tables.records.ScoresJudgementsRecord

import java.util.function.Function

import kotlin.collections.List

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Index
import org.jooq.Name
import org.jooq.Record
import org.jooq.Records
import org.jooq.Row9
import org.jooq.Schema
import org.jooq.SelectField
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class ScoresJudgements(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, ScoresJudgementsRecord>?,
    aliased: Table<ScoresJudgementsRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<ScoresJudgementsRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.scores_judgements</code>
         */
        val SCORES_JUDGEMENTS: ScoresJudgements = ScoresJudgements()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<ScoresJudgementsRecord> = ScoresJudgementsRecord::class.java

    /**
     * The column <code>public.scores_judgements.id</code>.
     */
    val ID: TableField<ScoresJudgementsRecord, Int?> = createField(DSL.name("id"), SQLDataType.INTEGER.nullable(false).defaultValue(DSL.field(DSL.raw("nextval('scores_judgements_id_seq1'::regclass)"), SQLDataType.INTEGER)), this, "")

    /**
     * The column <code>public.scores_judgements.time</code>.
     */
    val TIME: TableField<ScoresJudgementsRecord, Double?> = createField(DSL.name("time"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_judgements.x</code>.
     */
    val X: TableField<ScoresJudgementsRecord, Double?> = createField(DSL.name("x"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_judgements.y</code>.
     */
    val Y: TableField<ScoresJudgementsRecord, Double?> = createField(DSL.name("y"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_judgements.type</code>.
     */
    val TYPE: TableField<ScoresJudgementsRecord, JudgementType?> = createField(DSL.name("type"), SQLDataType.VARCHAR.asEnumDataType(com.nisemoe.generated.enums.JudgementType::class.java), this, "")

    /**
     * The column <code>public.scores_judgements.distance_center</code>.
     */
    val DISTANCE_CENTER: TableField<ScoresJudgementsRecord, Double?> = createField(DSL.name("distance_center"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_judgements.distance_edge</code>.
     */
    val DISTANCE_EDGE: TableField<ScoresJudgementsRecord, Double?> = createField(DSL.name("distance_edge"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_judgements.error</code>.
     */
    val ERROR: TableField<ScoresJudgementsRecord, Double?> = createField(DSL.name("error"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_judgements.score_id</code>.
     */
    val SCORE_ID: TableField<ScoresJudgementsRecord, Int?> = createField(DSL.name("score_id"), SQLDataType.INTEGER, this, "")

    private constructor(alias: Name, aliased: Table<ScoresJudgementsRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<ScoresJudgementsRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.scores_judgements</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.scores_judgements</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.scores_judgements</code> table reference
     */
    constructor(): this(DSL.name("scores_judgements"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, ScoresJudgementsRecord>): this(Internal.createPathAlias(child, key), child, key, SCORES_JUDGEMENTS, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getIndexes(): List<Index> = listOf(IDX_SCORES_JUDGEMENTS_SCORE_ID)
    override fun getPrimaryKey(): UniqueKey<ScoresJudgementsRecord> = SCORES_JUDGEMENTS_PKEY
    override fun getReferences(): List<ForeignKey<ScoresJudgementsRecord, *>> = listOf(SCORES_JUDGEMENTS__SCORES_JUDGEMENTS_SCORE_ID_FKEY)

    private lateinit var _scores: Scores

    /**
     * Get the implicit join path to the <code>public.scores</code> table.
     */
    fun scores(): Scores {
        if (!this::_scores.isInitialized)
            _scores = Scores(this, SCORES_JUDGEMENTS__SCORES_JUDGEMENTS_SCORE_ID_FKEY)

        return _scores
    }

    val scores: Scores
        get(): Scores = scores()
    override fun `as`(alias: String): ScoresJudgements = ScoresJudgements(DSL.name(alias), this)
    override fun `as`(alias: Name): ScoresJudgements = ScoresJudgements(alias, this)
    override fun `as`(alias: Table<*>): ScoresJudgements = ScoresJudgements(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): ScoresJudgements = ScoresJudgements(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): ScoresJudgements = ScoresJudgements(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): ScoresJudgements = ScoresJudgements(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row9 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?> = super.fieldsRow() as Row9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,179 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.indexes.IDX_REPLAY_IDS
import com.nisemoe.generated.indexes.IDX_REPLAY_IDS_PAIRS
import com.nisemoe.generated.keys.SCORES_SIMILARITY_PKEY
import com.nisemoe.generated.keys.UNIQUE_BEATMAP_REPLAY_IDS
import com.nisemoe.generated.tables.records.ScoresSimilarityRecord

import java.time.LocalDateTime
import java.util.function.Function

import kotlin.collections.List

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Index
import org.jooq.Name
import org.jooq.Record
import org.jooq.Records
import org.jooq.Row10
import org.jooq.Schema
import org.jooq.SelectField
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class ScoresSimilarity(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, ScoresSimilarityRecord>?,
    aliased: Table<ScoresSimilarityRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<ScoresSimilarityRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.scores_similarity</code>
         */
        val SCORES_SIMILARITY: ScoresSimilarity = ScoresSimilarity()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<ScoresSimilarityRecord> = ScoresSimilarityRecord::class.java

    /**
     * The column <code>public.scores_similarity.id</code>.
     */
    val ID: TableField<ScoresSimilarityRecord, Int?> = createField(DSL.name("id"), SQLDataType.INTEGER.nullable(false).defaultValue(DSL.field(DSL.raw("nextval('scores_similarity_id_seq1'::regclass)"), SQLDataType.INTEGER)), this, "")

    /**
     * The column <code>public.scores_similarity.beatmap_id</code>.
     */
    val BEATMAP_ID: TableField<ScoresSimilarityRecord, Int?> = createField(DSL.name("beatmap_id"), SQLDataType.INTEGER, this, "")

    /**
     * The column <code>public.scores_similarity.replay_id_1</code>.
     */
    val REPLAY_ID_1: TableField<ScoresSimilarityRecord, Long?> = createField(DSL.name("replay_id_1"), SQLDataType.BIGINT, this, "")

    /**
     * The column <code>public.scores_similarity.replay_id_2</code>.
     */
    val REPLAY_ID_2: TableField<ScoresSimilarityRecord, Long?> = createField(DSL.name("replay_id_2"), SQLDataType.BIGINT, this, "")

    /**
     * The column <code>public.scores_similarity.similarity</code>.
     */
    val SIMILARITY: TableField<ScoresSimilarityRecord, Double?> = createField(DSL.name("similarity"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_similarity.correlation</code>.
     */
    val CORRELATION: TableField<ScoresSimilarityRecord, Double?> = createField(DSL.name("correlation"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_similarity.created_at</code>.
     */
    val CREATED_AT: TableField<ScoresSimilarityRecord, LocalDateTime?> = createField(DSL.name("created_at"), SQLDataType.LOCALDATETIME(6), this, "")

    /**
     * The column
     * <code>public.scores_similarity.sent_discord_notification</code>.
     */
    val SENT_DISCORD_NOTIFICATION: TableField<ScoresSimilarityRecord, Boolean?> = createField(DSL.name("sent_discord_notification"), SQLDataType.BOOLEAN, this, "")

    /**
     * The column <code>public.scores_similarity.cg_similarity</code>.
     */
    val CG_SIMILARITY: TableField<ScoresSimilarityRecord, Double?> = createField(DSL.name("cg_similarity"), SQLDataType.DOUBLE, this, "")

    /**
     * The column <code>public.scores_similarity.cg_correlation</code>.
     */
    val CG_CORRELATION: TableField<ScoresSimilarityRecord, Double?> = createField(DSL.name("cg_correlation"), SQLDataType.DOUBLE, this, "")

    private constructor(alias: Name, aliased: Table<ScoresSimilarityRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<ScoresSimilarityRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.scores_similarity</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.scores_similarity</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.scores_similarity</code> table reference
     */
    constructor(): this(DSL.name("scores_similarity"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, ScoresSimilarityRecord>): this(Internal.createPathAlias(child, key), child, key, SCORES_SIMILARITY, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getIndexes(): List<Index> = listOf(IDX_REPLAY_IDS, IDX_REPLAY_IDS_PAIRS)
    override fun getPrimaryKey(): UniqueKey<ScoresSimilarityRecord> = SCORES_SIMILARITY_PKEY
    override fun getUniqueKeys(): List<UniqueKey<ScoresSimilarityRecord>> = listOf(UNIQUE_BEATMAP_REPLAY_IDS)
    override fun `as`(alias: String): ScoresSimilarity = ScoresSimilarity(DSL.name(alias), this)
    override fun `as`(alias: Name): ScoresSimilarity = ScoresSimilarity(alias, this)
    override fun `as`(alias: Table<*>): ScoresSimilarity = ScoresSimilarity(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): ScoresSimilarity = ScoresSimilarity(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): ScoresSimilarity = ScoresSimilarity(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): ScoresSimilarity = ScoresSimilarity(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row10 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?> = super.fieldsRow() as Row10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,147 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables


import com.nisemoe.generated.Public
import com.nisemoe.generated.keys.UPDATE_USER_QUEUE_PKEY
import com.nisemoe.generated.tables.records.UpdateUserQueueRecord

import java.time.LocalDateTime
import java.util.function.Function

import org.jooq.Field
import org.jooq.ForeignKey
import org.jooq.Identity
import org.jooq.Name
import org.jooq.Record
import org.jooq.Records
import org.jooq.Row5
import org.jooq.Schema
import org.jooq.SelectField
import org.jooq.Table
import org.jooq.TableField
import org.jooq.TableOptions
import org.jooq.UniqueKey
import org.jooq.impl.DSL
import org.jooq.impl.Internal
import org.jooq.impl.SQLDataType
import org.jooq.impl.TableImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class UpdateUserQueue(
    alias: Name,
    child: Table<out Record>?,
    path: ForeignKey<out Record, UpdateUserQueueRecord>?,
    aliased: Table<UpdateUserQueueRecord>?,
    parameters: Array<Field<*>?>?
): TableImpl<UpdateUserQueueRecord>(
    alias,
    Public.PUBLIC,
    child,
    path,
    aliased,
    parameters,
    DSL.comment(""),
    TableOptions.table()
) {
    companion object {

        /**
         * The reference instance of <code>public.update_user_queue</code>
         */
        val UPDATE_USER_QUEUE: UpdateUserQueue = UpdateUserQueue()
    }

    /**
     * The class holding records for this type
     */
    override fun getRecordType(): Class<UpdateUserQueueRecord> = UpdateUserQueueRecord::class.java

    /**
     * The column <code>public.update_user_queue.id</code>.
     */
    val ID: TableField<UpdateUserQueueRecord, Int?> = createField(DSL.name("id"), SQLDataType.INTEGER.nullable(false).identity(true), this, "")

    /**
     * The column <code>public.update_user_queue.user_id</code>.
     */
    val USER_ID: TableField<UpdateUserQueueRecord, Long?> = createField(DSL.name("user_id"), SQLDataType.BIGINT.nullable(false), this, "")

    /**
     * The column <code>public.update_user_queue.processed</code>.
     */
    val PROCESSED: TableField<UpdateUserQueueRecord, Boolean?> = createField(DSL.name("processed"), SQLDataType.BOOLEAN.nullable(false).defaultValue(DSL.field(DSL.raw("false"), SQLDataType.BOOLEAN)), this, "")

    /**
     * The column <code>public.update_user_queue.created_at</code>.
     */
    val CREATED_AT: TableField<UpdateUserQueueRecord, LocalDateTime?> = createField(DSL.name("created_at"), SQLDataType.LOCALDATETIME(6).nullable(false).defaultValue(DSL.field(DSL.raw("CURRENT_TIMESTAMP"), SQLDataType.LOCALDATETIME)), this, "")

    /**
     * The column <code>public.update_user_queue.processed_at</code>.
     */
    val PROCESSED_AT: TableField<UpdateUserQueueRecord, LocalDateTime?> = createField(DSL.name("processed_at"), SQLDataType.LOCALDATETIME(6), this, "")

    private constructor(alias: Name, aliased: Table<UpdateUserQueueRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<UpdateUserQueueRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.update_user_queue</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.update_user_queue</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.update_user_queue</code> table reference
     */
    constructor(): this(DSL.name("update_user_queue"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, UpdateUserQueueRecord>): this(Internal.createPathAlias(child, key), child, key, UPDATE_USER_QUEUE, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getIdentity(): Identity<UpdateUserQueueRecord, Int?> = super.getIdentity() as Identity<UpdateUserQueueRecord, Int?>
    override fun getPrimaryKey(): UniqueKey<UpdateUserQueueRecord> = UPDATE_USER_QUEUE_PKEY
    override fun `as`(alias: String): UpdateUserQueue = UpdateUserQueue(DSL.name(alias), this)
    override fun `as`(alias: Name): UpdateUserQueue = UpdateUserQueue(alias, this)
    override fun `as`(alias: Table<*>): UpdateUserQueue = UpdateUserQueue(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): UpdateUserQueue = UpdateUserQueue(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): UpdateUserQueue = UpdateUserQueue(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): UpdateUserQueue = UpdateUserQueue(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row5 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?> = super.fieldsRow() as Row5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,200 @@
/*
|
||||
* This file is generated by jOOQ.
|
||||
*/
|
||||
package com.nisemoe.generated.tables
|
||||
|
||||
|
||||
import com.nisemoe.generated.Public
|
||||
import com.nisemoe.generated.keys.USERS_PKEY
|
||||
import com.nisemoe.generated.tables.records.UsersRecord
|
||||
|
||||
import java.time.LocalDateTime
|
||||
import java.util.function.Function
|
||||
|
||||
import org.jooq.Field
|
||||
import org.jooq.ForeignKey
|
||||
import org.jooq.Name
|
||||
import org.jooq.Record
|
||||
import org.jooq.Records
|
||||
import org.jooq.Row16
|
||||
import org.jooq.Schema
|
||||
import org.jooq.SelectField
|
||||
import org.jooq.Table
|
||||
import org.jooq.TableField
|
||||
import org.jooq.TableOptions
|
||||
import org.jooq.UniqueKey
|
||||
import org.jooq.impl.DSL
|
||||
import org.jooq.impl.Internal
|
||||
import org.jooq.impl.SQLDataType
|
||||
import org.jooq.impl.TableImpl
|
||||
|
||||
|
||||
/**
|
||||
* This class is generated by jOOQ.
|
||||
*/
|
||||
@Suppress("UNCHECKED_CAST")
|
||||
open class Users(
|
||||
alias: Name,
|
||||
child: Table<out Record>?,
|
||||
path: ForeignKey<out Record, UsersRecord>?,
|
||||
aliased: Table<UsersRecord>?,
|
||||
parameters: Array<Field<*>?>?
|
||||
): TableImpl<UsersRecord>(
|
||||
alias,
|
||||
Public.PUBLIC,
|
||||
child,
|
||||
path,
|
||||
aliased,
|
||||
parameters,
|
||||
DSL.comment(""),
|
||||
TableOptions.table()
|
||||
) {
|
||||
companion object {
|
||||
|
||||
/**
|
||||
* The reference instance of <code>public.users</code>
|
||||
*/
|
||||
val USERS: Users = Users()
|
||||
}
|
||||
|
||||
/**
|
||||
* The class holding records for this type
|
||||
*/
|
||||
override fun getRecordType(): Class<UsersRecord> = UsersRecord::class.java
|
||||
|
||||
/**
|
||||
* The column <code>public.users.user_id</code>.
|
||||
*/
|
||||
val USER_ID: TableField<UsersRecord, Long?> = createField(DSL.name("user_id"), SQLDataType.BIGINT.nullable(false).defaultValue(DSL.field(DSL.raw("nextval('users_user_id_seq1'::regclass)"), SQLDataType.BIGINT)), this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.username</code>.
|
||||
*/
|
||||
val USERNAME: TableField<UsersRecord, String?> = createField(DSL.name("username"), SQLDataType.VARCHAR, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.join_date</code>.
|
||||
*/
|
||||
val JOIN_DATE: TableField<UsersRecord, LocalDateTime?> = createField(DSL.name("join_date"), SQLDataType.LOCALDATETIME(6), this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.country</code>.
|
||||
*/
|
||||
val COUNTRY: TableField<UsersRecord, String?> = createField(DSL.name("country"), SQLDataType.VARCHAR, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.country_rank</code>.
|
||||
*/
|
||||
val COUNTRY_RANK: TableField<UsersRecord, Long?> = createField(DSL.name("country_rank"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.rank</code>.
|
||||
*/
|
||||
val RANK: TableField<UsersRecord, Long?> = createField(DSL.name("rank"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.pp_raw</code>.
|
||||
*/
|
||||
val PP_RAW: TableField<UsersRecord, Double?> = createField(DSL.name("pp_raw"), SQLDataType.DOUBLE, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.accuracy</code>.
|
||||
*/
|
||||
val ACCURACY: TableField<UsersRecord, Double?> = createField(DSL.name("accuracy"), SQLDataType.DOUBLE, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.playcount</code>.
|
||||
*/
|
||||
val PLAYCOUNT: TableField<UsersRecord, Long?> = createField(DSL.name("playcount"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.total_score</code>.
|
||||
*/
|
||||
val TOTAL_SCORE: TableField<UsersRecord, Long?> = createField(DSL.name("total_score"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.ranked_score</code>.
|
||||
*/
|
||||
val RANKED_SCORE: TableField<UsersRecord, Long?> = createField(DSL.name("ranked_score"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.seconds_played</code>.
|
||||
*/
|
||||
val SECONDS_PLAYED: TableField<UsersRecord, Long?> = createField(DSL.name("seconds_played"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.count_100</code>.
|
||||
*/
|
||||
val COUNT_100: TableField<UsersRecord, Long?> = createField(DSL.name("count_100"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.count_300</code>.
|
||||
*/
|
||||
val COUNT_300: TableField<UsersRecord, Long?> = createField(DSL.name("count_300"), SQLDataType.BIGINT, this, "")
|
||||
|
||||
/**
|
||||
* The column <code>public.users.count_50</code>.
|
||||
*/
|
||||
    val COUNT_50: TableField<UsersRecord, Long?> = createField(DSL.name("count_50"), SQLDataType.BIGINT, this, "")

    /**
     * The column <code>public.users.sys_last_update</code>.
     */
    val SYS_LAST_UPDATE: TableField<UsersRecord, LocalDateTime?> = createField(DSL.name("sys_last_update"), SQLDataType.LOCALDATETIME(6), this, "")

    private constructor(alias: Name, aliased: Table<UsersRecord>?): this(alias, null, null, aliased, null)
    private constructor(alias: Name, aliased: Table<UsersRecord>?, parameters: Array<Field<*>?>?): this(alias, null, null, aliased, parameters)

    /**
     * Create an aliased <code>public.users</code> table reference
     */
    constructor(alias: String): this(DSL.name(alias))

    /**
     * Create an aliased <code>public.users</code> table reference
     */
    constructor(alias: Name): this(alias, null)

    /**
     * Create a <code>public.users</code> table reference
     */
    constructor(): this(DSL.name("users"), null)

    constructor(child: Table<out Record>, key: ForeignKey<out Record, UsersRecord>): this(Internal.createPathAlias(child, key), child, key, USERS, null)
    override fun getSchema(): Schema? = if (aliased()) null else Public.PUBLIC
    override fun getPrimaryKey(): UniqueKey<UsersRecord> = USERS_PKEY
    override fun `as`(alias: String): Users = Users(DSL.name(alias), this)
    override fun `as`(alias: Name): Users = Users(alias, this)
    override fun `as`(alias: Table<*>): Users = Users(alias.getQualifiedName(), this)

    /**
     * Rename this table
     */
    override fun rename(name: String): Users = Users(DSL.name(name), null)

    /**
     * Rename this table
     */
    override fun rename(name: Name): Users = Users(name, null)

    /**
     * Rename this table
     */
    override fun rename(name: Table<*>): Users = Users(name.getQualifiedName(), null)

    // -------------------------------------------------------------------------
    // Row16 type methods
    // -------------------------------------------------------------------------
    override fun fieldsRow(): Row16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?> = super.fieldsRow() as Row16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?>

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Function)}.
     */
    fun <U> mapping(from: (Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?) -> U): SelectField<U> = convertFrom(Records.mapping(from))

    /**
     * Convenience mapping calling {@link SelectField#convertFrom(Class,
     * Function)}.
     */
    fun <U> mapping(toType: Class<U>, from: (Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?) -> U): SelectField<U> = convertFrom(toType, Records.mapping(from))
}
@ -0,0 +1,173 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.Beatmaps

import java.time.LocalDateTime

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record9
import org.jooq.Row9
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class BeatmapsRecord private constructor() : UpdatableRecordImpl<BeatmapsRecord>(Beatmaps.BEATMAPS), Record9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?> {

    open var beatmapId: Int?
        set(value): Unit = set(0, value)
        get(): Int? = get(0) as Int?

    open var artist: String?
        set(value): Unit = set(1, value)
        get(): String? = get(1) as String?

    open var beatmapsetId: Int?
        set(value): Unit = set(2, value)
        get(): Int? = get(2) as Int?

    open var creator: String?
        set(value): Unit = set(3, value)
        get(): String? = get(3) as String?

    open var source: String?
        set(value): Unit = set(4, value)
        get(): String? = get(4) as String?

    open var starRating: Double?
        set(value): Unit = set(5, value)
        get(): Double? = get(5) as Double?

    open var title: String?
        set(value): Unit = set(6, value)
        get(): String? = get(6) as String?

    open var version: String?
        set(value): Unit = set(7, value)
        get(): String? = get(7) as String?

    open var sysLastUpdate: LocalDateTime?
        set(value): Unit = set(8, value)
        get(): LocalDateTime? = get(8) as LocalDateTime?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Int?> = super.key() as Record1<Int?>

    // -------------------------------------------------------------------------
    // Record9 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?> = super.fieldsRow() as Row9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?>
    override fun valuesRow(): Row9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?> = super.valuesRow() as Row9<Int?, String?, Int?, String?, String?, Double?, String?, String?, LocalDateTime?>
    override fun field1(): Field<Int?> = Beatmaps.BEATMAPS.BEATMAP_ID
    override fun field2(): Field<String?> = Beatmaps.BEATMAPS.ARTIST
    override fun field3(): Field<Int?> = Beatmaps.BEATMAPS.BEATMAPSET_ID
    override fun field4(): Field<String?> = Beatmaps.BEATMAPS.CREATOR
    override fun field5(): Field<String?> = Beatmaps.BEATMAPS.SOURCE
    override fun field6(): Field<Double?> = Beatmaps.BEATMAPS.STAR_RATING
    override fun field7(): Field<String?> = Beatmaps.BEATMAPS.TITLE
    override fun field8(): Field<String?> = Beatmaps.BEATMAPS.VERSION
    override fun field9(): Field<LocalDateTime?> = Beatmaps.BEATMAPS.SYS_LAST_UPDATE
    override fun component1(): Int? = beatmapId
    override fun component2(): String? = artist
    override fun component3(): Int? = beatmapsetId
    override fun component4(): String? = creator
    override fun component5(): String? = source
    override fun component6(): Double? = starRating
    override fun component7(): String? = title
    override fun component8(): String? = version
    override fun component9(): LocalDateTime? = sysLastUpdate
    override fun value1(): Int? = beatmapId
    override fun value2(): String? = artist
    override fun value3(): Int? = beatmapsetId
    override fun value4(): String? = creator
    override fun value5(): String? = source
    override fun value6(): Double? = starRating
    override fun value7(): String? = title
    override fun value8(): String? = version
    override fun value9(): LocalDateTime? = sysLastUpdate

    override fun value1(value: Int?): BeatmapsRecord {
        set(0, value)
        return this
    }

    override fun value2(value: String?): BeatmapsRecord {
        set(1, value)
        return this
    }

    override fun value3(value: Int?): BeatmapsRecord {
        set(2, value)
        return this
    }

    override fun value4(value: String?): BeatmapsRecord {
        set(3, value)
        return this
    }

    override fun value5(value: String?): BeatmapsRecord {
        set(4, value)
        return this
    }

    override fun value6(value: Double?): BeatmapsRecord {
        set(5, value)
        return this
    }

    override fun value7(value: String?): BeatmapsRecord {
        set(6, value)
        return this
    }

    override fun value8(value: String?): BeatmapsRecord {
        set(7, value)
        return this
    }

    override fun value9(value: LocalDateTime?): BeatmapsRecord {
        set(8, value)
        return this
    }

    override fun values(value1: Int?, value2: String?, value3: Int?, value4: String?, value5: String?, value6: Double?, value7: String?, value8: String?, value9: LocalDateTime?): BeatmapsRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        this.value6(value6)
        this.value7(value7)
        this.value8(value8)
        this.value9(value9)
        return this
    }

    /**
     * Create a detached, initialised BeatmapsRecord
     */
    constructor(beatmapId: Int? = null, artist: String? = null, beatmapsetId: Int? = null, creator: String? = null, source: String? = null, starRating: Double? = null, title: String? = null, version: String? = null, sysLastUpdate: LocalDateTime? = null): this() {
        this.beatmapId = beatmapId
        this.artist = artist
        this.beatmapsetId = beatmapsetId
        this.creator = creator
        this.source = source
        this.starRating = starRating
        this.title = title
        this.version = version
        this.sysLastUpdate = sysLastUpdate
        resetChangedOnNotNull()
    }
}
@ -0,0 +1,187 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.FlywaySchemaHistory

import java.time.LocalDateTime

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record10
import org.jooq.Row10
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class FlywaySchemaHistoryRecord private constructor() : UpdatableRecordImpl<FlywaySchemaHistoryRecord>(FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY), Record10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?> {

    open var installedRank: Int
        set(value): Unit = set(0, value)
        get(): Int = get(0) as Int

    open var version: String?
        set(value): Unit = set(1, value)
        get(): String? = get(1) as String?

    open var description: String
        set(value): Unit = set(2, value)
        get(): String = get(2) as String

    open var type: String
        set(value): Unit = set(3, value)
        get(): String = get(3) as String

    open var script: String
        set(value): Unit = set(4, value)
        get(): String = get(4) as String

    open var checksum: Int?
        set(value): Unit = set(5, value)
        get(): Int? = get(5) as Int?

    open var installedBy: String
        set(value): Unit = set(6, value)
        get(): String = get(6) as String

    open var installedOn: LocalDateTime?
        set(value): Unit = set(7, value)
        get(): LocalDateTime? = get(7) as LocalDateTime?

    open var executionTime: Int
        set(value): Unit = set(8, value)
        get(): Int = get(8) as Int

    open var success: Boolean
        set(value): Unit = set(9, value)
        get(): Boolean = get(9) as Boolean

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Int?> = super.key() as Record1<Int?>

    // -------------------------------------------------------------------------
    // Record10 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?> = super.fieldsRow() as Row10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?>
    override fun valuesRow(): Row10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?> = super.valuesRow() as Row10<Int?, String?, String?, String?, String?, Int?, String?, LocalDateTime?, Int?, Boolean?>
    override fun field1(): Field<Int?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.INSTALLED_RANK
    override fun field2(): Field<String?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.VERSION
    override fun field3(): Field<String?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.DESCRIPTION
    override fun field4(): Field<String?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.TYPE
    override fun field5(): Field<String?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.SCRIPT
    override fun field6(): Field<Int?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.CHECKSUM
    override fun field7(): Field<String?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.INSTALLED_BY
    override fun field8(): Field<LocalDateTime?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.INSTALLED_ON
    override fun field9(): Field<Int?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.EXECUTION_TIME
    override fun field10(): Field<Boolean?> = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY.SUCCESS
    override fun component1(): Int = installedRank
    override fun component2(): String? = version
    override fun component3(): String = description
    override fun component4(): String = type
    override fun component5(): String = script
    override fun component6(): Int? = checksum
    override fun component7(): String = installedBy
    override fun component8(): LocalDateTime? = installedOn
    override fun component9(): Int = executionTime
    override fun component10(): Boolean = success
    override fun value1(): Int = installedRank
    override fun value2(): String? = version
    override fun value3(): String = description
    override fun value4(): String = type
    override fun value5(): String = script
    override fun value6(): Int? = checksum
    override fun value7(): String = installedBy
    override fun value8(): LocalDateTime? = installedOn
    override fun value9(): Int = executionTime
    override fun value10(): Boolean = success

    override fun value1(value: Int?): FlywaySchemaHistoryRecord {
        set(0, value)
        return this
    }

    override fun value2(value: String?): FlywaySchemaHistoryRecord {
        set(1, value)
        return this
    }

    override fun value3(value: String?): FlywaySchemaHistoryRecord {
        set(2, value)
        return this
    }

    override fun value4(value: String?): FlywaySchemaHistoryRecord {
        set(3, value)
        return this
    }

    override fun value5(value: String?): FlywaySchemaHistoryRecord {
        set(4, value)
        return this
    }

    override fun value6(value: Int?): FlywaySchemaHistoryRecord {
        set(5, value)
        return this
    }

    override fun value7(value: String?): FlywaySchemaHistoryRecord {
        set(6, value)
        return this
    }

    override fun value8(value: LocalDateTime?): FlywaySchemaHistoryRecord {
        set(7, value)
        return this
    }

    override fun value9(value: Int?): FlywaySchemaHistoryRecord {
        set(8, value)
        return this
    }

    override fun value10(value: Boolean?): FlywaySchemaHistoryRecord {
        set(9, value)
        return this
    }

    override fun values(value1: Int?, value2: String?, value3: String?, value4: String?, value5: String?, value6: Int?, value7: String?, value8: LocalDateTime?, value9: Int?, value10: Boolean?): FlywaySchemaHistoryRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        this.value6(value6)
        this.value7(value7)
        this.value8(value8)
        this.value9(value9)
        this.value10(value10)
        return this
    }

    /**
     * Create a detached, initialised FlywaySchemaHistoryRecord
     */
    constructor(installedRank: Int, version: String? = null, description: String, type: String, script: String, checksum: Int? = null, installedBy: String, installedOn: LocalDateTime? = null, executionTime: Int, success: Boolean): this() {
        this.installedRank = installedRank
        this.version = version
        this.description = description
        this.type = type
        this.script = script
        this.checksum = checksum
        this.installedBy = installedBy
        this.installedOn = installedOn
        this.executionTime = executionTime
        this.success = success
        resetChangedOnNotNull()
    }
}
@ -0,0 +1,117 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.RedditPost

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record5
import org.jooq.Row5
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class RedditPostRecord private constructor() : UpdatableRecordImpl<RedditPostRecord>(RedditPost.REDDIT_POST), Record5<String?, String?, Double?, String?, Boolean?> {

    open var postId: String
        set(value): Unit = set(0, value)
        get(): String = get(0) as String

    open var title: String?
        set(value): Unit = set(1, value)
        get(): String? = get(1) as String?

    open var createdUtc: Double?
        set(value): Unit = set(2, value)
        get(): Double? = get(2) as Double?

    open var url: String?
        set(value): Unit = set(3, value)
        get(): String? = get(3) as String?

    @Suppress("INAPPLICABLE_JVM_NAME")
    @set:JvmName("setIsChecked")
    open var isChecked: Boolean?
        set(value): Unit = set(4, value)
        get(): Boolean? = get(4) as Boolean?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<String?> = super.key() as Record1<String?>

    // -------------------------------------------------------------------------
    // Record5 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row5<String?, String?, Double?, String?, Boolean?> = super.fieldsRow() as Row5<String?, String?, Double?, String?, Boolean?>
    override fun valuesRow(): Row5<String?, String?, Double?, String?, Boolean?> = super.valuesRow() as Row5<String?, String?, Double?, String?, Boolean?>
    override fun field1(): Field<String?> = RedditPost.REDDIT_POST.POST_ID
    override fun field2(): Field<String?> = RedditPost.REDDIT_POST.TITLE
    override fun field3(): Field<Double?> = RedditPost.REDDIT_POST.CREATED_UTC
    override fun field4(): Field<String?> = RedditPost.REDDIT_POST.URL
    override fun field5(): Field<Boolean?> = RedditPost.REDDIT_POST.IS_CHECKED
    override fun component1(): String = postId
    override fun component2(): String? = title
    override fun component3(): Double? = createdUtc
    override fun component4(): String? = url
    override fun component5(): Boolean? = isChecked
    override fun value1(): String = postId
    override fun value2(): String? = title
    override fun value3(): Double? = createdUtc
    override fun value4(): String? = url
    override fun value5(): Boolean? = isChecked

    override fun value1(value: String?): RedditPostRecord {
        set(0, value)
        return this
    }

    override fun value2(value: String?): RedditPostRecord {
        set(1, value)
        return this
    }

    override fun value3(value: Double?): RedditPostRecord {
        set(2, value)
        return this
    }

    override fun value4(value: String?): RedditPostRecord {
        set(3, value)
        return this
    }

    override fun value5(value: Boolean?): RedditPostRecord {
        set(4, value)
        return this
    }

    override fun values(value1: String?, value2: String?, value3: Double?, value4: String?, value5: Boolean?): RedditPostRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        return this
    }

    /**
     * Create a detached, initialised RedditPostRecord
     */
    constructor(postId: String, title: String? = null, createdUtc: Double? = null, url: String? = null, isChecked: Boolean? = null): this() {
        this.postId = postId
        this.title = title
        this.createdUtc = createdUtc
        this.url = url
        this.isChecked = isChecked
        resetChangedOnNotNull()
    }
}
@ -0,0 +1,172 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.enums.JudgementType
import com.nisemoe.generated.tables.ScoresJudgements

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record9
import org.jooq.Row9
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class ScoresJudgementsRecord private constructor() : UpdatableRecordImpl<ScoresJudgementsRecord>(ScoresJudgements.SCORES_JUDGEMENTS), Record9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?> {

    open var id: Int?
        set(value): Unit = set(0, value)
        get(): Int? = get(0) as Int?

    open var time: Double?
        set(value): Unit = set(1, value)
        get(): Double? = get(1) as Double?

    open var x: Double?
        set(value): Unit = set(2, value)
        get(): Double? = get(2) as Double?

    open var y: Double?
        set(value): Unit = set(3, value)
        get(): Double? = get(3) as Double?

    open var type: JudgementType?
        set(value): Unit = set(4, value)
        get(): JudgementType? = get(4) as JudgementType?

    open var distanceCenter: Double?
        set(value): Unit = set(5, value)
        get(): Double? = get(5) as Double?

    open var distanceEdge: Double?
        set(value): Unit = set(6, value)
        get(): Double? = get(6) as Double?

    open var error: Double?
        set(value): Unit = set(7, value)
        get(): Double? = get(7) as Double?

    open var scoreId: Int?
        set(value): Unit = set(8, value)
        get(): Int? = get(8) as Int?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Int?> = super.key() as Record1<Int?>

    // -------------------------------------------------------------------------
    // Record9 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?> = super.fieldsRow() as Row9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?>
    override fun valuesRow(): Row9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?> = super.valuesRow() as Row9<Int?, Double?, Double?, Double?, JudgementType?, Double?, Double?, Double?, Int?>
    override fun field1(): Field<Int?> = ScoresJudgements.SCORES_JUDGEMENTS.ID
    override fun field2(): Field<Double?> = ScoresJudgements.SCORES_JUDGEMENTS.TIME
    override fun field3(): Field<Double?> = ScoresJudgements.SCORES_JUDGEMENTS.X
    override fun field4(): Field<Double?> = ScoresJudgements.SCORES_JUDGEMENTS.Y
    override fun field5(): Field<JudgementType?> = ScoresJudgements.SCORES_JUDGEMENTS.TYPE
    override fun field6(): Field<Double?> = ScoresJudgements.SCORES_JUDGEMENTS.DISTANCE_CENTER
    override fun field7(): Field<Double?> = ScoresJudgements.SCORES_JUDGEMENTS.DISTANCE_EDGE
    override fun field8(): Field<Double?> = ScoresJudgements.SCORES_JUDGEMENTS.ERROR
    override fun field9(): Field<Int?> = ScoresJudgements.SCORES_JUDGEMENTS.SCORE_ID
    override fun component1(): Int? = id
    override fun component2(): Double? = time
    override fun component3(): Double? = x
    override fun component4(): Double? = y
    override fun component5(): JudgementType? = type
    override fun component6(): Double? = distanceCenter
    override fun component7(): Double? = distanceEdge
    override fun component8(): Double? = error
    override fun component9(): Int? = scoreId
    override fun value1(): Int? = id
    override fun value2(): Double? = time
    override fun value3(): Double? = x
    override fun value4(): Double? = y
    override fun value5(): JudgementType? = type
    override fun value6(): Double? = distanceCenter
    override fun value7(): Double? = distanceEdge
    override fun value8(): Double? = error
    override fun value9(): Int? = scoreId

    override fun value1(value: Int?): ScoresJudgementsRecord {
        set(0, value)
        return this
    }

    override fun value2(value: Double?): ScoresJudgementsRecord {
        set(1, value)
        return this
    }

    override fun value3(value: Double?): ScoresJudgementsRecord {
        set(2, value)
        return this
    }

    override fun value4(value: Double?): ScoresJudgementsRecord {
        set(3, value)
        return this
    }

    override fun value5(value: JudgementType?): ScoresJudgementsRecord {
        set(4, value)
        return this
    }

    override fun value6(value: Double?): ScoresJudgementsRecord {
        set(5, value)
        return this
    }

    override fun value7(value: Double?): ScoresJudgementsRecord {
        set(6, value)
        return this
    }

    override fun value8(value: Double?): ScoresJudgementsRecord {
        set(7, value)
        return this
    }

    override fun value9(value: Int?): ScoresJudgementsRecord {
        set(8, value)
        return this
    }

    override fun values(value1: Int?, value2: Double?, value3: Double?, value4: Double?, value5: JudgementType?, value6: Double?, value7: Double?, value8: Double?, value9: Int?): ScoresJudgementsRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        this.value6(value6)
        this.value7(value7)
        this.value8(value8)
        this.value9(value9)
        return this
    }

    /**
     * Create a detached, initialised ScoresJudgementsRecord
     */
    constructor(id: Int? = null, time: Double? = null, x: Double? = null, y: Double? = null, type: JudgementType? = null, distanceCenter: Double? = null, distanceEdge: Double? = null, error: Double? = null, scoreId: Int? = null): this() {
        this.id = id
        this.time = time
        this.x = x
        this.y = y
        this.type = type
        this.distanceCenter = distanceCenter
        this.distanceEdge = distanceEdge
        this.error = error
        this.scoreId = scoreId
        resetChangedOnNotNull()
    }
}
@ -0,0 +1,211 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.Scores

import java.time.LocalDateTime
import java.time.OffsetDateTime

import org.jooq.Record1
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class ScoresRecord private constructor() : UpdatableRecordImpl<ScoresRecord>(Scores.SCORES) {

    open var id: Int?
        set(value): Unit = set(0, value)
        get(): Int? = get(0) as Int?

    open var beatmapId: Int?
        set(value): Unit = set(1, value)
        get(): Int? = get(1) as Int?

    open var count_100: Int?
        set(value): Unit = set(2, value)
        get(): Int? = get(2) as Int?

    open var count_300: Int?
        set(value): Unit = set(3, value)
        get(): Int? = get(3) as Int?

    open var count_50: Int?
        set(value): Unit = set(4, value)
        get(): Int? = get(4) as Int?

    open var countMiss: Int?
        set(value): Unit = set(5, value)
        get(): Int? = get(5) as Int?

    open var date: LocalDateTime?
        set(value): Unit = set(6, value)
        get(): LocalDateTime? = get(6) as LocalDateTime?

    open var maxCombo: Int?
        set(value): Unit = set(7, value)
        get(): Int? = get(7) as Int?

    open var mods: Int?
        set(value): Unit = set(8, value)
        get(): Int? = get(8) as Int?

    open var perfect: Boolean?
        set(value): Unit = set(9, value)
        get(): Boolean? = get(9) as Boolean?

    open var pp: Double?
        set(value): Unit = set(10, value)
        get(): Double? = get(10) as Double?

    open var rank: String?
        set(value): Unit = set(11, value)
        get(): String? = get(11) as String?

    open var replayAvailable: Boolean?
        set(value): Unit = set(12, value)
        get(): Boolean? = get(12) as Boolean?

    open var replayId: Long?
        set(value): Unit = set(13, value)
        get(): Long? = get(13) as Long?

    open var score: Long?
        set(value): Unit = set(14, value)
        get(): Long? = get(14) as Long?

    open var userId: Long?
        set(value): Unit = set(15, value)
        get(): Long? = get(15) as Long?

    open var replay: ByteArray?
        set(value): Unit = set(16, value)
        get(): ByteArray? = get(16) as ByteArray?

    open var ur: Double?
        set(value): Unit = set(17, value)
        get(): Double? = get(17) as Double?

    open var frametime: Double?
        set(value): Unit = set(18, value)
        get(): Double? = get(18) as Double?

    open var edgeHits: Int?
        set(value): Unit = set(19, value)
        get(): Int? = get(19) as Int?

    open var snaps: Int?
        set(value): Unit = set(20, value)
        get(): Int? = get(20) as Int?

    @Suppress("INAPPLICABLE_JVM_NAME")
    @set:JvmName("setIsBanned")
    open var isBanned: Boolean?
        set(value): Unit = set(21, value)
        get(): Boolean? = get(21) as Boolean?

    open var adjustedUr: Double?
        set(value): Unit = set(22, value)
        get(): Double? = get(22) as Double?

    open var meanError: Double?
        set(value): Unit = set(23, value)
        get(): Double? = get(23) as Double?

    open var errorVariance: Double?
        set(value): Unit = set(24, value)
        get(): Double? = get(24) as Double?

    open var errorStandardDeviation: Double?
        set(value): Unit = set(25, value)
        get(): Double? = get(25) as Double?

    open var minimumError: Double?
        set(value): Unit = set(26, value)
        get(): Double? = get(26) as Double?

    open var maximumError: Double?
        set(value): Unit = set(27, value)
        get(): Double? = get(27) as Double?

    open var errorRange: Double?
        set(value): Unit = set(28, value)
        get(): Double? = get(28) as Double?

    open var errorCoefficientOfVariation: Double?
        set(value): Unit = set(29, value)
        get(): Double? = get(29) as Double?

    open var errorKurtosis: Double?
        set(value): Unit = set(30, value)
        get(): Double? = get(30) as Double?

    open var errorSkewness: Double?
        set(value): Unit = set(31, value)
        get(): Double? = get(31) as Double?

    open var sentDiscordNotification: Boolean?
        set(value): Unit = set(32, value)
        get(): Boolean? = get(32) as Boolean?

    open var addedAt: OffsetDateTime?
        set(value): Unit = set(33, value)
        get(): OffsetDateTime? = get(33) as OffsetDateTime?

    open var version: Int?
        set(value): Unit = set(34, value)
        get(): Int? = get(34) as Int?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Int?> = super.key() as Record1<Int?>

    /**
     * Create a detached, initialised ScoresRecord
     */
    constructor(id: Int? = null, beatmapId: Int? = null, count_100: Int? = null, count_300: Int? = null, count_50: Int? = null, countMiss: Int? = null, date: LocalDateTime? = null, maxCombo: Int? = null, mods: Int? = null, perfect: Boolean? = null, pp: Double? = null, rank: String? = null, replayAvailable: Boolean? = null, replayId: Long? = null, score: Long? = null, userId: Long? = null, replay: ByteArray? = null, ur: Double? = null, frametime: Double? = null, edgeHits: Int? = null, snaps: Int? = null, isBanned: Boolean? = null, adjustedUr: Double? = null, meanError: Double? = null, errorVariance: Double? = null, errorStandardDeviation: Double? = null, minimumError: Double? = null, maximumError: Double? = null, errorRange: Double? = null, errorCoefficientOfVariation: Double? = null, errorKurtosis: Double? = null, errorSkewness: Double? = null, sentDiscordNotification: Boolean? = null, addedAt: OffsetDateTime? = null, version: Int? = null): this() {
        this.id = id
        this.beatmapId = beatmapId
        this.count_100 = count_100
        this.count_300 = count_300
        this.count_50 = count_50
        this.countMiss = countMiss
        this.date = date
        this.maxCombo = maxCombo
        this.mods = mods
        this.perfect = perfect
        this.pp = pp
        this.rank = rank
        this.replayAvailable = replayAvailable
        this.replayId = replayId
        this.score = score
        this.userId = userId
        this.replay = replay
        this.ur = ur
        this.frametime = frametime
        this.edgeHits = edgeHits
        this.snaps = snaps
        this.isBanned = isBanned
        this.adjustedUr = adjustedUr
        this.meanError = meanError
        this.errorVariance = errorVariance
        this.errorStandardDeviation = errorStandardDeviation
        this.minimumError = minimumError
        this.maximumError = maximumError
        this.errorRange = errorRange
        this.errorCoefficientOfVariation = errorCoefficientOfVariation
        this.errorKurtosis = errorKurtosis
        this.errorSkewness = errorSkewness
        this.sentDiscordNotification = sentDiscordNotification
        this.addedAt = addedAt
        this.version = version
        resetChangedOnNotNull()
    }
}
@@ -0,0 +1,187 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.ScoresSimilarity

import java.time.LocalDateTime

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record10
import org.jooq.Row10
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class ScoresSimilarityRecord private constructor() : UpdatableRecordImpl<ScoresSimilarityRecord>(ScoresSimilarity.SCORES_SIMILARITY), Record10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?> {

    open var id: Int?
        set(value): Unit = set(0, value)
        get(): Int? = get(0) as Int?

    open var beatmapId: Int?
        set(value): Unit = set(1, value)
        get(): Int? = get(1) as Int?

    open var replayId_1: Long?
        set(value): Unit = set(2, value)
        get(): Long? = get(2) as Long?

    open var replayId_2: Long?
        set(value): Unit = set(3, value)
        get(): Long? = get(3) as Long?

    open var similarity: Double?
        set(value): Unit = set(4, value)
        get(): Double? = get(4) as Double?

    open var correlation: Double?
        set(value): Unit = set(5, value)
        get(): Double? = get(5) as Double?

    open var createdAt: LocalDateTime?
        set(value): Unit = set(6, value)
        get(): LocalDateTime? = get(6) as LocalDateTime?

    open var sentDiscordNotification: Boolean?
        set(value): Unit = set(7, value)
        get(): Boolean? = get(7) as Boolean?

    open var cgSimilarity: Double?
        set(value): Unit = set(8, value)
        get(): Double? = get(8) as Double?

    open var cgCorrelation: Double?
        set(value): Unit = set(9, value)
        get(): Double? = get(9) as Double?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Int?> = super.key() as Record1<Int?>

    // -------------------------------------------------------------------------
    // Record10 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?> = super.fieldsRow() as Row10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?>
    override fun valuesRow(): Row10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?> = super.valuesRow() as Row10<Int?, Int?, Long?, Long?, Double?, Double?, LocalDateTime?, Boolean?, Double?, Double?>
    override fun field1(): Field<Int?> = ScoresSimilarity.SCORES_SIMILARITY.ID
    override fun field2(): Field<Int?> = ScoresSimilarity.SCORES_SIMILARITY.BEATMAP_ID
    override fun field3(): Field<Long?> = ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_1
    override fun field4(): Field<Long?> = ScoresSimilarity.SCORES_SIMILARITY.REPLAY_ID_2
    override fun field5(): Field<Double?> = ScoresSimilarity.SCORES_SIMILARITY.SIMILARITY
    override fun field6(): Field<Double?> = ScoresSimilarity.SCORES_SIMILARITY.CORRELATION
    override fun field7(): Field<LocalDateTime?> = ScoresSimilarity.SCORES_SIMILARITY.CREATED_AT
    override fun field8(): Field<Boolean?> = ScoresSimilarity.SCORES_SIMILARITY.SENT_DISCORD_NOTIFICATION
    override fun field9(): Field<Double?> = ScoresSimilarity.SCORES_SIMILARITY.CG_SIMILARITY
    override fun field10(): Field<Double?> = ScoresSimilarity.SCORES_SIMILARITY.CG_CORRELATION
    override fun component1(): Int? = id
    override fun component2(): Int? = beatmapId
    override fun component3(): Long? = replayId_1
    override fun component4(): Long? = replayId_2
    override fun component5(): Double? = similarity
    override fun component6(): Double? = correlation
    override fun component7(): LocalDateTime? = createdAt
    override fun component8(): Boolean? = sentDiscordNotification
    override fun component9(): Double? = cgSimilarity
    override fun component10(): Double? = cgCorrelation
    override fun value1(): Int? = id
    override fun value2(): Int? = beatmapId
    override fun value3(): Long? = replayId_1
    override fun value4(): Long? = replayId_2
    override fun value5(): Double? = similarity
    override fun value6(): Double? = correlation
    override fun value7(): LocalDateTime? = createdAt
    override fun value8(): Boolean? = sentDiscordNotification
    override fun value9(): Double? = cgSimilarity
    override fun value10(): Double? = cgCorrelation

    override fun value1(value: Int?): ScoresSimilarityRecord {
        set(0, value)
        return this
    }

    override fun value2(value: Int?): ScoresSimilarityRecord {
        set(1, value)
        return this
    }

    override fun value3(value: Long?): ScoresSimilarityRecord {
        set(2, value)
        return this
    }

    override fun value4(value: Long?): ScoresSimilarityRecord {
        set(3, value)
        return this
    }

    override fun value5(value: Double?): ScoresSimilarityRecord {
        set(4, value)
        return this
    }

    override fun value6(value: Double?): ScoresSimilarityRecord {
        set(5, value)
        return this
    }

    override fun value7(value: LocalDateTime?): ScoresSimilarityRecord {
        set(6, value)
        return this
    }

    override fun value8(value: Boolean?): ScoresSimilarityRecord {
        set(7, value)
        return this
    }

    override fun value9(value: Double?): ScoresSimilarityRecord {
        set(8, value)
        return this
    }

    override fun value10(value: Double?): ScoresSimilarityRecord {
        set(9, value)
        return this
    }

    override fun values(value1: Int?, value2: Int?, value3: Long?, value4: Long?, value5: Double?, value6: Double?, value7: LocalDateTime?, value8: Boolean?, value9: Double?, value10: Double?): ScoresSimilarityRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        this.value6(value6)
        this.value7(value7)
        this.value8(value8)
        this.value9(value9)
        this.value10(value10)
        return this
    }

    /**
     * Create a detached, initialised ScoresSimilarityRecord
     */
    constructor(id: Int? = null, beatmapId: Int? = null, replayId_1: Long? = null, replayId_2: Long? = null, similarity: Double? = null, correlation: Double? = null, createdAt: LocalDateTime? = null, sentDiscordNotification: Boolean? = null, cgSimilarity: Double? = null, cgCorrelation: Double? = null): this() {
        this.id = id
        this.beatmapId = beatmapId
        this.replayId_1 = replayId_1
        this.replayId_2 = replayId_2
        this.similarity = similarity
        this.correlation = correlation
        this.createdAt = createdAt
        this.sentDiscordNotification = sentDiscordNotification
        this.cgSimilarity = cgSimilarity
        this.cgCorrelation = cgCorrelation
        resetChangedOnNotNull()
    }
}
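The generated records above all follow the same pattern: typed Kotlin properties whose accessors delegate to a positional `set(index, value)` / `get(index)` store inherited from `UpdatableRecordImpl`. A minimal self-contained sketch of that pattern (hypothetical `MiniRecord`, not jOOQ itself) makes the mechanism easier to see:

```kotlin
// Hypothetical miniature of the jOOQ record pattern: a typed property
// delegating to a positional value store, as in ScoresSimilarityRecord.
class MiniRecord(size: Int) {
    private val values = arrayOfNulls<Any?>(size)

    // Positional accessors, standing in for UpdatableRecordImpl's.
    fun set(index: Int, value: Any?) { values[index] = value }
    fun get(index: Int): Any? = values[index]

    // Typed property delegating to slot 0, mirroring the generated accessors.
    var similarity: Double?
        set(value) = set(0, value)
        get() = get(0) as Double?
}

fun main() {
    val record = MiniRecord(1)
    record.similarity = 0.97
    println(record.similarity)  // 0.97
}
```

This indirection is what lets the generated `value4(value)`-style setters and the typed properties share one backing array.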
@@ -0,0 +1,117 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.UpdateUserQueue

import java.time.LocalDateTime

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record5
import org.jooq.Row5
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class UpdateUserQueueRecord private constructor() : UpdatableRecordImpl<UpdateUserQueueRecord>(UpdateUserQueue.UPDATE_USER_QUEUE), Record5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?> {

    open var id: Int?
        set(value): Unit = set(0, value)
        get(): Int? = get(0) as Int?

    open var userId: Long
        set(value): Unit = set(1, value)
        get(): Long = get(1) as Long

    open var processed: Boolean?
        set(value): Unit = set(2, value)
        get(): Boolean? = get(2) as Boolean?

    open var createdAt: LocalDateTime?
        set(value): Unit = set(3, value)
        get(): LocalDateTime? = get(3) as LocalDateTime?

    open var processedAt: LocalDateTime?
        set(value): Unit = set(4, value)
        get(): LocalDateTime? = get(4) as LocalDateTime?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Int?> = super.key() as Record1<Int?>

    // -------------------------------------------------------------------------
    // Record5 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?> = super.fieldsRow() as Row5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?>
    override fun valuesRow(): Row5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?> = super.valuesRow() as Row5<Int?, Long?, Boolean?, LocalDateTime?, LocalDateTime?>
    override fun field1(): Field<Int?> = UpdateUserQueue.UPDATE_USER_QUEUE.ID
    override fun field2(): Field<Long?> = UpdateUserQueue.UPDATE_USER_QUEUE.USER_ID
    override fun field3(): Field<Boolean?> = UpdateUserQueue.UPDATE_USER_QUEUE.PROCESSED
    override fun field4(): Field<LocalDateTime?> = UpdateUserQueue.UPDATE_USER_QUEUE.CREATED_AT
    override fun field5(): Field<LocalDateTime?> = UpdateUserQueue.UPDATE_USER_QUEUE.PROCESSED_AT
    override fun component1(): Int? = id
    override fun component2(): Long = userId
    override fun component3(): Boolean? = processed
    override fun component4(): LocalDateTime? = createdAt
    override fun component5(): LocalDateTime? = processedAt
    override fun value1(): Int? = id
    override fun value2(): Long = userId
    override fun value3(): Boolean? = processed
    override fun value4(): LocalDateTime? = createdAt
    override fun value5(): LocalDateTime? = processedAt

    override fun value1(value: Int?): UpdateUserQueueRecord {
        set(0, value)
        return this
    }

    override fun value2(value: Long?): UpdateUserQueueRecord {
        set(1, value)
        return this
    }

    override fun value3(value: Boolean?): UpdateUserQueueRecord {
        set(2, value)
        return this
    }

    override fun value4(value: LocalDateTime?): UpdateUserQueueRecord {
        set(3, value)
        return this
    }

    override fun value5(value: LocalDateTime?): UpdateUserQueueRecord {
        set(4, value)
        return this
    }

    override fun values(value1: Int?, value2: Long?, value3: Boolean?, value4: LocalDateTime?, value5: LocalDateTime?): UpdateUserQueueRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        return this
    }

    /**
     * Create a detached, initialised UpdateUserQueueRecord
     */
    constructor(id: Int? = null, userId: Long, processed: Boolean? = null, createdAt: LocalDateTime? = null, processedAt: LocalDateTime? = null): this() {
        this.id = id
        this.userId = userId
        this.processed = processed
        this.createdAt = createdAt
        this.processedAt = processedAt
        resetChangedOnNotNull()
    }
}
@@ -0,0 +1,271 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.records


import com.nisemoe.generated.tables.Users

import java.time.LocalDateTime

import org.jooq.Field
import org.jooq.Record1
import org.jooq.Record16
import org.jooq.Row16
import org.jooq.impl.UpdatableRecordImpl


/**
 * This class is generated by jOOQ.
 */
@Suppress("UNCHECKED_CAST")
open class UsersRecord private constructor() : UpdatableRecordImpl<UsersRecord>(Users.USERS), Record16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?> {

    open var userId: Long?
        set(value): Unit = set(0, value)
        get(): Long? = get(0) as Long?

    open var username: String?
        set(value): Unit = set(1, value)
        get(): String? = get(1) as String?

    open var joinDate: LocalDateTime?
        set(value): Unit = set(2, value)
        get(): LocalDateTime? = get(2) as LocalDateTime?

    open var country: String?
        set(value): Unit = set(3, value)
        get(): String? = get(3) as String?

    open var countryRank: Long?
        set(value): Unit = set(4, value)
        get(): Long? = get(4) as Long?

    open var rank: Long?
        set(value): Unit = set(5, value)
        get(): Long? = get(5) as Long?

    open var ppRaw: Double?
        set(value): Unit = set(6, value)
        get(): Double? = get(6) as Double?

    open var accuracy: Double?
        set(value): Unit = set(7, value)
        get(): Double? = get(7) as Double?

    open var playcount: Long?
        set(value): Unit = set(8, value)
        get(): Long? = get(8) as Long?

    open var totalScore: Long?
        set(value): Unit = set(9, value)
        get(): Long? = get(9) as Long?

    open var rankedScore: Long?
        set(value): Unit = set(10, value)
        get(): Long? = get(10) as Long?

    open var secondsPlayed: Long?
        set(value): Unit = set(11, value)
        get(): Long? = get(11) as Long?

    open var count_100: Long?
        set(value): Unit = set(12, value)
        get(): Long? = get(12) as Long?

    open var count_300: Long?
        set(value): Unit = set(13, value)
        get(): Long? = get(13) as Long?

    open var count_50: Long?
        set(value): Unit = set(14, value)
        get(): Long? = get(14) as Long?

    open var sysLastUpdate: LocalDateTime?
        set(value): Unit = set(15, value)
        get(): LocalDateTime? = get(15) as LocalDateTime?

    // -------------------------------------------------------------------------
    // Primary key information
    // -------------------------------------------------------------------------

    override fun key(): Record1<Long?> = super.key() as Record1<Long?>

    // -------------------------------------------------------------------------
    // Record16 type implementation
    // -------------------------------------------------------------------------

    override fun fieldsRow(): Row16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?> = super.fieldsRow() as Row16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?>
    override fun valuesRow(): Row16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?> = super.valuesRow() as Row16<Long?, String?, LocalDateTime?, String?, Long?, Long?, Double?, Double?, Long?, Long?, Long?, Long?, Long?, Long?, Long?, LocalDateTime?>
    override fun field1(): Field<Long?> = Users.USERS.USER_ID
    override fun field2(): Field<String?> = Users.USERS.USERNAME
    override fun field3(): Field<LocalDateTime?> = Users.USERS.JOIN_DATE
    override fun field4(): Field<String?> = Users.USERS.COUNTRY
    override fun field5(): Field<Long?> = Users.USERS.COUNTRY_RANK
    override fun field6(): Field<Long?> = Users.USERS.RANK
    override fun field7(): Field<Double?> = Users.USERS.PP_RAW
    override fun field8(): Field<Double?> = Users.USERS.ACCURACY
    override fun field9(): Field<Long?> = Users.USERS.PLAYCOUNT
    override fun field10(): Field<Long?> = Users.USERS.TOTAL_SCORE
    override fun field11(): Field<Long?> = Users.USERS.RANKED_SCORE
    override fun field12(): Field<Long?> = Users.USERS.SECONDS_PLAYED
    override fun field13(): Field<Long?> = Users.USERS.COUNT_100
    override fun field14(): Field<Long?> = Users.USERS.COUNT_300
    override fun field15(): Field<Long?> = Users.USERS.COUNT_50
    override fun field16(): Field<LocalDateTime?> = Users.USERS.SYS_LAST_UPDATE
    override fun component1(): Long? = userId
    override fun component2(): String? = username
    override fun component3(): LocalDateTime? = joinDate
    override fun component4(): String? = country
    override fun component5(): Long? = countryRank
    override fun component6(): Long? = rank
    override fun component7(): Double? = ppRaw
    override fun component8(): Double? = accuracy
    override fun component9(): Long? = playcount
    override fun component10(): Long? = totalScore
    override fun component11(): Long? = rankedScore
    override fun component12(): Long? = secondsPlayed
    override fun component13(): Long? = count_100
    override fun component14(): Long? = count_300
    override fun component15(): Long? = count_50
    override fun component16(): LocalDateTime? = sysLastUpdate
    override fun value1(): Long? = userId
    override fun value2(): String? = username
    override fun value3(): LocalDateTime? = joinDate
    override fun value4(): String? = country
    override fun value5(): Long? = countryRank
    override fun value6(): Long? = rank
    override fun value7(): Double? = ppRaw
    override fun value8(): Double? = accuracy
    override fun value9(): Long? = playcount
    override fun value10(): Long? = totalScore
    override fun value11(): Long? = rankedScore
    override fun value12(): Long? = secondsPlayed
    override fun value13(): Long? = count_100
    override fun value14(): Long? = count_300
    override fun value15(): Long? = count_50
    override fun value16(): LocalDateTime? = sysLastUpdate

    override fun value1(value: Long?): UsersRecord {
        set(0, value)
        return this
    }

    override fun value2(value: String?): UsersRecord {
        set(1, value)
        return this
    }

    override fun value3(value: LocalDateTime?): UsersRecord {
        set(2, value)
        return this
    }

    override fun value4(value: String?): UsersRecord {
        set(3, value)
        return this
    }

    override fun value5(value: Long?): UsersRecord {
        set(4, value)
        return this
    }

    override fun value6(value: Long?): UsersRecord {
        set(5, value)
        return this
    }

    override fun value7(value: Double?): UsersRecord {
        set(6, value)
        return this
    }

    override fun value8(value: Double?): UsersRecord {
        set(7, value)
        return this
    }

    override fun value9(value: Long?): UsersRecord {
        set(8, value)
        return this
    }

    override fun value10(value: Long?): UsersRecord {
        set(9, value)
        return this
    }

    override fun value11(value: Long?): UsersRecord {
        set(10, value)
        return this
    }

    override fun value12(value: Long?): UsersRecord {
        set(11, value)
        return this
    }

    override fun value13(value: Long?): UsersRecord {
        set(12, value)
        return this
    }

    override fun value14(value: Long?): UsersRecord {
        set(13, value)
        return this
    }

    override fun value15(value: Long?): UsersRecord {
        set(14, value)
        return this
    }

    override fun value16(value: LocalDateTime?): UsersRecord {
        set(15, value)
        return this
    }

    override fun values(value1: Long?, value2: String?, value3: LocalDateTime?, value4: String?, value5: Long?, value6: Long?, value7: Double?, value8: Double?, value9: Long?, value10: Long?, value11: Long?, value12: Long?, value13: Long?, value14: Long?, value15: Long?, value16: LocalDateTime?): UsersRecord {
        this.value1(value1)
        this.value2(value2)
        this.value3(value3)
        this.value4(value4)
        this.value5(value5)
        this.value6(value6)
        this.value7(value7)
        this.value8(value8)
        this.value9(value9)
        this.value10(value10)
        this.value11(value11)
        this.value12(value12)
        this.value13(value13)
        this.value14(value14)
        this.value15(value15)
        this.value16(value16)
        return this
    }

    /**
     * Create a detached, initialised UsersRecord
     */
    constructor(userId: Long? = null, username: String? = null, joinDate: LocalDateTime? = null, country: String? = null, countryRank: Long? = null, rank: Long? = null, ppRaw: Double? = null, accuracy: Double? = null, playcount: Long? = null, totalScore: Long? = null, rankedScore: Long? = null, secondsPlayed: Long? = null, count_100: Long? = null, count_300: Long? = null, count_50: Long? = null, sysLastUpdate: LocalDateTime? = null): this() {
        this.userId = userId
        this.username = username
        this.joinDate = joinDate
        this.country = country
        this.countryRank = countryRank
        this.rank = rank
        this.ppRaw = ppRaw
        this.accuracy = accuracy
        this.playcount = playcount
        this.totalScore = totalScore
        this.rankedScore = rankedScore
        this.secondsPlayed = secondsPlayed
        this.count_100 = count_100
        this.count_300 = count_300
        this.count_50 = count_50
        this.sysLastUpdate = sysLastUpdate
        resetChangedOnNotNull()
    }
}
@@ -0,0 +1,56 @@
/*
 * This file is generated by jOOQ.
 */
package com.nisemoe.generated.tables.references


import com.nisemoe.generated.tables.Beatmaps
import com.nisemoe.generated.tables.FlywaySchemaHistory
import com.nisemoe.generated.tables.RedditPost
import com.nisemoe.generated.tables.Scores
import com.nisemoe.generated.tables.ScoresJudgements
import com.nisemoe.generated.tables.ScoresSimilarity
import com.nisemoe.generated.tables.UpdateUserQueue
import com.nisemoe.generated.tables.Users


/**
 * The table <code>public.beatmaps</code>.
 */
val BEATMAPS: Beatmaps = Beatmaps.BEATMAPS

/**
 * The table <code>public.flyway_schema_history</code>.
 */
val FLYWAY_SCHEMA_HISTORY: FlywaySchemaHistory = FlywaySchemaHistory.FLYWAY_SCHEMA_HISTORY

/**
 * The table <code>public.reddit_post</code>.
 */
val REDDIT_POST: RedditPost = RedditPost.REDDIT_POST

/**
 * The table <code>public.scores</code>.
 */
val SCORES: Scores = Scores.SCORES

/**
 * The table <code>public.scores_judgements</code>.
 */
val SCORES_JUDGEMENTS: ScoresJudgements = ScoresJudgements.SCORES_JUDGEMENTS

/**
 * The table <code>public.scores_similarity</code>.
 */
val SCORES_SIMILARITY: ScoresSimilarity = ScoresSimilarity.SCORES_SIMILARITY

/**
 * The table <code>public.update_user_queue</code>.
 */
val UPDATE_USER_QUEUE: UpdateUserQueue = UpdateUserQueue.UPDATE_USER_QUEUE

/**
 * The table <code>public.users</code>.
 */
val USERS: Users = Users.USERS
38
nise-backend/src/main/kotlin/com/nisemoe/nise/Format.kt
Normal file
@@ -0,0 +1,38 @@
package com.nisemoe.nise

import com.nisemoe.generated.enums.JudgementType
import com.nisemoe.nise.integrations.CircleguardService
import java.time.LocalDateTime
import java.time.ZoneOffset
import java.time.format.DateTimeFormatter
import java.util.*


class Format {

    companion object {

        private var targetFormatter: DateTimeFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm")

        fun formatLocalDateTime(dateTime: LocalDateTime): String {
            return dateTime
                // Hard-coded +2h shift of the stored (UTC) time for display.
                .plusHours(2)
                .format(targetFormatter)
        }

        fun parseStringToDate(dateTimeStr: String): Date {
            val localDateTime = LocalDateTime.parse(dateTimeStr, targetFormatter)
            return Date.from(localDateTime.atZone(ZoneOffset.UTC).toInstant())
        }

        fun fromJudgementType(circleGuardJudgementType: CircleguardService.JudgementType): JudgementType {
            return when (circleGuardJudgementType) {
                CircleguardService.JudgementType.THREE_HUNDRED -> JudgementType.`300`
                CircleguardService.JudgementType.ONE_HUNDRED -> JudgementType.`100`
                CircleguardService.JudgementType.FIFTY -> JudgementType.`50`
                CircleguardService.JudgementType.MISS -> JudgementType.Miss
            }
        }

    }
}
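The `formatLocalDateTime` helper above combines the `yyyy-MM-dd HH:mm` pattern with a fixed two-hour shift. A standalone sketch (independent of the `Format` class) showing what that combination produces, including the shift rolling over a month boundary:

```kotlin
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

fun main() {
    // Same pattern as Format.targetFormatter.
    val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm")
    val dt = LocalDateTime.of(2024, 1, 31, 23, 30)

    // Same hard-coded +2h shift as formatLocalDateTime; note the date rolls over.
    println(dt.plusHours(2).format(fmt))  // 2024-02-01 01:30
}
```

Because the shift is applied before formatting, the rendered string is not the stored UTC instant; `parseStringToDate` does not undo the shift, so the two functions are not inverses of each other.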
144
nise-backend/src/main/kotlin/com/nisemoe/nise/Models.kt
Normal file
@@ -0,0 +1,144 @@
package com.nisemoe.nise

import kotlinx.serialization.Serializable


data class UserDetails(
    val user_id: Long,
    val username: String,
    val rank: Long?,
    val pp_raw: Double?,
    val join_date: String?,
    val seconds_played: Long?,
    val country: String?,
    val country_rank: Long?,
    val playcount: Long?,

    val suspicious_scores: Int = 0,
    val stolen_replays: Int = 0
)

@Serializable
@AllowCacheSerialization
data class Statistics(
    val total_beatmaps: Int,
    val total_users: Int,
    val total_scores: Int,
    val total_replay_scores: Int,
    val total_replay_similarity: Int
)

data class SuspiciousScoreEntry(
    val user_id: Long,
    val username: String,
    val replay_id: Long,
    val date: String,
    val beatmap_id: Long,
    val beatmap_beatmapset_id: Long,
    val beatmap_title: String,
    val beatmap_star_rating: Double,
    val pp: Double,
    val frametime: Double,
    val ur: Double
)

data class SimilarReplayEntry(
    val replay_id_1: Long,
    val replay_id_2: Long,
    val user_id_1: Long,
    val user_id_2: Long,
    val username_1: String,
    val username_2: String,
    val beatmap_beatmapset_id: Long,
    val replay_date_1: String,
    val replay_date_2: String,
    val replay_pp_1: Double,
    val replay_pp_2: Double,
    val beatmap_id: Long,
    val beatmap_title: String,
    val beatmap_star_rating: Double,
    val similarity: Double
)

data class ReplayPairStatistics(
    val similarity: Double,
    val correlation: Double,
)

data class ReplayPair(
    val replays: List<ReplayData>,
    val statistics: ReplayPairStatistics
)

data class ReplayData(
    val replay_id: Long,
    val user_id: Int,
    val username: String,
    val date: String,
    val beatmap_id: Int,
    val beatmap_beatmapset_id: Int,
    val beatmap_artist: String,
    val beatmap_title: String,
    val beatmap_star_rating: Double,
    val beatmap_creator: String,
    val beatmap_version: String,
    val score: Int,
    val mods: List<String>,
    val rank: String,
    val ur: Double?,
    val adjusted_ur: Double?,
    val average_ur: Double?,
    val frametime: Int,
    val snaps: Int,
    val hits: Int,

    var mean_error: Double?,
    var error_variance: Double?,
    var error_standard_deviation: Double?,
    var minimum_error: Double?,
    var maximum_error: Double?,
    var error_range: Double?,
    var error_coefficient_of_variation: Double?,
    var error_kurtosis: Double?,
    var error_skewness: Double?,

    var comparable_samples: Int? = null,
    var comparable_mean_error: Double? = null,
    var comparable_error_variance: Double? = null,
    var comparable_error_standard_deviation: Double? = null,
    var comparable_minimum_error: Double? = null,
    var comparable_maximum_error: Double? = null,
    var comparable_error_range: Double? = null,
    var comparable_error_coefficient_of_variation: Double? = null,
    var comparable_error_kurtosis: Double? = null,
    var comparable_error_skewness: Double? = null,

    val pp: Double,
    val perfect: Boolean,
    val max_combo: Int,

    val count_300: Int,
    val count_100: Int,
    val count_50: Int,
    val count_miss: Int,

    val error_distribution: Map<Int, DistributionEntry>
) {

    fun calculateAccuracy(): Double {
        if (count_300 + count_100 + count_50 + count_miss == 0) {
            return 0.0
        }

        val totalHits = count_300 + count_100 + count_50 + count_miss
        val accuracy = (300.0 * count_300 + 100.0 * count_100 + 50.0 * count_50) / (300.0 * totalHits)
        return accuracy * 100
    }

}

data class DistributionEntry(
    val percentageMiss: Double,
    val percentage300: Double,
    val percentage100: Double,
    val percentage50: Double
)
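`calculateAccuracy` implements the standard osu!-style weighted accuracy (a 300 counts fully, a 100 one third, a 50 one sixth, a miss zero). The formula can be checked standalone, outside `ReplayData`:

```kotlin
// Standalone restatement of ReplayData.calculateAccuracy's formula:
// (300*c300 + 100*c100 + 50*c50) / (300 * totalHits), as a percentage.
fun accuracy(c300: Int, c100: Int, c50: Int, cMiss: Int): Double {
    val totalHits = c300 + c100 + c50 + cMiss
    if (totalHits == 0) return 0.0  // same guard as calculateAccuracy
    return (300.0 * c300 + 100.0 * c100 + 50.0 * c50) / (300.0 * totalHits) * 100
}

fun main() {
    println(accuracy(100, 0, 0, 0))  // 100.0 (all 300s)
    println(accuracy(0, 0, 0, 10))   // 0.0 (all misses)
}
```

A play of only 100s therefore caps at one third (~33.3%), which matches the 100/300 weight ratio in the numerator.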
@@ -0,0 +1,20 @@
package com.nisemoe.nise

import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.context.properties.EnableConfigurationProperties
import org.springframework.boot.runApplication
import org.springframework.cache.annotation.EnableCaching
import org.springframework.scheduling.annotation.EnableScheduling

@SpringBootApplication
@EnableCaching
@EnableScheduling
class NiseApplication

fun main(args: Array<String>) {
    runApplication<NiseApplication>(*args)
}

@Target(AnnotationTarget.CLASS)
@Retention(AnnotationRetention.RUNTIME)
annotation class AllowCacheSerialization
@ -0,0 +1,40 @@
package com.nisemoe.nise.config

import com.fasterxml.jackson.annotation.JsonInclude
import com.fasterxml.jackson.annotation.JsonTypeInfo
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.data.redis.connection.RedisConnectionFactory
import org.springframework.data.redis.core.RedisTemplate
import org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer

@Configuration
class RedisConfig {

    @Bean
    fun redisTemplate(connectionFactory: RedisConnectionFactory?): RedisTemplate<Any, Any> {
        val redisTemplate = RedisTemplate<Any, Any>()
        redisTemplate.connectionFactory = connectionFactory

        val jsonMapper = ObjectMapper()
        jsonMapper.registerModule(JavaTimeModule())
        jsonMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL)
        jsonMapper.activateDefaultTyping(jsonMapper.polymorphicTypeValidator, ObjectMapper.DefaultTyping.EVERYTHING, JsonTypeInfo.As.PROPERTY)

        val genericJackson2JsonRedisSerializer = GenericJackson2JsonRedisSerializer(jsonMapper)

        redisTemplate.keySerializer = genericJackson2JsonRedisSerializer
        redisTemplate.valueSerializer = genericJackson2JsonRedisSerializer
        redisTemplate.hashKeySerializer = genericJackson2JsonRedisSerializer
        redisTemplate.hashValueSerializer = genericJackson2JsonRedisSerializer

        return redisTemplate
    }

}
@ -0,0 +1,26 @@
package com.nisemoe.nise.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.SchedulingConfigurer;
import org.springframework.scheduling.config.ScheduledTaskRegistrar;

import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

@Configuration
@EnableScheduling
public class SchedulerConfig implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
    }

    @Bean
    public Executor taskExecutor() {
        return Executors.newScheduledThreadPool(100);
    }

}
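SchedulerConfig backs Spring's `@Scheduled` tasks with a large scheduled thread pool so that one long-running job does not delay the others. A self-contained sketch of the underlying JDK primitive (the helper name and the pool size of 4 are illustrative, not part of the codebase):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class SchedulerExample {

    // Schedules a task on the same primitive SchedulerConfig hands to Spring
    // and blocks until it completes.
    static Object runAfter(long delayMs, Callable<Object> task) {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(4);
        try {
            // With a pool of size 1 a slow task would delay every later one;
            // SchedulerConfig sizes its pool generously for the same reason.
            return pool.schedule(task, delayMs, TimeUnit.MILLISECONDS).get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(runAfter(10, () -> "done"));  // prints "done"
    }
}
```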
@ -0,0 +1,22 @@
package com.nisemoe.nise.config

import org.springframework.beans.factory.annotation.Value
import org.springframework.context.annotation.Configuration
import org.springframework.web.servlet.config.annotation.CorsRegistry
import org.springframework.web.servlet.config.annotation.EnableWebMvc
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer

@Configuration
@EnableWebMvc
class WebConfig : WebMvcConfigurer {

    @Value("\${ORIGIN:http://localhost:4200}")
    private lateinit var origin: String

    override fun addCorsMappings(registry: CorsRegistry) {
        registry.addMapping("/**")
            .allowedOrigins(origin)
    }

}
@ -0,0 +1,27 @@
package com.nisemoe.nise.config

import org.springframework.beans.factory.annotation.Value
import org.springframework.context.annotation.Configuration
import org.springframework.messaging.simp.config.MessageBrokerRegistry
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker
import org.springframework.web.socket.config.annotation.StompEndpointRegistry
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer

@Configuration
@EnableWebSocketMessageBroker
class WebSocketConfig : WebSocketMessageBrokerConfigurer {

    @Value("\${ORIGIN:http://localhost:4200}")
    private lateinit var websocketOrigin: String

    override fun configureMessageBroker(config: MessageBrokerRegistry) {
        config.enableSimpleBroker("/topic")
        config.setApplicationDestinationPrefixes("/app")
    }

    override fun registerStompEndpoints(registry: StompEndpointRegistry) {
        registry.addEndpoint("/websocket")
            .setAllowedOrigins(websocketOrigin)
    }

}
@ -0,0 +1,45 @@
package com.nisemoe.nise.controller

import com.nisemoe.nise.Format
import com.nisemoe.nise.ReplayData
import com.nisemoe.nise.ReplayPair
import com.nisemoe.nise.database.ScoreService
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.PathVariable
import org.springframework.web.bind.annotation.RestController

@RestController
class ScoreController(
    private val scoreService: ScoreService
) {

    @GetMapping("score/{replayId}")
    fun getScoreDetails(@PathVariable replayId: Long): ResponseEntity<ReplayData> {
        val replayData = this.scoreService.getReplayData(replayId)
            ?: return ResponseEntity.notFound().build()

        return ResponseEntity.ok(replayData)
    }

    @GetMapping("pair/{replay1Id}/{replay2Id}")
    fun getPairDetails(@PathVariable replay1Id: Long, @PathVariable replay2Id: Long): ResponseEntity<ReplayPair> {
        val replay1Data = this.scoreService.getReplayData(replay1Id)
            ?: return ResponseEntity.notFound().build()

        val replay2Data = this.scoreService.getReplayData(replay2Id)
            ?: return ResponseEntity.notFound().build()

        val replayPairStatistics = this.scoreService.getPairInfo(replay1Id, replay2Id)
            ?: return ResponseEntity.notFound().build()

        // Sort replays by date (the first replay will always be the oldest)
        val replays = listOf(replay1Data, replay2Data)
            .sortedBy { Format.parseStringToDate(it.date) }

        val replayPair = ReplayPair(replays, replayPairStatistics)

        return ResponseEntity.ok(replayPair)
    }

}
@ -0,0 +1,22 @@
package com.nisemoe.nise.controller

import com.nisemoe.nise.Statistics
import com.nisemoe.nise.scheduler.GlobalCache
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

@RestController
class StatisticsController(
    private val globalCache: GlobalCache
) {

    @GetMapping("stats")
    fun getStatistics(): ResponseEntity<Statistics> {
        val statistics = this.globalCache.statistics
            ?: return ResponseEntity.status(503).build()

        return ResponseEntity.ok(statistics)
    }

}
@ -0,0 +1,22 @@
package com.nisemoe.nise.controller

import com.nisemoe.nise.SimilarReplayEntry
import com.nisemoe.nise.scheduler.GlobalCache
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

@RestController
class StolenReplaysController(
    private val globalCache: GlobalCache
) {

    @GetMapping("similar-replays")
    fun getStolenReplays(): ResponseEntity<List<SimilarReplayEntry>> {
        val stolenReplays = this.globalCache.similarReplays
            ?: return ResponseEntity.status(503).build()

        return ResponseEntity.ok(stolenReplays)
    }

}
@ -0,0 +1,24 @@
package com.nisemoe.nise.controller

import com.nisemoe.nise.SuspiciousScoreEntry
import com.nisemoe.nise.scheduler.GlobalCache
import org.springframework.http.MediaType
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

@RestController
class SuspiciousScoresController(
    private val globalCache: GlobalCache
) {

    @GetMapping("suspicious-scores", produces = [MediaType.APPLICATION_JSON_VALUE])
    fun getSuspiciousScores(): ResponseEntity<List<SuspiciousScoreEntry>> {
        val scores = this.globalCache.suspiciousScores
            ?: return ResponseEntity.status(503).build()

        return ResponseEntity.ok(scores)
    }

}
@ -0,0 +1,42 @@
package com.nisemoe.nise.controller

import com.nisemoe.generated.tables.references.SCORES
import com.nisemoe.nise.SimilarReplayEntry
import com.nisemoe.nise.SuspiciousScoreEntry
import com.nisemoe.nise.UserDetails
import com.nisemoe.nise.database.ScoreService
import com.nisemoe.nise.database.UserService
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.PathVariable
import org.springframework.web.bind.annotation.RestController

@RestController
class UserDetailsController(
    private val scoreService: ScoreService,
    private val userService: UserService
) {

    data class UserDetailsResponse(
        val user_details: UserDetails,
        val suspicious_scores: List<SuspiciousScoreEntry>,
        val similar_replays: List<SimilarReplayEntry>
    )

    @GetMapping("user-details/{userId}")
    fun getUserDetails(@PathVariable userId: String): ResponseEntity<UserDetailsResponse> {
        val userDetails = this.userService.getUserDetails(username = userId)
            ?: return ResponseEntity.notFound().build()

        var suspiciousScoresCondition = this.scoreService.getDefaultCondition()
        suspiciousScoresCondition = suspiciousScoresCondition.and(SCORES.USER_ID.eq(userDetails.user_id))

        val response = UserDetailsResponse(
            user_details = userDetails,
            suspicious_scores = this.scoreService.getSuspiciousScores(suspiciousScoresCondition),
            similar_replays = this.scoreService.getSimilarReplaysForUserId(userDetails.user_id)
        )
        return ResponseEntity.ok(response)
    }

}
@ -0,0 +1,30 @@
package com.nisemoe.nise.database

import com.nisemoe.generated.tables.references.SCORES
import org.jooq.DSLContext
import org.jooq.impl.DSL.avg
import org.springframework.stereotype.Service

@Service
class BeatmapService(private val dslContext: DSLContext) {

    fun getAverageUR(beatmapId: Int, excludeReplayId: Long): Double? {
        val condition = SCORES.BEATMAP_ID.eq(beatmapId)
            .and(SCORES.UR.isNotNull)
            .and(SCORES.REPLAY_ID.notEqual(excludeReplayId))

        val totalScores = dslContext.fetchCount(SCORES, condition)

        // Require a minimum sample size before reporting an average
        if (totalScores < 50)
            return null

        return dslContext
            .select(avg(SCORES.UR))
            .from(SCORES)
            .where(condition)
            .fetchOneInto(Double::class.java)
    }

}
@ -0,0 +1,322 @@
package com.nisemoe.nise.database

import com.nisemoe.generated.tables.records.ScoresJudgementsRecord
import com.nisemoe.generated.tables.references.*
import com.nisemoe.nise.*
import com.nisemoe.nise.osu.Mod
import org.jooq.Condition
import org.jooq.DSLContext
import org.jooq.Record
import org.jooq.Result
import org.jooq.impl.DSL
import org.jooq.impl.DSL.avg
import org.springframework.stereotype.Service
import java.time.LocalDateTime
import kotlin.math.roundToInt

@Service
class ScoreService(
    private val dslContext: DSLContext,
    private val beatmapService: BeatmapService
) {

    companion object {
        val osuScoreAlias1 = SCORES.`as`("osu_score_alias1")
        val osuScoreAlias2 = SCORES.`as`("osu_score_alias2")
        val osuUserAlias1 = USERS.`as`("osu_user_alias1")
        val osuUserAlias2 = USERS.`as`("osu_user_alias2")
    }

    fun getReplayData(replayId: Long): ReplayData? {
        val result = dslContext.select(DSL.asterisk())
            .from(SCORES)
            .join(USERS).on(SCORES.USER_ID.eq(USERS.USER_ID))
            .join(BEATMAPS).on(SCORES.BEATMAP_ID.eq(BEATMAPS.BEATMAP_ID))
            .where(SCORES.REPLAY_ID.eq(replayId))
            .fetchOne() ?: return null

        val beatmapId = result.get(BEATMAPS.BEATMAP_ID, Int::class.java)
        val averageUR = beatmapService.getAverageUR(beatmapId = beatmapId, excludeReplayId = replayId)
        val hitDistribution = this.getHitDistribution(scoreId = result.get(SCORES.ID, Int::class.java))

        val replayData = ReplayData(
            replay_id = replayId,
            user_id = result.get(SCORES.USER_ID, Int::class.java),
            username = result.get(USERS.USERNAME, String::class.java),
            date = Format.formatLocalDateTime(result.get(SCORES.DATE, LocalDateTime::class.java)),
            beatmap_id = beatmapId,
            beatmap_beatmapset_id = result.get(BEATMAPS.BEATMAPSET_ID, Int::class.java),
            beatmap_artist = result.get(BEATMAPS.ARTIST, String::class.java),
            beatmap_title = result.get(BEATMAPS.TITLE, String::class.java),
            beatmap_star_rating = result.get(BEATMAPS.STAR_RATING, Double::class.java),
            beatmap_creator = result.get(BEATMAPS.CREATOR, String::class.java),
            beatmap_version = result.get(BEATMAPS.VERSION, String::class.java),
            pp = result.get(SCORES.PP, Double::class.java),
            frametime = result.get(SCORES.FRAMETIME, Double::class.java).toInt(),
            ur = result.get(SCORES.UR, Double::class.java),
            adjusted_ur = result.get(SCORES.ADJUSTED_UR, Double::class.java),
            score = result.get(SCORES.SCORE, Int::class.java),
            mods = Mod.parseModCombination(result.get(SCORES.MODS, Int::class.java)),
            rank = result.get(SCORES.RANK, String::class.java),
            average_ur = averageUR,
            snaps = result.get(SCORES.SNAPS, Int::class.java),
            hits = result.get(SCORES.EDGE_HITS, Int::class.java),
            perfect = result.get(SCORES.PERFECT, Boolean::class.java),
            max_combo = result.get(SCORES.MAX_COMBO, Int::class.java),
            count_300 = result.get(SCORES.COUNT_300, Int::class.java),
            count_100 = result.get(SCORES.COUNT_100, Int::class.java),
            count_50 = result.get(SCORES.COUNT_50, Int::class.java),
            count_miss = result.get(SCORES.COUNT_MISS, Int::class.java),
            error_distribution = hitDistribution,
            mean_error = result.get(SCORES.MEAN_ERROR, Double::class.java),
            error_variance = result.get(SCORES.ERROR_VARIANCE, Double::class.java),
            error_standard_deviation = result.get(SCORES.ERROR_STANDARD_DEVIATION, Double::class.java),
            minimum_error = result.get(SCORES.MINIMUM_ERROR, Double::class.java),
            maximum_error = result.get(SCORES.MAXIMUM_ERROR, Double::class.java),
            error_range = result.get(SCORES.ERROR_RANGE, Double::class.java),
            error_coefficient_of_variation = result.get(SCORES.ERROR_COEFFICIENT_OF_VARIATION, Double::class.java),
            error_kurtosis = result.get(SCORES.ERROR_KURTOSIS, Double::class.java),
            error_skewness = result.get(SCORES.ERROR_SKEWNESS, Double::class.java),
        )
        this.loadComparableReplayData(replayData)
        return replayData
    }

    fun getDefaultCondition(): Condition {
        return SCORES.UR.lessOrEqual(25.0)
            .and(SCORES.IS_BANNED.eq(false))
    }

    fun getSuspiciousScores(condition: Condition = getDefaultCondition()): List<SuspiciousScoreEntry> {
        val result = dslContext.select(DSL.asterisk())
            .from(SCORES)
            .join(USERS).on(SCORES.USER_ID.eq(USERS.USER_ID))
            .join(BEATMAPS).on(SCORES.BEATMAP_ID.eq(BEATMAPS.BEATMAP_ID))
            .where(condition)
            .orderBy(SCORES.DATE.desc())
            .fetch()

        return result.map {
            SuspiciousScoreEntry(
                user_id = it.get(SCORES.USER_ID, Long::class.java),
                username = it.get(USERS.USERNAME, String::class.java),
                replay_id = it.get(SCORES.REPLAY_ID, Long::class.java),
                date = Format.formatLocalDateTime(it.get(SCORES.DATE, LocalDateTime::class.java)),
                beatmap_id = it.get(BEATMAPS.BEATMAP_ID, Long::class.java),
                beatmap_beatmapset_id = it.get(BEATMAPS.BEATMAPSET_ID, Long::class.java),
                beatmap_title = it.get(BEATMAPS.TITLE, String::class.java),
                beatmap_star_rating = it.get(BEATMAPS.STAR_RATING, Double::class.java),
                pp = it.get(SCORES.PP, Double::class.java),
                frametime = it.get(SCORES.FRAMETIME, Double::class.java),
                ur = it.get(SCORES.UR, Double::class.java)
            )
        }
    }

    fun getPairInfo(replay1Id: Long, replay2Id: Long): ReplayPairStatistics? {
        // Attempt to fetch in the order (replay1Id, replay2Id)
        var pairStatistics = dslContext.select(SCORES_SIMILARITY.SIMILARITY, SCORES_SIMILARITY.CORRELATION)
            .from(SCORES_SIMILARITY)
            .where(
                SCORES_SIMILARITY.REPLAY_ID_1.eq(replay1Id)
                    .and(SCORES_SIMILARITY.REPLAY_ID_2.eq(replay2Id))
            )
            .fetchOne()

        // If there is no result, attempt to fetch in the order (replay2Id, replay1Id)
        if (pairStatistics == null) {
            pairStatistics = dslContext.select(SCORES_SIMILARITY.SIMILARITY, SCORES_SIMILARITY.CORRELATION)
                .from(SCORES_SIMILARITY)
                .where(
                    SCORES_SIMILARITY.REPLAY_ID_1.eq(replay2Id)
                        .and(SCORES_SIMILARITY.REPLAY_ID_2.eq(replay1Id))
                )
                .fetchOne()
        }

        // Return null if neither ordering matched
        if (pairStatistics == null) return null

        return ReplayPairStatistics(
            similarity = pairStatistics.get(SCORES_SIMILARITY.SIMILARITY, Double::class.java),
            correlation = pairStatistics.get(SCORES_SIMILARITY.CORRELATION, Double::class.java)
        )
    }

    fun getSimilarReplaysRecords(condition: Condition = DSL.noCondition(), includeBanned: Boolean = false): Result<Record> {
        return dslContext
            .select()
            .from(SCORES_SIMILARITY)
            .join(osuScoreAlias1).on(osuScoreAlias1.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_1))
            .join(osuUserAlias1).on(osuScoreAlias1.USER_ID.eq(osuUserAlias1.USER_ID))
            .leftJoin(osuScoreAlias2).on(osuScoreAlias2.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_2))
            .leftJoin(osuUserAlias2).on(osuScoreAlias2.USER_ID.eq(osuUserAlias2.USER_ID))
            .join(BEATMAPS).on(BEATMAPS.BEATMAP_ID.eq(SCORES_SIMILARITY.BEATMAP_ID))
            .where(SCORES_SIMILARITY.SIMILARITY.lt(10.0))
            .apply {
                if (!includeBanned) {
                    and(osuScoreAlias1.IS_BANNED.eq(false))
                    and(osuScoreAlias2.IS_BANNED.eq(false))
                }
            }
            .and(condition)
            .orderBy(osuScoreAlias2.DATE.desc(), SCORES_SIMILARITY.SIMILARITY.asc())
            .fetch()
    }

    fun getSimilarReplaysForUserId(userId: Long): List<SimilarReplayEntry> {
        val condition = osuScoreAlias1.USER_ID.eq(userId).or(osuScoreAlias2.USER_ID.eq(userId))
        val replays = getSimilarReplaysRecords(condition, includeBanned = true)

        // We only return replays where the user is the second user in the pair.
        // The second user is the one with the most recent replay,
        // which means it's *probably* the cheater.
        return mapSimilarReplays(replays)
            .filter { it.user_id_2 == userId }
    }

    fun getSimilarReplays(condition: Condition = DSL.noCondition()): List<SimilarReplayEntry> {
        val replays = getSimilarReplaysRecords(condition)
        return mapSimilarReplays(replays)
    }

    private fun mapSimilarReplays(replays: Result<Record>) = replays.map {
        // Extract the fields we need from both sides of the pair
        var replayId1 = it.get(osuScoreAlias1.REPLAY_ID, Long::class.java)
        var replayId2 = it.get(osuScoreAlias2.REPLAY_ID, Long::class.java)

        var userId1 = it.get(osuScoreAlias1.USER_ID, Long::class.java)
        var userId2 = it.get(osuScoreAlias2.USER_ID, Long::class.java)

        var username1 = it.get(osuUserAlias1.USERNAME, String::class.java)
        var username2 = it.get(osuUserAlias2.USERNAME, String::class.java)

        var replayDate1 = it.get(osuScoreAlias1.DATE, LocalDateTime::class.java)
        var replayDate2 = it.get(osuScoreAlias2.DATE, LocalDateTime::class.java)

        var replayPp1 = it.get(osuScoreAlias1.PP, Double::class.java)
        var replayPp2 = it.get(osuScoreAlias2.PP, Double::class.java)

        // Swap both sides so that replay 1 is always the older one
        if (replayDate1.isAfter(replayDate2)) {
            val tempReplayId = replayId1
            replayId1 = replayId2
            replayId2 = tempReplayId

            val tempUserId = userId1
            userId1 = userId2
            userId2 = tempUserId

            val tempUsername = username1
            username1 = username2
            username2 = tempUsername

            val tempReplayDate = replayDate1
            replayDate1 = replayDate2
            replayDate2 = tempReplayDate

            val tempReplayPp = replayPp1
            replayPp1 = replayPp2
            replayPp2 = tempReplayPp
        }

        SimilarReplayEntry(
            replay_id_1 = replayId1,
            replay_id_2 = replayId2,
            user_id_1 = userId1,
            user_id_2 = userId2,
            username_1 = username1,
            username_2 = username2,
            beatmap_beatmapset_id = it.get(BEATMAPS.BEATMAPSET_ID, Long::class.java),
            replay_date_1 = Format.formatLocalDateTime(replayDate1),
            replay_date_2 = Format.formatLocalDateTime(replayDate2),
            replay_pp_1 = replayPp1,
            replay_pp_2 = replayPp2,
            beatmap_id = it.get(BEATMAPS.BEATMAP_ID, Long::class.java),
            beatmap_title = it.get(BEATMAPS.TITLE, String::class.java),
            beatmap_star_rating = it.get(BEATMAPS.STAR_RATING, Double::class.java),
            similarity = it.get(SCORES_SIMILARITY.SIMILARITY, Double::class.java)
        )
    }.distinctBy {
        // Deduplicate symmetric pairs via an order-independent key
        val (smallerId, largerId) = listOf(it.replay_id_1, it.replay_id_2).sorted()
        "$smallerId-$largerId"
    }.sortedWith(compareBy({ it.replay_date_2 }, { it.similarity })).reversed()

    fun loadComparableReplayData(replayData: ReplayData) {
        // Total samples on the same beatmap, excluding this replay
        val totalSamples = dslContext.fetchCount(
            SCORES, SCORES.BEATMAP_ID.eq(replayData.beatmap_id).and(SCORES.REPLAY_ID.notEqual(replayData.replay_id))
        )

        if (totalSamples <= 0) {
            return
        }

        // Average the error statistics of every other score on the same beatmap
        val otherScores = dslContext.select(
            avg(SCORES.MEAN_ERROR).`as`("avg_mean_error"),
            avg(SCORES.ERROR_VARIANCE).`as`("avg_error_variance"),
            avg(SCORES.ERROR_STANDARD_DEVIATION).`as`("avg_error_standard_deviation"),
            avg(SCORES.MINIMUM_ERROR).`as`("avg_minimum_error"),
            avg(SCORES.MAXIMUM_ERROR).`as`("avg_maximum_error"),
            avg(SCORES.ERROR_RANGE).`as`("avg_error_range"),
            avg(SCORES.ERROR_COEFFICIENT_OF_VARIATION).`as`("avg_error_coefficient_of_variation"),
            avg(SCORES.ERROR_KURTOSIS).`as`("avg_error_kurtosis"),
            avg(SCORES.ERROR_SKEWNESS).`as`("avg_error_skewness")
        )
            .from(SCORES)
            .where(SCORES.BEATMAP_ID.eq(replayData.beatmap_id))
            .and(SCORES.REPLAY_ID.notEqual(replayData.replay_id))
            .fetchOne() ?: return

        replayData.comparable_samples = totalSamples

        replayData.comparable_mean_error = otherScores.get("avg_mean_error", Double::class.java)
        replayData.comparable_error_variance = otherScores.get("avg_error_variance", Double::class.java)
        replayData.comparable_error_standard_deviation = otherScores.get("avg_error_standard_deviation", Double::class.java)
        replayData.comparable_minimum_error = otherScores.get("avg_minimum_error", Double::class.java)
        replayData.comparable_maximum_error = otherScores.get("avg_maximum_error", Double::class.java)
        replayData.comparable_error_range = otherScores.get("avg_error_range", Double::class.java)
        replayData.comparable_error_coefficient_of_variation = otherScores.get("avg_error_coefficient_of_variation", Double::class.java)
        replayData.comparable_error_kurtosis = otherScores.get("avg_error_kurtosis", Double::class.java)
        replayData.comparable_error_skewness = otherScores.get("avg_error_skewness", Double::class.java)
    }

    fun getHitDistribution(scoreId: Int): Map<Int, DistributionEntry> {
        val judgements = dslContext.selectFrom(SCORES_JUDGEMENTS)
            .where(SCORES_JUDGEMENTS.SCORE_ID.eq(scoreId))
            .fetchInto(ScoresJudgementsRecord::class.java)

        val errorDistribution = mutableMapOf<Int, MutableMap<String, Int>>()
        var totalHits = 0

        judgements.forEach { hit ->
            // Bucket each error into 2 ms bins
            val error = (hit.error!!.roundToInt() / 2) * 2
            val judgementType = hit.type
            errorDistribution.getOrPut(error) { mutableMapOf("Miss" to 0, "300" to 0, "100" to 0, "50" to 0) }
                .apply {
                    this[judgementType.toString()] = this.getOrDefault(judgementType.toString(), 0) + 1
                }
            totalHits += 1
        }

        return errorDistribution.mapValues { (_, judgementCounts) ->
            DistributionEntry(
                percentageMiss = (judgementCounts.getOrDefault("Miss", 0).toDouble() / totalHits) * 100,
                percentage300 = (judgementCounts.getOrDefault("300", 0).toDouble() / totalHits) * 100,
                percentage100 = (judgementCounts.getOrDefault("100", 0).toDouble() / totalHits) * 100,
                percentage50 = (judgementCounts.getOrDefault("50", 0).toDouble() / totalHits) * 100
            )
        }
    }

}
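`getHitDistribution` above buckets each hit error by rounding it to the nearest integer and then truncating to a multiple of 2 via integer division (`(error / 2) * 2`). A standalone sketch of that bucketing (class and method names are illustrative, not from the codebase):

```java
import java.util.Map;
import java.util.TreeMap;

public class ErrorBucketsExample {

    // Mirrors the Kotlin expression (hit.error!!.roundToInt() / 2) * 2:
    // round to the nearest millisecond, then truncate toward zero to a
    // multiple of 2 ms.
    static int bucket(double errorMs) {
        return ((int) Math.round(errorMs) / 2) * 2;
    }

    // Counts how many hit errors land in each 2 ms bucket.
    static Map<Integer, Integer> histogram(double[] errors) {
        Map<Integer, Integer> counts = new TreeMap<>();
        for (double e : errors) {
            counts.merge(bucket(e), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(histogram(new double[]{-3.4, -1.2, 0.9, 2.1, 3.0}));
    }
}
```

One consequence of truncating toward zero: both -1 ms and +1 ms map to the 0 bucket, so the bin around zero is effectively wider than the others.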
@ -0,0 +1,24 @@
package com.nisemoe.nise.database

import com.nisemoe.generated.tables.references.BEATMAPS
import com.nisemoe.generated.tables.references.SCORES
import com.nisemoe.generated.tables.references.SCORES_SIMILARITY
import com.nisemoe.generated.tables.references.USERS
import com.nisemoe.nise.Statistics
import org.jooq.DSLContext
import org.springframework.stereotype.Service

@Service
class StatisticsService(private val dslContext: DSLContext) {

    fun getStatistics(): Statistics {
        return Statistics(
            dslContext.fetchCount(BEATMAPS),
            dslContext.fetchCount(USERS),
            dslContext.fetchCount(SCORES),
            dslContext.fetchCount(SCORES, SCORES.REPLAY.isNotNull),
            dslContext.fetchCount(SCORES_SIMILARITY)
        )
    }

}
@ -0,0 +1,129 @@
package com.nisemoe.nise.database

import com.nisemoe.generated.tables.records.UsersRecord
import com.nisemoe.generated.tables.references.USERS
import com.nisemoe.nise.Format
import com.nisemoe.nise.UserDetails
import com.nisemoe.nise.osu.OsuApi
import com.nisemoe.nise.osu.OsuApiModels
import org.jooq.DSLContext
import org.slf4j.LoggerFactory
import org.springframework.stereotype.Service
import java.time.LocalDateTime
import java.time.OffsetDateTime

@Service
class UserService(
    private val dslContext: DSLContext,
    private val osuApi: OsuApi
) {

    private val logger = LoggerFactory.getLogger(javaClass)

    fun mapUserToDatabase(dto: OsuApiModels.UserExtended): UserDetails {
        return UserDetails(
            user_id = dto.id,
            username = dto.username,
            rank = dto.statistics?.global_rank,
            pp_raw = dto.statistics?.pp,
            join_date = if (dto.join_date != null) Format.formatLocalDateTime(OffsetDateTime.parse(dto.join_date).toLocalDateTime()) else null,
            seconds_played = dto.statistics?.play_time,
            country = dto.country?.code,
            country_rank = dto.statistics?.country_rank,
            playcount = dto.statistics?.play_count
        )
    }

    fun getUserDetails(username: String): UserDetails? {
        val user = dslContext.selectFrom(USERS)
            .where(USERS.USERNAME.equalIgnoreCase(username))
            .fetchOneInto(UsersRecord::class.java)

        if (user != null) {
            return UserDetails(
                user.userId!!,
                user.username!!,
                user.rank,
                user.ppRaw,
                user.joinDate?.let { Format.formatLocalDateTime(it) },
                user.secondsPlayed,
                user.country,
                user.countryRank,
                user.playcount
            )
        }

        // The database does NOT have the user; fall back to the osu!api
        val apiUser = this.osuApi.getUserProfile(userId = username, mode = "osu", key = "username")
            ?: return null

        // Persist to the database
        insertApiUser(apiUser)

        return this.mapUserToDatabase(apiUser)
    }

    fun insertApiUser(apiUser: OsuApiModels.UserExtended) {
        this.logger.debug("Saving user ${apiUser.username}")

        if (dslContext.fetchExists(USERS, USERS.USER_ID.eq(apiUser.id))) {
            dslContext.update(USERS)
                .set(USERS.USERNAME, apiUser.username)
                .set(USERS.RANK, apiUser.statistics?.global_rank)
                .set(USERS.PP_RAW, apiUser.statistics?.pp)
                .set(USERS.ACCURACY, apiUser.statistics?.hit_accuracy)
                .set(USERS.TOTAL_SCORE, apiUser.statistics?.total_score)
                .set(USERS.RANKED_SCORE, apiUser.statistics?.ranked_score)
                .set(USERS.COUNT_300, apiUser.statistics?.count_300)
                .set(USERS.COUNT_100, apiUser.statistics?.count_100)
                .set(USERS.COUNT_50, apiUser.statistics?.count_50)
                .apply {
                    if (apiUser.join_date != null) {
                        set(USERS.JOIN_DATE, OffsetDateTime.parse(apiUser.join_date).toLocalDateTime())
                    }
                }
                .set(USERS.SECONDS_PLAYED, apiUser.statistics?.play_time)
                .set(USERS.COUNTRY, apiUser.country?.code)
                .set(USERS.COUNTRY_RANK, apiUser.statistics?.country_rank)
                .set(USERS.PLAYCOUNT, apiUser.statistics?.play_count)
                .set(USERS.SYS_LAST_UPDATE, LocalDateTime.now())
                .where(USERS.USER_ID.eq(apiUser.id))
                .execute()

            return
        }

        val affectedRows = dslContext.insertInto(USERS)
            .set(USERS.USER_ID, apiUser.id)
            .set(USERS.USERNAME, apiUser.username)
            .set(USERS.RANK, apiUser.statistics?.global_rank)
            .set(USERS.PP_RAW, apiUser.statistics?.pp)
            .set(USERS.ACCURACY, apiUser.statistics?.hit_accuracy)
            .set(USERS.TOTAL_SCORE, apiUser.statistics?.total_score)
            .set(USERS.RANKED_SCORE, apiUser.statistics?.ranked_score)
            .set(USERS.COUNT_300, apiUser.statistics?.count_300)
            .set(USERS.COUNT_100, apiUser.statistics?.count_100)
            .set(USERS.COUNT_50, apiUser.statistics?.count_50)
            .apply {
                if (apiUser.join_date != null) {
                    set(USERS.JOIN_DATE, OffsetDateTime.parse(apiUser.join_date).toLocalDateTime())
                }
            }
            .set(USERS.SECONDS_PLAYED, apiUser.statistics?.play_time)
            .set(USERS.COUNTRY, apiUser.country?.code)
            .set(USERS.COUNTRY_RANK, apiUser.statistics?.country_rank)
            .set(USERS.PLAYCOUNT, apiUser.statistics?.play_count)
            .set(USERS.SYS_LAST_UPDATE, LocalDateTime.now())
            .onDuplicateKeyIgnore()
            .execute()

        if (affectedRows == 0) {
            this.logger.error("Tried to insert ${apiUser.username} but the insert affected no rows.")
            this.logger.error(apiUser.toString())
        }
    }

}
@ -0,0 +1,172 @@
package com.nisemoe.nise.integrations

import com.nisemoe.nise.osu.Mod
import com.nisemoe.nise.scheduler.ImportScores
import kotlinx.serialization.SerialName
import kotlinx.serialization.Serializable
import kotlinx.serialization.json.Json
import org.springframework.beans.factory.annotation.Value
import org.springframework.stereotype.Service
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.util.concurrent.CompletableFuture

@Service
class CircleguardService {

    @Value("\${CIRCLEGUARD_API_URL:http://localhost:5000}")
    private var apiUrl: String = "http://localhost:5000"

    private val httpClient: HttpClient = HttpClient.newBuilder()
        .version(HttpClient.Version.HTTP_2)
        .build()

    private val serializer = Json

    @Serializable
    data class ReplayRequest(
        val replay_data: String,
        val mods: Int,
        val beatmap_id: Int
    )

    @Serializable
    data class ScoreJudgement(
        val time: Double,
        val x: Double,
        val y: Double,
        val type: JudgementType,
        val distance_center: Double,
        val distance_edge: Double,
        val error: Double
    )

    @Serializable
    enum class JudgementType {
        @SerialName("Hit300")
        THREE_HUNDRED,

        @SerialName("Hit100")
        ONE_HUNDRED,

        @SerialName("Hit50")
        FIFTY,

        @SerialName("Miss")
        MISS
    }

    @Serializable
    data class ReplayResponse(
        val ur: Double?,
        val adjusted_ur: Double?,
        val frametime: Double?,
        val edge_hits: Int?,
        val snaps: Int?,
        var mean_error: Double?,
        var error_variance: Double?,
        var error_standard_deviation: Double?,
        var minimum_error: Double?,
        var maximum_error: Double?,
        var error_range: Double?,
        var error_coefficient_of_variation: Double?,
        var error_kurtosis: Double?,
        var error_skewness: Double?,
        val judgements: List<ScoreJudgement>
    )

    fun postProcessReplay(replayResponse: ReplayResponse, mods: Int = 0) {
        var conversionFactor = 1.0
        if(Mod.containsMod(mods, Mod.DT)) {
            conversionFactor = (1 / 1.5)
        } else if(Mod.containsMod(mods, Mod.HT)) {
            conversionFactor = (1 / 0.75)
        }

        replayResponse.mean_error = replayResponse.mean_error?.times(conversionFactor)
        replayResponse.error_variance = replayResponse.error_variance?.times(conversionFactor)
        replayResponse.error_standard_deviation = replayResponse.error_standard_deviation?.times(conversionFactor)
        replayResponse.error_coefficient_of_variation = replayResponse.error_coefficient_of_variation?.times(conversionFactor)
        replayResponse.error_kurtosis = replayResponse.error_kurtosis?.times(conversionFactor)
        replayResponse.error_skewness = replayResponse.error_skewness?.times(conversionFactor)
    }

    fun processReplay(replayData: String, beatmapId: Int, mods: Int = 0): CompletableFuture<ReplayResponse> {
        val requestUri = "$apiUrl/replay"

        val request = ReplayRequest(
            replay_data = replayData,
            mods = mods,
            beatmap_id = beatmapId
        )

        // Serialize the request object to JSON
        val requestBody = serializer.encodeToString(ReplayRequest.serializer(), request)

        val httpRequest = HttpRequest.newBuilder()
            .uri(URI.create(requestUri))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(requestBody))
            .build()

        return httpClient.sendAsync(httpRequest, HttpResponse.BodyHandlers.ofString())
            .thenApply { response: HttpResponse<String> ->
                if (response.statusCode() == 200) {
                    // Deserialize the JSON response to a ReplayResponse object
                    val decodedReplay = serializer.decodeFromString(ReplayResponse.serializer(), response.body())
                    postProcessReplay(decodedReplay, mods)
                    decodedReplay
                } else {
                    throw RuntimeException("Failed to process replay: ${response.body()}")
                }
            }
    }

    @Serializable
    data class SimilarityRequest(
        val replays: List<ImportScores.ReplayDto>,
    )

    @Serializable
    data class SimilarityResponse(
        val result: List<SimilarityResponseEntry>
    )

    @Serializable
    data class SimilarityResponseEntry(
        val replay_id_1: Long,
        val replay_id_2: Long,

        val similarity: Double,
        val correlation: Double
    )

    fun processSimilarity(replays: List<ImportScores.ReplayDto>): CompletableFuture<SimilarityResponse> {
        val requestUri = "$apiUrl/similarity"

        val request = SimilarityRequest(
            replays = replays
        )

        // Serialize the request object to JSON
        val requestBody = serializer.encodeToString(SimilarityRequest.serializer(), request)

        val httpRequest = HttpRequest.newBuilder()
            .uri(URI.create(requestUri))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(requestBody))
            .build()

        return httpClient.sendAsync(httpRequest, HttpResponse.BodyHandlers.ofString())
            .thenApply { response: HttpResponse<String> ->
                if (response.statusCode() == 200) {
                    serializer.decodeFromString(SimilarityResponse.serializer(), response.body())
                } else {
                    throw RuntimeException("Failed to process similarity: ${response.body()}")
                }
            }
    }

}
@ -0,0 +1,102 @@
package com.nisemoe.nise.integrations

import kotlinx.serialization.Serializable
import kotlinx.serialization.encodeToString
import kotlinx.serialization.json.Json
import kotlinx.serialization.json.buildJsonArray
import kotlinx.serialization.json.buildJsonObject
import kotlinx.serialization.json.encodeToJsonElement
import org.slf4j.LoggerFactory
import org.springframework.stereotype.Service
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.time.ZonedDateTime
import java.time.format.DateTimeFormatter

/**
 * Docs @ https://discord.com/developers/docs/resources/channel#embed-object
 */
@Serializable
data class DiscordEmbed(
    var title: String? = null,
    var description: String? = null,
    var url: String? = null,
    var timestamp: String? = null,
    var color: Int? = null,
    var fields: MutableList<DiscordEmbedField> = ArrayList(),

    var author: DiscordEmbedAuthor? = null,
    var image: DiscordEmbedImage? = null
) {

    fun setTimestamp() {
        val dateTimeFormatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSZ")
        this.timestamp = ZonedDateTime.now().format(dateTimeFormatter)
    }

    fun setImageUrl(url: String) {
        this.image = DiscordEmbedImage(url)
    }

    fun setAuthor(name: String, url: String? = null, iconUrl: String? = null) {
        this.author = DiscordEmbedAuthor(name, url, iconUrl)
    }

    fun addEmbed(name: String, value: String, inline: Boolean = true) {
        this.fields.add(DiscordEmbedField(name, value, inline))
    }

}

@Serializable
data class DiscordEmbedImage(
    val url: String
)

@Serializable
data class DiscordEmbedAuthor(
    val name: String,
    val url: String? = null,
    val icon_url: String? = null
)

@Serializable
data class DiscordEmbedField(
    val name: String,
    val value: String,
    val inline: Boolean
)

@Service
class DiscordService {

    private val logger = LoggerFactory.getLogger(javaClass)

    private val httpClient: HttpClient = HttpClient.newBuilder()
        .version(HttpClient.Version.HTTP_1_1)
        .build()

    fun sendEmbeds(webhookUrl: String, embeds: List<DiscordEmbed>): Boolean {
        logger.debug("Sending Discord webhook with embed")

        val requestBody = buildJsonObject {
            put("embeds", buildJsonArray {
                embeds.forEach { add(Json.encodeToJsonElement(it)) }
            })
        }

        val request = HttpRequest.newBuilder()
            .uri(URI.create(webhookUrl))
            .POST(HttpRequest.BodyPublishers.ofString(Json.encodeToString(requestBody)))
            .header("Content-Type", "application/json")
            .build()

        val response = httpClient.send(request, HttpResponse.BodyHandlers.ofString())
        return response.statusCode() == 204
    }

}
108
nise-backend/src/main/kotlin/com/nisemoe/nise/osu/Mod.kt
Normal file
@ -0,0 +1,108 @@
package com.nisemoe.nise.osu

enum class Mod(val value: Int) {
    NM(0), // NoMod
    NF(1 shl 0), // NoFail
    EZ(1 shl 1), // Easy
    TD(1 shl 2), // TouchDevice
    HD(1 shl 3), // Hidden
    HR(1 shl 4), // HardRock
    SD(1 shl 5), // SuddenDeath
    DT(1 shl 6), // DoubleTime
    RX(1 shl 7), // Relax
    HT(1 shl 8), // HalfTime
    _NC(1 shl 9), // _Nightcore, technical version
    NC(_NC.value or DT.value), // Nightcore, as defined in-game
    FL(1 shl 10), // Flashlight
    AT(1 shl 11), // Autoplay
    SO(1 shl 12), // SpunOut
    AP(1 shl 13), // Autopilot
    _PF(1 shl 14), // _Perfect, technical version
    PF(_PF.value or SD.value), // Perfect, as defined in-game
    K4(1 shl 15), // Key4
    K5(1 shl 16), // Key5
    K6(1 shl 17), // Key6
    K7(1 shl 18), // Key7
    K8(1 shl 19), // Key8
    FI(1 shl 20), // FadeIn
    RD(1 shl 21), // Random
    CN(1 shl 22), // Cinema
    TP(1 shl 23), // Target
    K9(1 shl 24), // Key9
    CO(1 shl 25), // KeyCoop
    K1(1 shl 26), // Key1
    K3(1 shl 27), // Key3
    K2(1 shl 28), // Key2
    V2(1 shl 29), // ScoreV2
    MR(1 shl 30); // Mirror

    companion object {

        private val ORDER = listOf(
            NM, EZ, HD, HT, DT, NC, HR, FL, NF, SD, PF, RX, AP, SO, AT,
            V2, TD, // we stop caring about order after this point
            FI, RD, CN, TP, K1, K2, K3, K4, K5, K6, K7, K8, K9, CO, MR
        )

        private fun orderMods(mods: Set<Mod>): List<Mod> {
            val modsMutable = mods.toMutableSet()

            // Replace _NC with NC if present
            if (_NC in modsMutable) {
                modsMutable.remove(_NC)
                modsMutable.add(NC)
            }

            // Replace _PF with PF if present
            if (_PF in modsMutable) {
                modsMutable.remove(_PF)
                modsMutable.add(PF)
            }

            val orderedMods = ORDER.filter { it in modsMutable }.toMutableList()

            // Additional logic for removing DT when _NC is replaced by NC
            // and removing SD when _PF is replaced by PF, if necessary
            if (NC in orderedMods && DT in orderedMods) {
                orderedMods -= DT
            }
            if (PF in orderedMods && SD in orderedMods) {
                orderedMods -= SD
            }
            if(NM in orderedMods) {
                orderedMods -= NM
            }

            return orderedMods
        }

        fun print(combinationValue: Int): String {
            return parseModCombination(combinationValue).joinToString("")
        }

        fun parseModCombination(combinationValue: Int): List<String> {
            // First, determine which mods are present in the combination
            val modsPresent = entries
                .filter { mod -> (combinationValue and mod.value) == mod.value }
                .toSet()

            // Then, order these mods according to the predefined order
            val orderedMods = orderMods(modsPresent)

            // Finally, return the names of the ordered mods
            return orderedMods.map { it.name }
        }

        fun combineModStrings(modStrings: List<String>): Int {
            return modStrings
                .mapNotNull { modString -> entries.firstOrNull { it.name == modString }?.value }
                .fold(0) { acc, modValue -> acc or modValue }
        }

        fun containsMod(number: Int, mod: Mod): Boolean {
            return (number and mod.value) == mod.value
        }

    }

}
247
nise-backend/src/main/kotlin/com/nisemoe/nise/osu/OsuApi.kt
Normal file
@ -0,0 +1,247 @@
package com.nisemoe.nise.osu

import kotlinx.serialization.ExperimentalSerializationApi
import kotlinx.serialization.builtins.ListSerializer
import kotlinx.serialization.json.Json
import org.slf4j.LoggerFactory
import org.springframework.beans.factory.annotation.Value
import org.springframework.stereotype.Service
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

@Service
class OsuApi(
    private val tokenService: TokenService
) {

    private val logger = LoggerFactory.getLogger(javaClass)

    @Value("\${OSU_API_KEY}")
    private lateinit var osuApiKey: String

    @OptIn(ExperimentalSerializationApi::class)
    private val serializer = Json { ignoreUnknownKeys = true; explicitNulls = false }

    private val httpClient: HttpClient = HttpClient.newBuilder()
        .version(HttpClient.Version.HTTP_2)
        .build()

    fun doRequest(url: String, queryParams: Map<String, Any>, authorized: Boolean = true, appendToUrl: String? = null): HttpResponse<String>? {
        var accessToken: TokenService.AccessTokenResponse? = null
        if(authorized)
            accessToken = this.tokenService.getAccessToken()

        val uriBuilder = StringBuilder(url)
        queryParams.forEach { (key, value) ->
            uriBuilder.append("$key=$value&")
        }

        if(appendToUrl != null)
            uriBuilder.append(appendToUrl)

        // Remove the trailing '&' or '?' left over when nothing was appended after it
        if (uriBuilder.endsWith('&') || uriBuilder.endsWith('?'))
            uriBuilder.setLength(uriBuilder.length - 1)

        val request = HttpRequest.newBuilder()
            .uri(URI.create(uriBuilder.toString()))
            .header("Accept", "application/json")
            .header("Content-Type", "application/json")
            .header("x-api-version", "20220704")
            .also { if(authorized) it.header("Authorization", "Bearer ${accessToken!!.access_token}") }
            .GET()
            .build()

        return this.sendWithRetry(request)
    }

    fun getReplay(scoreId: Long): OsuApiModels.ReplayResponse? {
        val queryParams = mapOf(
            "k" to this.osuApiKey,
            "s" to scoreId,
            "m" to 0 // [osu!std]
        )
        val response = this.doRequest("https://osu.ppy.sh/api/get_replay?", queryParams)
        if(response == null) {
            this.logger.info("Error loading replay data")
            return null
        }

        return if (response.statusCode() == 200) {
            try {
                serializer.decodeFromString(OsuApiModels.ReplayResponse.serializer(), response.body())
            } catch(exception: Exception) {
                this.logger.error(response.body())
                this.logger.error(exception.stackTraceToString())
                return null
            }
        } else {
            null
        }
    }

    fun getTopBeatmapScores(beatmapId: Int): OsuApiModels.BeatmapScores? {
        // TODO: Some beatmaps (beatmapId = 736729) return no results if mode is set to anything (specifically osu)
        val queryParams = mapOf(
            "mode" to "osu",
            "limit" to "100"
        )

        val response = doRequest("https://osu.ppy.sh/api/v2/beatmaps/$beatmapId/scores?", queryParams)
        if(response == null) {
            this.logger.info("Error loading top beatmap scores")
            return null
        }

        return if (response.statusCode() == 200) {
            serializer.decodeFromString(OsuApiModels.BeatmapScores.serializer(), response.body())
        } else {
            null
        }
    }

    fun getTopUserScores(userId: Long): List<OsuApiModels.Score>? {
        val queryParams = mapOf(
            "mode" to "osu",
            "limit" to "100"
        )
        val response = doRequest("https://osu.ppy.sh/api/v2/users/$userId/scores/best?", queryParams)
        if(response == null) {
            this.logger.info("Error loading top user scores")
            return null
        }

        return if (response.statusCode() == 200) {
            serializer.decodeFromString(ListSerializer(OsuApiModels.Score.serializer()), response.body())
        } else {
            null
        }
    }

    fun searchBeatmapsets(cursor: OsuApiModels.BeatmapsetSearchResultCursor?): OsuApiModels.BeatmapsetSearchResult? {
        val queryParams = mutableMapOf(
            "s" to "ranked", // Status [only ranked]
            "m" to "0", // Mode [osu!std]
            "nsfw" to "true", // Include NSFW
            "sort" to "plays_desc", // Sort order
        )
        val response = this.doRequest("https://osu.ppy.sh/api/v2/beatmapsets/search/?", queryParams, appendToUrl = cursor?.toString())
        if(response == null) {
            this.logger.info("Error loading beatmapset")
            return null
        }

        return if (response.statusCode() == 200) {
            serializer.decodeFromString(OsuApiModels.BeatmapsetSearchResult.serializer(), response.body())
        } else {
            null
        }
    }

    fun checkIfUserBanned(userId: Long): Boolean? {
        val response = this.doRequest("https://osu.ppy.sh/api/v2/users/$userId/osu?key=id", mapOf())
        if(response == null) {
            this.logger.info("Error loading user with userId = $userId")
            return null
        }

        return response.statusCode() == 404
    }

    fun getUsersBatch(userIds: List<Long>): OsuApiModels.UserBatchResponse? {
        val urlBuilder = StringBuilder("https://osu.ppy.sh/api/v2/users?")
        userIds.forEach { id ->
            urlBuilder.append("ids[]=$id&")
        }
        val url = urlBuilder.toString().removeSuffix("&")

        val response = this.doRequest(url, emptyMap())
        if(response == null) {
            this.logger.info("Error loading users")
            return null
        }

        return if (response.statusCode() == 200) {
            serializer.decodeFromString(OsuApiModels.UserBatchResponse.serializer(), response.body())
        } else {
            null
        }
    }

    fun getUserProfile(userId: String, mode: String? = null, key: String? = null): OsuApiModels.UserExtended? {
        val accessToken = this.tokenService.getAccessToken()

        val encodedUserId = userId.replace(' ', '_')
        val uriBuilder = StringBuilder("https://osu.ppy.sh/api/v2/users/$encodedUserId")

        if (mode != null) uriBuilder.append("/$mode")
        if (key != null) uriBuilder.append("?key=$key")

        val request = HttpRequest.newBuilder()
            .uri(URI.create(uriBuilder.toString()))
            .header("Accept", "application/json")
            .header("Content-Type", "application/json")
            .header("Authorization", "Bearer ${accessToken.access_token}")
            .GET()
            .build()

        val response = this.sendWithRetry(request)
        if(response == null) {
            this.logger.info("Error loading user with userId = $userId")
            return null
        }

        return if (response.statusCode() == 200) {
            serializer.decodeFromString(OsuApiModels.UserExtended.serializer(), response.body())
        } else {
            null
        }
    }

    var rateLimitRemaining: Long = 0L
    var rateLimitTotal: Long = 0L

    fun sendWithRetry(request: HttpRequest): HttpResponse<String>? {
        val waitTimes = listOf(15L, 30L, 60L)

        for (waitTime in waitTimes) {
            val response = httpClient.send(request, HttpResponse.BodyHandlers.ofString())

            this.logger.debug("Request: {}", request.uri())
            this.logger.debug("Result: {}", response.statusCode())
            this.logger.debug("")

            when(response.statusCode()) {
                401 -> {
                    // A 401 means the access token has expired; refresh it and retry.
                    this.tokenService.getNewTokenAndSave()
                    continue
                }
                429 -> {
                    // Rate limited: wait for the specified time before retrying
                    this.logger.warn("Received 429 [${request.uri()}], waiting for $waitTime seconds before retrying...")
                    Thread.sleep(waitTime * 1000)
                }
                else -> {
                    val rateLimitRemainingCount = response.headers().firstValueAsLong("x-ratelimit-remaining")
                    if(rateLimitRemainingCount.isPresent)
                        this.rateLimitRemaining = rateLimitRemainingCount.asLong

                    val rateLimitTotalCount = response.headers().firstValueAsLong("x-ratelimit-limit")
                    if(rateLimitTotalCount.isPresent)
                        this.rateLimitTotal = rateLimitTotalCount.asLong

                    this.logger.debug("Rate limit remaining: $rateLimitRemaining/$rateLimitTotal")

                    return response
                }
            }
        }

        this.logger.error("Failed to get a successful response. URL: ${request.uri()}")
        return null
    }

}
@ -0,0 +1,173 @@
package com.nisemoe.nise.osu

import kotlinx.serialization.SerialName
import kotlinx.serialization.Serializable

class OsuApiModels {

    @Serializable
    data class UserCountry(
        val code: String,
        val name: String
    )

    @Serializable
    data class UserStatistics(
        val count_100: Long,
        val count_300: Long,
        val count_50: Long,
        val count_miss: Long,
        val country_rank: Long?,
        val global_rank: Long?,
        val pp: Double,
        val ranked_score: Long,
        val hit_accuracy: Double,
        val play_count: Long,
        val play_time: Long,
        val total_score: Long,
        val total_hits: Long
    )

    @Serializable
    data class UserExtended(
        // Documentation: https://osu.ppy.sh/docs/index.html#user
        val avatar_url: String,
        val id: Long,
        val username: String,

        // Documentation: https://osu.ppy.sh/docs/index.html#userextended
        val join_date: String?,

        val statistics: UserStatistics?,

        val country: UserCountry?
    )

    @Serializable
    data class UserBatchResponse(
        val users: List<UserExtended>
    )

    @Serializable
    data class BeatmapsetSearchResultCursor(
        val play_count: Long?,
        val id: Long?
    ) {

        override fun toString(): String {
            return "cursor[play_count]=$play_count&cursor[id]=$id"
        }

    }

    @Serializable
    data class BeatmapsetSearchResult(
        val beatmapsets: List<BeatmapsetCompact>,
        val total: Long?,
        val cursor: BeatmapsetSearchResultCursor?,
        val cursor_string: String?
    )

    @Serializable
    data class ScoreBeatmap(
        val id: Int,
        val difficulty_rating: Double?,
        val version: String?
    )

    @Serializable
    data class ScoreBeatmapset(
        val id: Int,
        val title: String?,
        val artist: String?,
        val creator: String?,
        val source: String?
    )

    @Serializable
    data class Score(
        val id: Long?,
        val best_id: Long?,
        val user_id: Long,
        val accuracy: Double,
        val mods: List<String>,
        val score: Long,
        val max_combo: Int,
        val perfect: Boolean,
        val statistics: Statistics,
        val pp: Double?,
        val rank: Grade,
        val created_at: String,
        val replay: Boolean,
        val beatmap: ScoreBeatmap?,
        val beatmapset: ScoreBeatmapset?
    )

    @Serializable
    data class Statistics(
        val count_50: Int?,
        val count_100: Int?,
        val count_300: Int?,
        val count_miss: Int?
    )

    @Serializable
    data class BeatmapScores(
        val scores: List<Score>
    )

    @Serializable
    enum class Grade {
        @SerialName("XH")
        SSH,

        @SerialName("X")
        SS,

        @SerialName("SH")
        SH,

        @SerialName("S")
        S,

        @SerialName("A")
        A,

        @SerialName("B")
        B,

        @SerialName("C")
        C,

        @SerialName("D")
        D,

        @SerialName("F")
        F
    }

    @Serializable
    data class BeatmapCompact(
        val difficulty_rating: Double,
        val id: Int,
        val mode: String,
        val version: String,
        val beatmapset_id: Long
    )

    @Serializable
    data class BeatmapsetCompact(
        val artist: String,
        val creator: String,
        val id: Long,
        val source: String,
        val title: String,
        val beatmaps: List<BeatmapCompact>?
    )

    @Serializable
    data class ReplayResponse(
        val content: String
    )

}
@ -0,0 +1,122 @@
package com.nisemoe.nise.osu

import com.nisemoe.nise.AllowCacheSerialization
import com.nisemoe.nise.service.CacheService
import java.net.URI
import java.net.URLEncoder
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import java.nio.charset.StandardCharsets
import java.time.Instant
import kotlinx.serialization.Serializable
import kotlinx.serialization.json.Json
import org.slf4j.LoggerFactory
import org.springframework.beans.factory.InitializingBean
import org.springframework.beans.factory.annotation.Value
import org.springframework.stereotype.Service
import java.time.Duration

@Service
class TokenService(
    private val cacheService: CacheService
) : InitializingBean {

    private val logger = LoggerFactory.getLogger(javaClass)

    private val httpClient: HttpClient = HttpClient.newBuilder()
        .version(HttpClient.Version.HTTP_2)
        .build()

    @Serializable
    @AllowCacheSerialization
    data class AccessTokenResponse(
        val token_type: String,
        val expires_in: Int,
        val access_token: String
    )

    @Serializable
    @AllowCacheSerialization
    data class TokenWithExpiry(
        val accessTokenResponse: AccessTokenResponse,
        val expiryTime: Long
    )

    @Value("\${OSU_CLIENT_SECRET}")
    private lateinit var clientSecret: String

    @Value("\${OSU_CLIENT_ID}")
    private lateinit var clientId: String

    private var currentToken: TokenWithExpiry? = null

    override fun afterPropertiesSet() {
        val tokenFromCache = this.cacheService.getVariable("osuToken", TokenWithExpiry::class.java)
        if(tokenFromCache != null) {
            currentToken = tokenFromCache
            this.logger.debug("Loaded osu! token from cache: {}", currentToken)
        }
    }

    fun getAccessToken(): AccessTokenResponse {
        if(currentToken == null)
            return getNewTokenAndSave()

        val now = Instant.now().epochSecond
        val remainingTime = currentToken!!.expiryTime.minus(now)

        // Refresh the token if it is expired or has less than 5 minutes left
        return if (remainingTime <= 300) {
            getNewTokenAndSave()
        } else {
            currentToken!!.accessTokenResponse
        }
    }

    fun getNewTokenAndSave(): AccessTokenResponse {
        val newToken = getNewAccessToken()
        val expiryTime = calculateExpiryTime(newToken)
        val tokenWithExpiry = TokenWithExpiry(newToken, expiryTime)
        currentToken = tokenWithExpiry
        this.cacheService.setVariable(
            key = "osuToken",
            value = tokenWithExpiry,
            ttl = Duration.ofSeconds(expiryTime - Instant.now().epochSecond)
        )
        return newToken
    }

    private fun calculateExpiryTime(tokenResponse: AccessTokenResponse): Long =
        Instant.now().epochSecond + tokenResponse.expires_in - 120

    private fun getNewAccessToken(): AccessTokenResponse {
        val parameters = mapOf(
            "client_id" to clientId,
            "client_secret" to clientSecret,
            "grant_type" to "client_credentials",
            "scope" to "public"
        )

        val form = parameters.map { (k, v) ->
            "${URLEncoder.encode(k, StandardCharsets.UTF_8)}=${URLEncoder.encode(v, StandardCharsets.UTF_8)}"
        }.joinToString("&")

        val request = HttpRequest.newBuilder()
            .uri(URI.create("https://osu.ppy.sh/oauth/token"))
            .header("Accept", "application/json")
            .header("Content-Type", "application/x-www-form-urlencoded")
            .POST(HttpRequest.BodyPublishers.ofString(form))
            .build()

        val response = httpClient.send(request, HttpResponse.BodyHandlers.ofString())

        if (response.statusCode() == 200) {
            return Json.decodeFromString(AccessTokenResponse.serializer(), response.body())
        } else {
            throw RuntimeException("Failed to obtain access token: ${response.body()}")
        }
    }

}
@ -0,0 +1,32 @@
|
||||
package com.nisemoe.nise.rss
|
||||
|
||||
import com.nisemoe.nise.scheduler.GlobalCache
|
||||
import org.springframework.http.MediaType
|
||||
import org.springframework.http.ResponseEntity
|
||||
import org.springframework.web.bind.annotation.GetMapping
|
||||
import org.springframework.web.bind.annotation.RestController
|
||||
import java.time.LocalDateTime
|
||||
import java.time.OffsetDateTime
|
||||
|
||||
@RestController
|
||||
class RssFeedController(
|
||||
private val globalCache: GlobalCache
|
||||
) {
|
||||
|
||||
data class IntermeriaryFeedItem(
|
||||
val title: String,
|
||||
val guid: String,
|
||||
val link: String,
|
||||
val description: String,
|
||||
val pubDate: OffsetDateTime
|
||||
)
|
||||
|
||||
@GetMapping(path = ["rss", "rss.xml"], produces = [MediaType.APPLICATION_RSS_XML_VALUE])
|
||||
fun getRssFeed(): ResponseEntity<RssFeed> {
|
||||
val feed = this.globalCache.rssFeed
|
||||
?: return ResponseEntity.status(503).build()
|
||||
|
||||
return ResponseEntity.ok(feed)
|
||||
}
|
||||
|
||||
}
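The controller above never builds the feed on demand: it answers from the scheduler-populated cache and returns 503 Service Unavailable while the cache is still empty. A minimal sketch of that cache-or-503 pattern (`FeedResponse` and `serveFeed` are illustrative names, not types from the codebase):

```kotlin
// Illustrative stand-in for ResponseEntity: just a status and an optional body.
data class FeedResponse(val status: Int, val body: String?)

// Serve the cached feed if present, otherwise signal 503 Service Unavailable.
fun serveFeed(cachedFeed: String?): FeedResponse =
    cachedFeed?.let { FeedResponse(200, it) } ?: FeedResponse(503, null)
```

This keeps request latency flat (no feed generation on the request path) at the cost of a brief unavailability window right after startup, before the first scheduled refresh.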
@ -0,0 +1,50 @@
package com.nisemoe.nise.rss

import com.fasterxml.jackson.dataformat.xml.annotation.JacksonXmlElementWrapper
import com.fasterxml.jackson.dataformat.xml.annotation.JacksonXmlProperty
import com.fasterxml.jackson.dataformat.xml.annotation.JacksonXmlRootElement
import kotlinx.serialization.Serializable

@JacksonXmlRootElement(localName = "rss")
@Serializable
data class RssFeed(
    @JacksonXmlProperty(isAttribute = true)
    val version: String = "2.0",
    val channel: Channel,

    @JacksonXmlProperty(localName = "xmlns:atom", isAttribute = true)
    val atomLink: String = "http://www.w3.org/2005/Atom"
)

@Serializable
data class Channel(
    val title: String,
    val link: String,
    val description: String,
    @JacksonXmlProperty(localName = "lastBuildDate")
    val lastBuildDate: String,
    @JacksonXmlElementWrapper(useWrapping = false)
    val item: List<Item>,
    @JacksonXmlProperty(localName = "atom:link")
    val atomLink: AtomLink
)

@Serializable
data class AtomLink(
    @JacksonXmlProperty(localName = "href", isAttribute = true)
    val href: String,
    @JacksonXmlProperty(localName = "rel", isAttribute = true)
    val rel: String = "self",
    @JacksonXmlProperty(localName = "type", isAttribute = true)
    val type: String = "application/rss+xml"
)

@Serializable
data class Item(
    val title: String,
    val guid: String,
    val link: String,
    val description: String,
    @JacksonXmlProperty(localName = "pubDate")
    val pubDate: String
)
128
nise-backend/src/main/kotlin/com/nisemoe/nise/rss/RssService.kt
Normal file
@ -0,0 +1,128 @@
package com.nisemoe.nise.rss

import com.nisemoe.generated.tables.references.BEATMAPS
import com.nisemoe.generated.tables.references.SCORES
import com.nisemoe.generated.tables.references.SCORES_SIMILARITY
import com.nisemoe.generated.tables.references.USERS
import com.nisemoe.nise.database.ScoreService
import org.jooq.DSLContext
import org.springframework.stereotype.Service
import java.time.ZoneOffset
import java.time.format.DateTimeFormatter
import java.util.*

@Service
class RssService(
    private val dslContext: DSLContext,
    private val scoreService: ScoreService
) {

    fun generateFeed(): RssFeed {
        val items = mutableListOf<RssFeedController.IntermeriaryFeedItem>()
        addSuspiciousScores(items)
        addStolenReplays(items)

        // Sort by date DESC and keep only the newest 50 items.
        items.sortByDescending { it.pubDate }
        val limitedItems = items.take(50)

        val channel = Channel(
            title = "nise.moe's feed and sneed",
            link = "https://nise.moe/rss",
            description = "Feed of *sus* scores for osu!std - /nise.moe/",
            lastBuildDate = Date().toInstant().atZone(ZoneOffset.UTC).format(DateTimeFormatter.RFC_1123_DATE_TIME),
            item = limitedItems.map {
                Item(
                    title = it.title,
                    guid = it.guid,
                    link = it.link,
                    description = it.description,
                    pubDate = it.pubDate.format(DateTimeFormatter.RFC_1123_DATE_TIME)
                )
            },
            atomLink = AtomLink(href = "https://nise.moe/rss.xml")
        )

        return RssFeed(channel = channel)
    }

    private fun addStolenReplays(items: MutableList<RssFeedController.IntermeriaryFeedItem>) {
        val replays = dslContext
            .select()
            .from(SCORES_SIMILARITY)
            .join(ScoreService.osuScoreAlias1).on(ScoreService.osuScoreAlias1.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_1))
            .join(ScoreService.osuUserAlias1).on(ScoreService.osuScoreAlias1.USER_ID.eq(ScoreService.osuUserAlias1.USER_ID))
            .leftJoin(ScoreService.osuScoreAlias2).on(ScoreService.osuScoreAlias2.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_2))
            .leftJoin(ScoreService.osuUserAlias2).on(ScoreService.osuScoreAlias2.USER_ID.eq(ScoreService.osuUserAlias2.USER_ID))
            .join(BEATMAPS).on(BEATMAPS.BEATMAP_ID.eq(ScoreService.osuScoreAlias1.BEATMAP_ID))
            .where(SCORES_SIMILARITY.SIMILARITY.lt(10.0))
            .and(ScoreService.osuScoreAlias1.IS_BANNED.eq(false))
            .and(ScoreService.osuScoreAlias2.IS_BANNED.eq(false))
            .orderBy(SCORES_SIMILARITY.CREATED_AT.desc())
            .limit(25)
            .fetch()

        for (result in replays) {
            val score1 = result.into(ScoreService.osuScoreAlias1)
            val user1 = result.into(ScoreService.osuUserAlias1)
            val score2 = result.into(ScoreService.osuScoreAlias2)
            val user2 = result.into(ScoreService.osuUserAlias2)
            val beatmap = result.into(BEATMAPS)

            if (score1.addedAt != null) {
                val item = RssFeedController.IntermeriaryFeedItem(
                    title = "Possible stolen replay",
                    guid = "https://nise.moe/p/${score1.replayId}/${score2.replayId}",
                    link = "https://nise.moe/p/${score1.replayId}/${score2.replayId}",
                    description = "Similarity: ${result[SCORES_SIMILARITY.SIMILARITY]}%\n" +
                        "Replay by ${user1.username} on ${beatmap.artist} - ${beatmap.title} [${beatmap.version}] (${beatmap.starRating} stars)\n" +
                        "Replay by ${user2.username} on ${beatmap.artist} - ${beatmap.title} [${beatmap.version}] (${beatmap.starRating} stars)",
                    pubDate = score1.addedAt!!
                )
                items.add(item)
            }
        }
    }

    private fun addSuspiciousScores(items: MutableList<RssFeedController.IntermeriaryFeedItem>) {
        val suspiciousScores = dslContext.select(
            USERS.USERNAME,
            SCORES.REPLAY_ID,
            SCORES.DATE,
            SCORES.ADDED_AT,
            BEATMAPS.ARTIST,
            BEATMAPS.TITLE,
            BEATMAPS.VERSION,
            BEATMAPS.STAR_RATING
        )
            .from(SCORES)
            .join(USERS).on(SCORES.USER_ID.eq(USERS.USER_ID))
            .join(BEATMAPS).on(SCORES.BEATMAP_ID.eq(BEATMAPS.BEATMAP_ID))
            .where(SCORES.UR.lessOrEqual(25.0))
            .and(SCORES.IS_BANNED.eq(false))
            .orderBy(SCORES.ADDED_AT.desc())
            .limit(25)
            .fetch()

        for (result in suspiciousScores) {
            val score = result.into(SCORES)
            val user = result.into(USERS)
            val beatmap = result.into(BEATMAPS)

            if (score.addedAt != null) {
                val item = RssFeedController.IntermeriaryFeedItem(
                    title = "Suspicious score by ${user.username}",
                    guid = "https://nise.moe/s/${score.replayId}",
                    link = "https://nise.moe/s/${score.replayId}",
                    description = "New score by ${user.username} on ${beatmap.artist} - ${beatmap.title} [${beatmap.version}] (${beatmap.starRating} stars)",
                    pubDate = score.addedAt!!
                )
                items.add(item)
            }
        }
    }

}
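The feed assembly above follows a simple "newest-first, capped" shape: sort the collected items by publication date descending, then keep at most 50. A generic sketch of that step (the helper name is illustrative, not from the codebase):

```kotlin
// Sort newest-first and keep at most `limit` entries, as the feed does.
fun <T : Comparable<T>> newestFirst(items: List<T>, limit: Int = 50): List<T> =
    items.sortedDescending().take(limit)
```

`take` never throws when fewer than `limit` items exist, which makes it a safer cap than index-based slicing.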
@ -0,0 +1,162 @@
package com.nisemoe.nise.scheduler

import com.nisemoe.generated.tables.records.ScoresRecord
import com.nisemoe.generated.tables.references.SCORES
import com.nisemoe.generated.tables.references.SCORES_JUDGEMENTS
import com.nisemoe.nise.integrations.CircleguardService
import com.nisemoe.nise.Format.Companion.fromJudgementType
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.channels.Channel
import kotlinx.coroutines.joinAll
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking
import org.jooq.DSLContext
import org.slf4j.LoggerFactory
import org.springframework.beans.factory.annotation.Value
import org.springframework.context.annotation.Profile
import org.springframework.scheduling.annotation.Scheduled
import org.springframework.stereotype.Service

@Profile("old_scores")
@Service
class FixOldScores(
    private val dslContext: DSLContext,
    private val circleguardService: CircleguardService
) {

    @Value("\${OLD_SCORES_WORKERS:4}")
    private var workers: Int = 4

    @Value("\${OLD_SCORES_PAGE_SIZE:5000}")
    private var pageSize: Int = 5000

    val CURRENT_VERSION = 1

    private val logger = LoggerFactory.getLogger(javaClass)

    data class Task(val offset: Int, val limit: Int)

    @Scheduled(fixedDelay = 40000, initialDelay = 0)
    fun fixOldScores() {
        val condition = SCORES.REPLAY.isNotNull.and(SCORES.VERSION.lessThan(CURRENT_VERSION))
        val totalRows = dslContext.fetchCount(SCORES, condition)

        if (totalRows <= 0) {
            this.logger.warn("Fixing old scores but there are none, total rows: $totalRows")
            return
        }

        val tasks = Channel<Task>(Channel.UNLIMITED)
        val readyForWork = Channel<Unit>(Channel.CONFLATED)

        val numTasks = (totalRows + pageSize - 1) / pageSize
        val taskThreshold = workers * 2
        var activeTasks = 0

        runBlocking {
            // Producer: emits page-sized tasks, pausing once the in-flight
            // task count reaches the threshold (simple backpressure).
            launch {
                for (i in 0 until numTasks) {
                    while (activeTasks >= taskThreshold) {
                        readyForWork.receive()
                    }
                    val offset = i * pageSize
                    tasks.send(Task(offset, pageSize))
                    activeTasks++
                }
                tasks.close()
            }

            val workerJobs = List(workers) { workerId ->
                launch(Dispatchers.Default) {
                    for (task in tasks) {
                        val scores = dslContext.selectFrom(SCORES)
                            .where(condition)
                            .orderBy(SCORES.DATE.desc())
                            .limit(task.limit)
                            .offset(task.offset)
                            .fetchInto(ScoresRecord::class.java)

                        scores.forEach { score ->
                            logger.debug("Worker $workerId processing score_id: ${score.id} | Active tasks: $activeTasks")
                            try {
                                processScore(score)
                            } catch (exception: Exception) {
                                logger.error(exception.stackTraceToString())
                            }
                        }
                        activeTasks--
                        readyForWork.send(Unit)
                    }
                }
            }

            workerJobs.joinAll()
        }
    }

    fun processScore(score: ScoresRecord) {

        val processedReplay: CircleguardService.ReplayResponse? = try {
            this.circleguardService.processReplay(
                replayData = score.replay!!.decodeToString(), beatmapId = score.beatmapId!!, mods = score.mods ?: 0
            ).get()
        } catch (e: Exception) {
            this.logger.error("Circleguard failed to process replay with score_id: ${score.id}")
            return
        }

        if (processedReplay?.error_skewness == null || processedReplay.judgements.isEmpty()) {
            this.logger.error("Circleguard returned null and failed to process replay with score_id: ${score.id}")
            return
        }

        val scoreId = dslContext.update(SCORES)
            .set(SCORES.UR, processedReplay.ur)
            .set(SCORES.ADJUSTED_UR, processedReplay.adjusted_ur)
            .set(SCORES.FRAMETIME, processedReplay.frametime)
            .set(SCORES.SNAPS, processedReplay.snaps)
            .set(SCORES.MEAN_ERROR, processedReplay.mean_error)
            .set(SCORES.ERROR_VARIANCE, processedReplay.error_variance)
            .set(SCORES.ERROR_STANDARD_DEVIATION, processedReplay.error_standard_deviation)
            .set(SCORES.MINIMUM_ERROR, processedReplay.minimum_error)
            .set(SCORES.MAXIMUM_ERROR, processedReplay.maximum_error)
            .set(SCORES.ERROR_RANGE, processedReplay.error_range)
            .set(SCORES.ERROR_COEFFICIENT_OF_VARIATION, processedReplay.error_coefficient_of_variation)
            .set(SCORES.ERROR_KURTOSIS, processedReplay.error_kurtosis)
            .set(SCORES.ERROR_SKEWNESS, processedReplay.error_skewness)
            .set(SCORES.EDGE_HITS, processedReplay.edge_hits)
            .where(SCORES.REPLAY_ID.eq(score.replayId))
            .returningResult(SCORES.ID)
            .fetchOne()?.getValue(SCORES.ID)

        if (scoreId == null) {
            this.logger.debug("Weird, the score update did not return an ID.")
            return
        }

        dslContext.update(SCORES)
            .set(SCORES.VERSION, CURRENT_VERSION)
            .where(SCORES.ID.eq(scoreId))
            .execute()

        val judgementsExist = dslContext.fetchExists(SCORES_JUDGEMENTS, SCORES_JUDGEMENTS.SCORE_ID.eq(scoreId))
        if (!judgementsExist) {
            for (judgement in processedReplay.judgements) {
                dslContext.insertInto(SCORES_JUDGEMENTS)
                    .set(SCORES_JUDGEMENTS.TIME, judgement.time)
                    .set(SCORES_JUDGEMENTS.X, judgement.x)
                    .set(SCORES_JUDGEMENTS.Y, judgement.y)
                    .set(SCORES_JUDGEMENTS.TYPE, fromJudgementType(judgement.type))
                    .set(SCORES_JUDGEMENTS.DISTANCE_EDGE, judgement.distance_edge)
                    .set(SCORES_JUDGEMENTS.DISTANCE_CENTER, judgement.distance_center)
                    .set(SCORES_JUDGEMENTS.ERROR, judgement.error)
                    .set(SCORES_JUDGEMENTS.SCORE_ID, scoreId)
                    .execute()
            }
        }
    }

}
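The paging logic above computes the task count as `(totalRows + pageSize - 1) / pageSize`, the standard integer ceiling division: a partial last page still gets its own task, with no floating-point rounding involved. A minimal sketch:

```kotlin
// Ceiling division as used by the paging logic: rounds up in pure integer
// math, so 5001 rows with a page size of 5000 yield 2 tasks, not 1.
fun numTasks(totalRows: Int, pageSize: Int): Int =
    (totalRows + pageSize - 1) / pageSize
```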
@ -0,0 +1,48 @@
package com.nisemoe.nise.scheduler

import com.nisemoe.nise.SimilarReplayEntry
import com.nisemoe.nise.Statistics
import com.nisemoe.nise.SuspiciousScoreEntry
import com.nisemoe.nise.database.ScoreService
import com.nisemoe.nise.database.StatisticsService
import com.nisemoe.nise.rss.RssFeed
import com.nisemoe.nise.rss.RssService
import kotlinx.coroutines.async
import kotlinx.coroutines.runBlocking
import org.slf4j.LoggerFactory
import org.springframework.scheduling.annotation.Scheduled
import org.springframework.stereotype.Service

@Service
class GlobalCache(
    private val scoreService: ScoreService,
    private val statisticsService: StatisticsService,
    private val rssService: RssService
) {

    private val logger = LoggerFactory.getLogger(javaClass)

    var similarReplays: List<SimilarReplayEntry>? = null
    var suspiciousScores: List<SuspiciousScoreEntry>? = null
    var statistics: Statistics? = null
    var rssFeed: RssFeed? = null

    // 20 minutes in ms = 1200000
    @Scheduled(fixedDelay = 1200000, initialDelay = 0)
    fun updateCaches() {
        logger.info("Updating the cache!")

        runBlocking {
            // Refresh all four caches concurrently, then publish the results.
            val rssFeedDeferred = async { rssService.generateFeed() }
            val statisticsDeferred = async { statisticsService.getStatistics() }
            val similarReplaysDeferred = async { scoreService.getSimilarReplays() }
            val suspiciousScoresDeferred = async { scoreService.getSuspiciousScores() }

            rssFeed = rssFeedDeferred.await()
            statistics = statisticsDeferred.await()
            similarReplays = similarReplaysDeferred.await()
            suspiciousScores = suspiciousScoresDeferred.await()
        }
    }

}
@ -0,0 +1,596 @@
package com.nisemoe.nise.scheduler

import com.nisemoe.generated.tables.records.ScoresRecord
import com.nisemoe.generated.tables.references.*
import com.nisemoe.konata.Replay
import com.nisemoe.konata.ReplaySetComparison
import com.nisemoe.konata.compareReplaySet
import com.nisemoe.nise.Format.Companion.fromJudgementType
import com.nisemoe.nise.database.ScoreService
import com.nisemoe.nise.database.UserService
import com.nisemoe.nise.integrations.CircleguardService
import com.nisemoe.nise.integrations.DiscordEmbed
import com.nisemoe.nise.integrations.DiscordService
import com.nisemoe.nise.osu.Mod
import com.nisemoe.nise.osu.OsuApi
import com.nisemoe.nise.osu.OsuApiModels
import com.nisemoe.nise.service.CacheService
import com.nisemoe.nise.service.UpdateUserQueueService
import kotlinx.serialization.Serializable
import org.jooq.DSLContext
import org.jooq.Query
import org.slf4j.LoggerFactory
import org.springframework.beans.factory.InitializingBean
import org.springframework.beans.factory.annotation.Value
import org.springframework.context.annotation.Profile
import org.springframework.http.MediaType
import org.springframework.http.ResponseEntity
import org.springframework.messaging.simp.SimpMessagingTemplate
import org.springframework.scheduling.annotation.Scheduled
import org.springframework.stereotype.Service
import org.springframework.util.StopWatch
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController
import java.time.LocalDateTime
import java.time.OffsetDateTime

@Service
@RestController
@Profile("updater")
class ImportScores(
    private val dslContext: DSLContext,
    private val userService: UserService,
    private val osuApi: OsuApi,
    private val cacheService: CacheService,
    private val discordService: DiscordService,
    private val scoreService: ScoreService,
    private val updateUserQueueService: UpdateUserQueueService,
    private val circleguardService: CircleguardService,
    private val messagingTemplate: SimpMessagingTemplate
) : InitializingBean {

    private val userToUpdateBucket = mutableListOf<Long>()

    override fun afterPropertiesSet() {
        try {
            val userBucketCached = cacheService.getVariable("userToUpdateBucket", List::class.java)
            if (userBucketCached != null) {
                @Suppress("UNCHECKED_CAST")
                userToUpdateBucket.addAll(userBucketCached as List<Long>)
            }
        } catch (e: Exception) {
            logger.error("Failed to fetch userToUpdateBucket from cache.")
            this.cacheService.deleteVariable("userToUpdateBucket")
        }
    }

    val CURRENT_VERSION = 1

    @Value("\${WEBHOOK_URL}")
    private lateinit var webhookUrl: String

    private val logger = LoggerFactory.getLogger(javaClass)

    private final val sleepTimeInMs = 2000L

    private final val UPDATE_USER_EVERY_DAYS = 7L
    private final val UPDATE_BANNED_USERS_EVERY_DAYS = 3L

    data class UpdaterStatistics(
        var currentBeatmapsetPage: Int = 0,

        var currentScore: Int = 0,
        var totalScores: Int = 0,

        var beatmapsSkippedBecauseTooRecent: Int = 0,
        var beatmapsAddedToDatabase: Int = 0,
        var usersAddedToDatabase: Int = 0,

        var scoresSkippedBecauseAlreadyExists: Int = 0,
        var scoresAddedToDatabase: Int = 0,
        var scoresWithReplayAndAnalyzed: Int = 0,

        var rateLimitTotal: Long = 0,
        var rateLimitRemaining: Long = 0,
    ) {

        fun toMarkdown(): String {
            return """
                |**Updater Statistics:**
                |- Current Beatmapset Page: $currentBeatmapsetPage
                |- Beatmaps Skipped Because Too Recent: $beatmapsSkippedBecauseTooRecent
                |- Beatmaps Added To Database: $beatmapsAddedToDatabase
                |- Users Added To Database: $usersAddedToDatabase
                |- Scores Skipped Because Already Exists: $scoresSkippedBecauseAlreadyExists
                |- Scores Added To Database: $scoresAddedToDatabase
                |- Scores With Replay And Analyzed: $scoresWithReplayAndAnalyzed
            """.trimMargin()
        }

    }

    private var statistics: UpdaterStatistics = UpdaterStatistics()

    @GetMapping("status", produces = [MediaType.APPLICATION_JSON_VALUE])
    fun getStatus(): ResponseEntity<UpdaterStatistics> {
        this.statistics.rateLimitTotal = this.osuApi.rateLimitTotal
        this.statistics.rateLimitRemaining = this.osuApi.rateLimitRemaining
        return ResponseEntity.ok(this.statistics)
    }

    // Restart a minute after it's completed.
    @Scheduled(fixedDelay = 60000, initialDelay = 0)
    fun updateStuffScheduler() {
        this.statistics = UpdaterStatistics()
        try {
            this.doUpdateStuff()
        } catch (exception: Exception) {
            val errorEmbed = DiscordEmbed(
                title = "Exception occurred",
                description = exception.stackTraceToString()
            )
            this.discordService.sendEmbeds(
                this.webhookUrl,
                listOf(errorEmbed)
            )
            this.logger.error(exception.stackTraceToString())
        } finally {
            val statisticsEmbed = DiscordEmbed(
                title = "Statistics",
                description = this.statistics.toMarkdown()
            )
            this.discordService.sendEmbeds(
                this.webhookUrl,
                listOf(statisticsEmbed)
            )
        }
    }

    fun updateScoreWeaving(beatmap: OsuApiModels.BeatmapCompact, score: OsuApiModels.Score) {
        // Check if we have any requests to update a specific user
        val queue = this.updateUserQueueService.getQueue()
        if (queue.isNotEmpty()) {
            this.logger.info("Processing ${queue.size} users from the queue.")
        }

        for (userId in queue) {
            val topUserScores = this.osuApi.getTopUserScores(userId = userId)
            Thread.sleep(this.sleepTimeInMs)
            if (topUserScores != null) {

                val userExists = dslContext.fetchExists(USERS, USERS.USER_ID.eq(userId), USERS.SYS_LAST_UPDATE.greaterOrEqual(LocalDateTime.now().minusDays(UPDATE_USER_EVERY_DAYS)))
                if (!userExists) {
                    val apiUser = this.osuApi.getUserProfile(userId = userId.toString(), mode = "osu", key = "id")
                    if (apiUser != null) {
                        this.userService.insertApiUser(apiUser)
                        this.statistics.usersAddedToDatabase++
                    } else {
                        this.logger.error("Failed to fetch user with id = $userId")
                    }
                }

                for (topScore in topUserScores) {
                    if (topScore.beatmap != null && topScore.beatmapset != null) {
                        val beatmapExists = dslContext.fetchExists(BEATMAPS, BEATMAPS.BEATMAP_ID.eq(topScore.beatmap.id))
                        if (!beatmapExists) {
                            dslContext.insertInto(BEATMAPS)
                                .set(BEATMAPS.BEATMAP_ID, topScore.beatmap.id)
                                .set(BEATMAPS.BEATMAPSET_ID, topScore.beatmapset.id)
                                .set(BEATMAPS.STAR_RATING, topScore.beatmap.difficulty_rating)
                                .set(BEATMAPS.VERSION, topScore.beatmap.version)
                                .set(BEATMAPS.ARTIST, topScore.beatmapset.artist)
                                .set(BEATMAPS.SOURCE, topScore.beatmapset.source)
                                .set(BEATMAPS.TITLE, topScore.beatmapset.title)
                                .set(BEATMAPS.CREATOR, topScore.beatmapset.creator)
                                .execute()
                            this.statistics.beatmapsAddedToDatabase++
                        }

                        this.insertAndProcessNewScore(topScore.beatmap.id, topScore)
                    }
                }
            }
            this.updateUserQueueService.setUserAsProcessed(userId)
        }

        // Check if we need to check the banned users
        val lastBannedUserCheck = this.cacheService.getVariable("lastBannedUserCheck", LocalDateTime::class.java)
        if (lastBannedUserCheck == null || lastBannedUserCheck.isBefore(LocalDateTime.now().minusDays(UPDATE_BANNED_USERS_EVERY_DAYS))) {
            this.logger.info("Checking for banned users.")

            val suspiciousScores = dslContext.select(SCORES.USER_ID)
                .from(SCORES)
                .where(SCORES.UR.lessOrEqual(25.0).and(SCORES.IS_BANNED.isFalse))
                .groupBy(SCORES.USER_ID)
                .fetchInto(Long::class.java)

            val stolenReplays = dslContext
                .select(ScoreService.osuScoreAlias1.USER_ID, ScoreService.osuScoreAlias2.USER_ID)
                .from(SCORES_SIMILARITY)
                .join(ScoreService.osuScoreAlias1).on(ScoreService.osuScoreAlias1.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_1))
                .join(ScoreService.osuUserAlias1).on(ScoreService.osuScoreAlias1.USER_ID.eq(ScoreService.osuUserAlias1.USER_ID))
                .leftJoin(ScoreService.osuScoreAlias2).on(ScoreService.osuScoreAlias2.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_2))
                .leftJoin(ScoreService.osuUserAlias2).on(ScoreService.osuScoreAlias2.USER_ID.eq(ScoreService.osuUserAlias2.USER_ID))
                .join(BEATMAPS).on(BEATMAPS.BEATMAP_ID.eq(ScoreService.osuScoreAlias1.BEATMAP_ID))
                .where(SCORES_SIMILARITY.SIMILARITY.lt(10.0))
                .and(ScoreService.osuScoreAlias1.IS_BANNED.eq(false))
                .and(ScoreService.osuScoreAlias2.IS_BANNED.eq(false))
                .groupBy(ScoreService.osuScoreAlias1.USER_ID, ScoreService.osuScoreAlias2.USER_ID)
                .fetch()

            // Sum lists and remove duplicate user ids
            val suspiciousUserIds = (suspiciousScores + stolenReplays.map {
                it.get(
                    ScoreService.osuScoreAlias1.USER_ID,
                    Long::class.java
                )
            } + stolenReplays.map { it.get(ScoreService.osuScoreAlias2.USER_ID, Long::class.java) }).distinct()

            for (userId in suspiciousUserIds) {
                val isBanned = this.osuApi.checkIfUserBanned(userId)
                if (isBanned == true) {
                    dslContext.update(SCORES)
                        .set(SCORES.IS_BANNED, true)
                        .where(SCORES.USER_ID.eq(userId))
                        .execute()
                }
                Thread.sleep(this.sleepTimeInMs)
            }

            this.cacheService.setVariable("lastBannedUserCheck", LocalDateTime.now())
        }

        // Nothing? Good, process the score as usual.
        this.insertAndProcessNewScore(beatmap.id, score)
    }

    fun doUpdateStuff() {
        var cursor: OsuApiModels.BeatmapsetSearchResultCursor? = null

        do {
            val searchResults = this.osuApi.searchBeatmapsets(cursor = cursor)
            this.statistics.currentBeatmapsetPage++
            Thread.sleep(this.sleepTimeInMs)

            if (searchResults == null) {
                this.logger.error("Failed to fetch beatmapsets. Aborting this run.")
                Thread.sleep(this.sleepTimeInMs * 2)
                return
            }

            this.logger.debug("Fetched ${searchResults.beatmapsets.size} beatmapsets.")
            this.logger.debug("Cursor is ${searchResults.cursor_string}")

            for (beatmapset in searchResults.beatmapsets) {

                if (beatmapset.beatmaps.isNullOrEmpty()) {
                    this.logger.error("Beatmapset has no beatmaps.")
                    continue
                }

                for (beatmap in beatmapset.beatmaps.filter { it.mode == "osu" }) {
                    // Check if it exists in database, and if not, insert.
                    val beatmapExists = dslContext.fetchExists(BEATMAPS, BEATMAPS.BEATMAP_ID.eq(beatmap.id))

                    if (beatmapExists) {

                        val threeDaysAgo = LocalDateTime
                            .now()
                            .minusDays(3)

                        if (dslContext.fetchExists(BEATMAPS, BEATMAPS.BEATMAP_ID.eq(beatmap.id).and(BEATMAPS.SYS_LAST_UPDATE.greaterOrEqual(threeDaysAgo)))) {
                            this.statistics.beatmapsSkippedBecauseTooRecent++
                            this.logger.debug("Skipping beatmap since it's been updated in the last few days.")
                            this.logger.debug("Beatmap ID: ${beatmap.id} | Beatmapset ID: ${beatmapset.id}")
                            continue
                        }
                    }

                    if (!beatmapExists) {
                        dslContext.insertInto(BEATMAPS)
                            .set(BEATMAPS.BEATMAP_ID, beatmap.id)
                            .set(BEATMAPS.BEATMAPSET_ID, beatmapset.id.toInt())
                            .set(BEATMAPS.STAR_RATING, beatmap.difficulty_rating)
                            .set(BEATMAPS.VERSION, beatmap.version)
                            .set(BEATMAPS.ARTIST, beatmapset.artist)
                            .set(BEATMAPS.SOURCE, beatmapset.source)
                            .set(BEATMAPS.TITLE, beatmapset.title)
                            .set(BEATMAPS.CREATOR, beatmapset.creator)
                            .execute()
                        this.statistics.beatmapsAddedToDatabase++
                    }

                    val beatmapScores = this.osuApi.getTopBeatmapScores(beatmapId = beatmap.id)
                    Thread.sleep(this.sleepTimeInMs)

                    if (beatmapScores == null) {
                        this.logger.error("Failed to fetch beatmap scores for beatmapId = ${beatmap.id}")
                        Thread.sleep(this.sleepTimeInMs * 2)
                        continue
                    }

                    if (beatmapScores.scores.isEmpty()) {
                        this.logger.error("Beatmap has no scores. Deleting for good measure...")
                        dslContext.deleteFrom(BEATMAPS)
                            .where(BEATMAPS.BEATMAP_ID.eq(beatmap.id))
                            .execute()
                        continue
                    }

                    this.statistics.totalScores = beatmapScores.scores.size

                    for (score in beatmapScores.scores) {
                        this.statistics.currentScore++
                        this.logger.debug("Processing score: ${this.statistics.currentScore}/${beatmapScores.scores.size}")

                        val userExists = dslContext.fetchExists(USERS, USERS.USER_ID.eq(score.user_id), USERS.SYS_LAST_UPDATE.greaterOrEqual(LocalDateTime.now().minusDays(UPDATE_USER_EVERY_DAYS)))
                        if (!userExists) {
                            this.userToUpdateBucket.add(score.user_id)

                            if (this.userToUpdateBucket.size >= 50) {
                                val usersBucket = this.osuApi.getUsersBatch(this.userToUpdateBucket.toList())
                                Thread.sleep(this.sleepTimeInMs)
                                if (usersBucket == null) {
                                    this.logger.error("Failed to fetch users batch.")
                                    continue
                                } else {
                                    for (user in usersBucket.users) {
                                        this.statistics.usersAddedToDatabase++
                                        this.userService.insertApiUser(user)
                                    }
                                }

                                this.userToUpdateBucket.clear()
                            }

                            this.cacheService.setVariable("userToUpdateBucket", this.userToUpdateBucket)
                        }

                        val scoreExists = dslContext.fetchExists(SCORES, SCORES.REPLAY_ID.eq(score.best_id))
                        if (scoreExists) {
                            this.statistics.scoresSkippedBecauseAlreadyExists++
                            this.logger.debug("Cool! Score exists, skipping to next.")
                            continue
                        }

                        this.updateScoreWeaving(beatmap, score)
                    }

                    checkReplaySimilarity(beatmap.id)

                    dslContext.update(BEATMAPS)
                        .set(BEATMAPS.SYS_LAST_UPDATE, LocalDateTime.now())
                        .where(BEATMAPS.BEATMAP_ID.eq(beatmap.id))
                        .execute()
                }
            }

            cursor = searchResults.cursor
        } while (cursor != null)
    }

    @Serializable
    data class ReplayDto(
        val replayId: Long,
        val replayMods: Int,
        val replayData: String
    )

    val sw = StopWatch()

    private fun checkReplaySimilarity(beatmapId: Int) {
        val allReplays = dslContext.select(
            SCORES.REPLAY_ID.`as`("replayId"),
            SCORES.MODS.`as`("replayMods"),
            SCORES.REPLAY.`as`("replayData")
        )
            .from(SCORES)
            .where(SCORES.BEATMAP_ID.eq(beatmapId))
            .and(SCORES.REPLAY.isNotNull)
            .and(SCORES.IS_BANNED.isFalse)
            .fetchInto(ReplayDto::class.java)

        sw.start("konata")
        val replaysForKonata = allReplays.map {
            Replay(string = it.replayData, id = it.replayId, mods = it.replayMods)
        }.toTypedArray()
        val konataResults: List<ReplaySetComparison> = compareReplaySet(replaysForKonata)

        sw.stop()

        this.logger.info("Obtained result from Konata in ${sw.lastTaskInfo().timeSeconds}s for ${allReplays.size} replays.")
        this.logger.info("Pairs/s = ${konataResults.size / sw.lastTaskInfo().timeSeconds}")

        val queries = mutableListOf<Query>()

        for (similarityEntry in konataResults) {
            if (similarityEntry.similarity < 10 || similarityEntry.correlation > 0.997) {

                var cgSimilarity: Double? = null
                var cgCorrelation: Double? = null

                try {

                    val replayDto1 = ReplayDto(
                        replayId = similarityEntry.replay1Id,
                        replayMods = similarityEntry.replay1Mods,
                        replayData = allReplays.find { it.replayId == similarityEntry.replay1Id }!!.replayData
                    )

                    val replayDto2 = ReplayDto(
                        replayId = similarityEntry.replay2Id,
                        replayMods = similarityEntry.replay2Mods,
                        replayData = allReplays.find { it.replayId == similarityEntry.replay2Id }!!.replayData
                    )

                    val cgResult = circleguardService.processSimilarity(listOf(replayDto1, replayDto2))
                        .get()

                    cgSimilarity = cgResult.result.first().similarity
                    cgCorrelation = cgResult.result.first().correlation
                } catch (exception: Exception) {
                    this.logger.error("Failed to process similarity with circleguard.")
                    this.logger.error(exception.stackTraceToString())
                }

                queries.add(dslContext
                    .insertInto(SCORES_SIMILARITY)
                    .set(SCORES_SIMILARITY.BEATMAP_ID, beatmapId)
                    .set(SCORES_SIMILARITY.REPLAY_ID_1, similarityEntry.replay1Id)
                    .set(SCORES_SIMILARITY.REPLAY_ID_2, similarityEntry.replay2Id)
                    .set(SCORES_SIMILARITY.SIMILARITY, similarityEntry.similarity)
                    .set(SCORES_SIMILARITY.CORRELATION, similarityEntry.correlation)
                    .set(SCORES_SIMILARITY.CREATED_AT, LocalDateTime.now())
                    .set(SCORES_SIMILARITY.CG_SIMILARITY, cgSimilarity)
                    .set(SCORES_SIMILARITY.CG_CORRELATION, cgCorrelation)
                    .onDuplicateKeyIgnore()
                )

                // We insert the userId of the newest replay in the queue for check
                val scoreForReplay1 = dslContext.select(SCORES.USER_ID, SCORES.DATE)
                    .from(SCORES)
                    .where(SCORES.REPLAY_ID.eq(similarityEntry.replay1Id))
                    .fetchOneInto(ScoresRecord::class.java)

                val scoreForReplay2 = dslContext.select(SCORES.USER_ID, SCORES.DATE)
                    .from(SCORES)
                    .where(SCORES.REPLAY_ID.eq(similarityEntry.replay2Id))
                    .fetchOneInto(ScoresRecord::class.java)

                if (scoreForReplay1 != null && scoreForReplay2 != null) {
                    if (scoreForReplay1.date != null && scoreForReplay2.date != null) {
                        if (scoreForReplay1.date!!.isAfter(scoreForReplay2.date)) {
                            this.updateUserQueueService.insertUser(scoreForReplay1.userId!!)
                        } else {
|
||||
this.updateUserQueueService.insertUser(scoreForReplay2.userId!!)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
dslContext.batch(queries).execute()
|
||||
}
|
||||
|
||||
    private fun insertAndProcessNewScore(beatmapId: Int, score: OsuApiModels.Score) {
        // Check if the score is already in the database
        val scoreExists = dslContext.fetchExists(SCORES, SCORES.REPLAY_ID.eq(score.best_id))
        if (scoreExists) {
            this.statistics.scoresSkippedBecauseAlreadyExists++
            this.logger.debug("Cool! Score exists, skipping to next.")
            return
        }

        dslContext.insertInto(SCORES)
            .set(SCORES.BEATMAP_ID, beatmapId)
            .set(SCORES.COUNT_300, score.statistics.count_300)
            .set(SCORES.COUNT_100, score.statistics.count_100)
            .set(SCORES.COUNT_50, score.statistics.count_50)
            .set(SCORES.COUNT_MISS, score.statistics.count_miss)
            .set(SCORES.DATE, OffsetDateTime.parse(score.created_at).toLocalDateTime())
            .set(SCORES.MAX_COMBO, score.max_combo)
            .set(SCORES.RANK, "Grade.${score.rank.name}")
            .set(SCORES.MODS, Mod.combineModStrings(score.mods))
            .set(SCORES.PERFECT, score.perfect)
            .set(SCORES.PP, score.pp)
            .set(SCORES.SCORE, score.score)
            .set(SCORES.REPLAY_AVAILABLE, score.replay)
            .set(SCORES.REPLAY_ID, score.best_id)
            .set(SCORES.USER_ID, score.user_id)
            .set(SCORES.VERSION, CURRENT_VERSION)
            .execute()

        this.statistics.scoresAddedToDatabase++

        if (score.best_id == null || !score.replay) {
            this.logger.debug("No replay available, skipping score.")
            return
        }

        val scoreReplay = this.osuApi.getReplay(scoreId = score.best_id)

        if (scoreReplay == null || scoreReplay.content.isBlank()) {
            this.logger.error("Failed to fetch replay with score_id = ${score.best_id}")
            return
        }

        // The replay endpoint is limited to 10 requests per minute according to
        // https://github.com/ppy/osu-api/wiki#get-replay-data
        // So, we sleep for 6 seconds.
        Thread.sleep(6000)

        // Calculate UR
        val processedReplay: CircleguardService.ReplayResponse? = try {
            this.circleguardService.processReplay(
                replayData = scoreReplay.content, beatmapId = beatmapId, mods = Mod.combineModStrings(score.mods)
            ).get()
        } catch (e: Exception) {
            this.logger.error("Circleguard failed to process replay with score_id: ${score.id}")
            return
        }

        if (processedReplay?.error_skewness == null || processedReplay.judgements.isEmpty()) {
            this.logger.error("Circleguard returned null and failed to process replay with score_id: ${score.id}")
            return
        }

        val scoreId = dslContext.update(SCORES)
            .set(SCORES.REPLAY, scoreReplay.content.toByteArray())
            .set(SCORES.UR, processedReplay.ur)
            .set(SCORES.ADJUSTED_UR, processedReplay.adjusted_ur)
            .set(SCORES.FRAMETIME, processedReplay.frametime)
            .set(SCORES.MEAN_ERROR, processedReplay.mean_error)
            .set(SCORES.ERROR_VARIANCE, processedReplay.error_variance)
            .set(SCORES.ERROR_STANDARD_DEVIATION, processedReplay.error_standard_deviation)
            .set(SCORES.MINIMUM_ERROR, processedReplay.minimum_error)
            .set(SCORES.MAXIMUM_ERROR, processedReplay.maximum_error)
            .set(SCORES.ERROR_RANGE, processedReplay.error_range)
            .set(SCORES.ERROR_COEFFICIENT_OF_VARIATION, processedReplay.error_coefficient_of_variation)
            .set(SCORES.ERROR_KURTOSIS, processedReplay.error_kurtosis)
            .set(SCORES.ERROR_SKEWNESS, processedReplay.error_skewness)
            .set(SCORES.SNAPS, processedReplay.snaps)
            .set(SCORES.EDGE_HITS, processedReplay.edge_hits)
            .where(SCORES.REPLAY_ID.eq(score.best_id))
            .returningResult(SCORES.ID)
            .fetchOne()?.getValue(SCORES.ID)

        val replayData = this.scoreService.getReplayData(replayId = score.best_id)
        if (replayData == null) {
            this.logger.error("Weird, failed to fetch score with replay_id = ${score.best_id}")
        } else {
            messagingTemplate.convertAndSend(
                "/topic/live-scores/",
                replayData
            )
        }

        this.statistics.scoresWithReplayAndAnalyzed++

        if (scoreId == null) {
            this.logger.error("Weird, failed to insert score into scores table. At least, it did not return an ID.")
            return
        }

        if (processedReplay.ur != null && processedReplay.ur < 25.0) {
            this.logger.info("Inserting user into queue for update: ${score.user_id}")
            this.logger.info("UR: ${processedReplay.ur} on their replay with id = ${score.best_id}")
            this.updateUserQueueService.insertUser(score.user_id)
        }

        for (judgement in processedReplay.judgements) {
            dslContext.insertInto(SCORES_JUDGEMENTS)
                .set(SCORES_JUDGEMENTS.TIME, judgement.time)
                .set(SCORES_JUDGEMENTS.X, judgement.x)
                .set(SCORES_JUDGEMENTS.Y, judgement.y)
                .set(SCORES_JUDGEMENTS.TYPE, fromJudgementType(judgement.type))
                .set(SCORES_JUDGEMENTS.DISTANCE_EDGE, judgement.distance_edge)
                .set(SCORES_JUDGEMENTS.DISTANCE_CENTER, judgement.distance_center)
                .set(SCORES_JUDGEMENTS.ERROR, judgement.error)
                .set(SCORES_JUDGEMENTS.SCORE_ID, scoreId)
                .execute()
        }
    }

}
@ -0,0 +1,223 @@
package com.nisemoe.nise.scheduler

import com.nisemoe.generated.tables.records.ScoresRecord
import com.nisemoe.generated.tables.references.BEATMAPS
import com.nisemoe.generated.tables.references.SCORES
import com.nisemoe.generated.tables.references.SCORES_SIMILARITY
import com.nisemoe.nise.Format
import com.nisemoe.nise.database.ScoreService
import com.nisemoe.nise.integrations.DiscordEmbed
import com.nisemoe.nise.integrations.DiscordService
import com.nisemoe.nise.osu.Mod
import org.jooq.DSLContext
import org.jooq.Record
import org.jooq.impl.DSL
import org.slf4j.LoggerFactory
import org.springframework.beans.factory.annotation.Value
import org.springframework.context.annotation.Profile
import org.springframework.scheduling.annotation.Scheduled
import org.springframework.stereotype.Service
import java.time.LocalDateTime
import kotlin.math.roundToInt

@Profile("discord")
@Service
class SendScoresToDiscord(
    private val dslContext: DSLContext,
    private val scoreService: ScoreService,
    private val discordService: DiscordService
) {

    @Value("\${SCORES_WEBHOOK_URL}")
    private lateinit var webhookUrl: String

    private val logger = LoggerFactory.getLogger(javaClass)

    @Scheduled(fixedDelay = 1200000, initialDelay = 0)
    fun sendScoresToDiscord() {
        dslContext.selectFrom(SCORES)
            .where(SCORES.SENT_DISCORD_NOTIFICATION.isFalse)
            .or(SCORES.SENT_DISCORD_NOTIFICATION.isNull)
            .and(SCORES.ADJUSTED_UR.isNotNull)
            .and(SCORES.ADJUSTED_UR.lessOrEqual(25.0))
            .and(SCORES.IS_BANNED.isFalse)
            .limit(5)
            .stream().use { stream ->
                stream.forEach { score -> processSuspiciousScore(score) }
            }

        dslContext.select(DSL.asterisk())
            .from(SCORES_SIMILARITY)

            .join(ScoreService.osuScoreAlias1).on(ScoreService.osuScoreAlias1.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_1))
            .join(ScoreService.osuUserAlias1).on(ScoreService.osuScoreAlias1.USER_ID.eq(ScoreService.osuUserAlias1.USER_ID))

            .leftJoin(ScoreService.osuScoreAlias2).on(ScoreService.osuScoreAlias2.REPLAY_ID.eq(SCORES_SIMILARITY.REPLAY_ID_2))
            .leftJoin(ScoreService.osuUserAlias2).on(ScoreService.osuScoreAlias2.USER_ID.eq(ScoreService.osuUserAlias2.USER_ID))

            .join(BEATMAPS).on(BEATMAPS.BEATMAP_ID.eq(ScoreService.osuScoreAlias1.BEATMAP_ID))
            .where(SCORES_SIMILARITY.SENT_DISCORD_NOTIFICATION.isFalse)
            .and(ScoreService.osuScoreAlias1.IS_BANNED.eq(false))
            .and(ScoreService.osuScoreAlias2.IS_BANNED.eq(false))
            .or(SCORES_SIMILARITY.SENT_DISCORD_NOTIFICATION.isNull)
            .limit(5)
            .stream().use { stream ->
                stream.forEach { score -> processStolenReplay(score) }
            }
    }

    fun processStolenReplay(scoresSimilarityRecord: Record) {
        val olderReplayDate: LocalDateTime?
        val olderReplayPP: Double?
        val olderReplayId: Long?
        val olderReplayUsername: String?
        val olderReplayUserId: Long?
        val olderReplayMods: Int?

        val newerReplayDate: LocalDateTime?
        val newerReplayPP: Double?
        val newerReplayId: Long?
        val newerReplayUsername: String?
        val newerReplayUserId: Long?
        val newerReplayMods: Int?

        val replay1Date = scoresSimilarityRecord.get(ScoreService.osuScoreAlias1.DATE)
        val replay2Date = scoresSimilarityRecord.get(ScoreService.osuScoreAlias2.DATE)

        if (replay1Date == null || replay2Date == null) {
            logger.error("Failed to fetch replay dates for scores similarity record with id = ${scoresSimilarityRecord.get(SCORES_SIMILARITY.ID)}")
            return
        }

        if (replay1Date.isBefore(replay2Date)) {
            olderReplayId = scoresSimilarityRecord.get(SCORES_SIMILARITY.REPLAY_ID_1)
            olderReplayUsername = scoresSimilarityRecord.get(ScoreService.osuUserAlias1.USERNAME)
            olderReplayUserId = scoresSimilarityRecord.get(ScoreService.osuUserAlias1.USER_ID)
            olderReplayPP = scoresSimilarityRecord.get(ScoreService.osuScoreAlias1.PP)
            olderReplayDate = replay1Date
            olderReplayMods = scoresSimilarityRecord.get(ScoreService.osuScoreAlias1.MODS)

            newerReplayId = scoresSimilarityRecord.get(SCORES_SIMILARITY.REPLAY_ID_2)
            newerReplayUsername = scoresSimilarityRecord.get(ScoreService.osuUserAlias2.USERNAME)
            newerReplayUserId = scoresSimilarityRecord.get(ScoreService.osuUserAlias2.USER_ID)
            newerReplayPP = scoresSimilarityRecord.get(ScoreService.osuScoreAlias2.PP)
            newerReplayDate = replay2Date
            newerReplayMods = scoresSimilarityRecord.get(ScoreService.osuScoreAlias2.MODS)
        } else {
            olderReplayId = scoresSimilarityRecord.get(SCORES_SIMILARITY.REPLAY_ID_2)
            olderReplayUsername = scoresSimilarityRecord.get(ScoreService.osuUserAlias2.USERNAME)
            olderReplayUserId = scoresSimilarityRecord.get(ScoreService.osuUserAlias2.USER_ID)
            olderReplayPP = scoresSimilarityRecord.get(ScoreService.osuScoreAlias2.PP)
            olderReplayDate = replay2Date
            olderReplayMods = scoresSimilarityRecord.get(ScoreService.osuScoreAlias2.MODS)

            newerReplayId = scoresSimilarityRecord.get(SCORES_SIMILARITY.REPLAY_ID_1)
            newerReplayUsername = scoresSimilarityRecord.get(ScoreService.osuUserAlias1.USERNAME)
            newerReplayUserId = scoresSimilarityRecord.get(ScoreService.osuUserAlias1.USER_ID)
            newerReplayPP = scoresSimilarityRecord.get(ScoreService.osuScoreAlias1.PP)
            newerReplayDate = replay1Date
            newerReplayMods = scoresSimilarityRecord.get(ScoreService.osuScoreAlias1.MODS)
        }

        val statisticsEmbed = DiscordEmbed(
            title = "Possible stolen replay",
            description = """
                [Replay 1 Link](https://nise.moe/s/$olderReplayId) | [osu!web](https://osu.ppy.sh/scores/osu/$olderReplayId)
                - Played by: [$olderReplayUsername](https://osu.ppy.sh/users/$olderReplayUserId)
                - Played at: ${Format.formatLocalDateTime(olderReplayDate)}
                - PP: ${olderReplayPP?.roundToInt()}${if (olderReplayMods != null) " (${Mod.print(olderReplayMods)})" else ""}

                [Replay 2 Link](https://nise.moe/s/$newerReplayId) | [osu!web](https://osu.ppy.sh/scores/osu/$newerReplayId)
                - Played by: [$newerReplayUsername](https://osu.ppy.sh/users/$newerReplayUserId)
                - Played at: ${Format.formatLocalDateTime(newerReplayDate)}
                - PP: ${newerReplayPP?.roundToInt()}${if (newerReplayMods != null) " (${Mod.print(newerReplayMods)})" else ""}
            """.trimIndent(),
            url = "https://nise.moe/p/$olderReplayId/$newerReplayId"
        )
        statisticsEmbed.color = 0xB26ECC
        statisticsEmbed.setImageUrl("https://assets.ppy.sh/beatmaps/${scoresSimilarityRecord.get(BEATMAPS.BEATMAPSET_ID)}/covers/cover.jpg")
        statisticsEmbed.setAuthor(
            name = "/nise.moe/'s discord feed", url = "https://nise.moe/", iconUrl = "https://nise.moe/assets/keisatsu-chan.png"
        )

        statisticsEmbed.addEmbed(name = "Similarity", value = String.format("%.4f", scoresSimilarityRecord.get(SCORES_SIMILARITY.SIMILARITY)))
        statisticsEmbed.addEmbed(name = "Correlation", value = String.format("%.4f", scoresSimilarityRecord.get(SCORES_SIMILARITY.CORRELATION)))
        val beatmapString = "${scoresSimilarityRecord.get(BEATMAPS.TITLE)} by ${scoresSimilarityRecord.get(BEATMAPS.ARTIST)} (${
            scoresSimilarityRecord.get(BEATMAPS.VERSION)
        } | ★${String.format("%.2f", scoresSimilarityRecord.get(BEATMAPS.STAR_RATING))})"
        statisticsEmbed.addEmbed(
            name = "Beatmap",
            value = "[$beatmapString](https://osu.ppy.sh/beatmaps/${scoresSimilarityRecord.get(BEATMAPS.BEATMAP_ID)}?mode=osu)"
        )

        val embedResult = this.discordService.sendEmbeds(
            this.webhookUrl,
            listOf(statisticsEmbed)
        )

        if (embedResult) {
            dslContext.update(SCORES_SIMILARITY)
                .set(SCORES_SIMILARITY.SENT_DISCORD_NOTIFICATION, true)
                .where(SCORES_SIMILARITY.ID.eq(scoresSimilarityRecord.get(SCORES_SIMILARITY.ID)))
                .execute()
        } else {
            this.logger.error("Failed to send discord webhook.")
        }

        // Sleep for 30 seconds between webhook sends
        Thread.sleep(30_000)
    }

    fun processSuspiciousScore(score: ScoresRecord) {
        val replayData = this.scoreService.getReplayData(replayId = score.replayId!!)
        if (replayData == null) {
            logger.error("Failed to fetch score with replayId = ${score.replayId}")
            return
        }

        val statisticsEmbed = DiscordEmbed(
            title = "New suspicious score",
            description = "[osu!web](https://osu.ppy.sh/scores/osu/${score.replayId})",
            url = "https://nise.moe/s/" + score.replayId,
        )
        statisticsEmbed.color = 0xDB6D4E
        statisticsEmbed.setImageUrl("https://assets.ppy.sh/beatmaps/${replayData.beatmap_beatmapset_id}/covers/cover.jpg")
        statisticsEmbed.setAuthor(
            name = "/nise.moe/'s discord feed", url = "https://nise.moe/", iconUrl = "https://nise.moe/assets/keisatsu-chan.png"
        )

        statisticsEmbed.addEmbed(name = "Played by", value = "[${replayData.username}](https://osu.ppy.sh/users/${replayData.user_id})")
        statisticsEmbed.addEmbed(name = "Played at", value = replayData.date)
        statisticsEmbed.addEmbed(name = "PP", value = replayData.pp.roundToInt().toString())
        if (replayData.mods.isNotEmpty())
            statisticsEmbed.addEmbed(name = "Mods", value = replayData.mods.joinToString(""))
        statisticsEmbed.addEmbed(name = "Max Combo", value = "${replayData.max_combo}x")
        statisticsEmbed.addEmbed(name = "Accuracy", value = String.format("%.2f", replayData.calculateAccuracy()) + "%")
        statisticsEmbed.addEmbed(name = "cvUR | Adj. cvUR", value = String.format("%.2f", replayData.ur) + " | " + String.format("%.2f", replayData.adjusted_ur))
        val beatmapString = "${replayData.beatmap_title} by ${replayData.beatmap_artist} (${replayData.beatmap_version} | ★${
            String.format(
                "%.2f",
                replayData.beatmap_star_rating
            )
        })"
        statisticsEmbed.addEmbed(name = "Beatmap", value = "[$beatmapString](https://osu.ppy.sh/beatmaps/${replayData.beatmap_id}?mode=osu)")

        val embedResult = this.discordService.sendEmbeds(
            this.webhookUrl,
            listOf(statisticsEmbed)
        )

        if (embedResult) {
            dslContext.update(SCORES)
                .set(SCORES.SENT_DISCORD_NOTIFICATION, true)
                .where(SCORES.ID.eq(score.id))
                .execute()
        } else {
            this.logger.error("Failed to send discord webhook.")
        }

        // Sleep for 30 seconds between webhook sends
        Thread.sleep(30_000)
    }

}
@ -0,0 +1,61 @@
package com.nisemoe.nise.service

import org.slf4j.LoggerFactory
import org.springframework.data.redis.core.RedisTemplate
import org.springframework.stereotype.Service
import java.time.Duration
import java.time.LocalDateTime

@Service
class CacheService(
    private val redisTemplate: RedisTemplate<Any, Any>
) {

    private val logger = LoggerFactory.getLogger(javaClass)

    fun setVariable(key: String, value: Any, ttl: Duration = Duration.ofDays(1)) {
        redisTemplate.opsForValue().set(key, value, ttl)
    }

    fun deleteVariable(key: String) {
        val deleted = redisTemplate.delete(key)
        if (!deleted)
            this.logger.error("Failed to delete key: $key")
    }

    fun <T> getVariable(key: String, returnType: Class<T>): T? {
        val result = try {
            redisTemplate.opsForValue().get(key)
        } catch (e: Exception) {
            this.logger.error("Error while getting variable from cache: $e")
            this.deleteVariable(key)
            null
        }

        if (result == null) {
            this.logger.debug("No value found for key: $key")
            return null
        }

        if (LocalDateTime::class.java.isAssignableFrom(returnType) && result is ArrayList<*>) {
            @Suppress("UNCHECKED_CAST")
            return LocalDateTime.of(
                result[0] as Int,
                result[1] as Int,
                result[2] as Int,
                result[3] as Int,
                result[4] as Int,
                result[5] as Int
            ) as T
        }

        if (returnType.isInstance(result))
            return returnType.cast(result)

        this.logger.error("Failed to deserialize cache entry.")
        this.logger.error("Expected type ${returnType.simpleName}, got ${result::class.simpleName}")
        this.deleteVariable(key)
        return null
    }

}
@ -0,0 +1,44 @@
package com.nisemoe.nise.service

import com.nisemoe.generated.tables.references.UPDATE_USER_QUEUE
import org.jooq.DSLContext
import org.springframework.stereotype.Service
import java.time.LocalDateTime

@Service
class UpdateUserQueueService(
    private val dslContext: DSLContext
) {

    fun getQueue(): List<Long> {
        return dslContext.select(UPDATE_USER_QUEUE.USER_ID)
            .from(UPDATE_USER_QUEUE)
            .where(UPDATE_USER_QUEUE.PROCESSED.isFalse)
            .orderBy(UPDATE_USER_QUEUE.CREATED_AT.asc())
            .fetchInto(Long::class.java)
    }

    fun insertUser(userId: Long) {
        val exists = dslContext.fetchExists(UPDATE_USER_QUEUE,
            UPDATE_USER_QUEUE.USER_ID.eq(userId),
            UPDATE_USER_QUEUE.PROCESSED.isFalse
        )
        if (exists)
            return

        dslContext.insertInto(UPDATE_USER_QUEUE)
            .set(UPDATE_USER_QUEUE.USER_ID, userId)
            .execute()
    }

    fun setUserAsProcessed(userId: Long) {
        dslContext.update(UPDATE_USER_QUEUE)
            .set(UPDATE_USER_QUEUE.PROCESSED, true)
            .set(UPDATE_USER_QUEUE.PROCESSED_AT, LocalDateTime.now())
            .where(UPDATE_USER_QUEUE.USER_ID.eq(userId))
            .and(UPDATE_USER_QUEUE.PROCESSED.isFalse)
            .execute()
    }

}
@ -0,0 +1 @@
logging.level.com.nisemoe=DEBUG
@ -0,0 +1,14 @@
spring.datasource.url=jdbc:postgresql://${POSTGRES_HOST:postgres}:${POSTGRES_PORT:5432}/${POSTGRES_DB:postgres}?currentSchema=public
spring.datasource.username=${POSTGRES_USER:postgres}
spring.datasource.password=${POSTGRES_PASS:postgres}
spring.datasource.driver-class-name=org.postgresql.Driver
spring.datasource.name=HikariPool-PostgreSQL

spring.flyway.enabled=true
spring.flyway.schemas=public

# Batching
spring.datasource.hikari.data-source-properties.prepStmtCacheSize=250
spring.datasource.hikari.data-source-properties.prepStmtCacheSqlLimit=2048
spring.datasource.hikari.data-source-properties.useServerPrepStmts=true
spring.datasource.hikari.data-source-properties.rewriteBatchedStatements=true
18
nise-backend/src/main/resources/application.properties
Normal file
@ -0,0 +1,18 @@
server.port=${SERVER_PORT:8080}

server.compression.enabled=true

server.compression.mime-types=text/html,text/xml,text/plain,text/css,text/javascript,application/javascript,application/json

server.compression.min-response-size=1024

spring.servlet.multipart.enabled=true
spring.servlet.multipart.max-file-size=8MB

server.http2.enabled=true

# Redis
spring.data.redis.host=${REDIS_HOST:redis}
spring.data.redis.port=${REDIS_PORT:6379}
spring.data.redis.repositories.enabled=false
spring.data.redis.database=${REDIS_DB:2}
@ -0,0 +1,185 @@
CREATE TYPE public."judgement_type" AS ENUM (
    '300',
    '100',
    '50',
    'Miss');

CREATE SEQUENCE public.beatmaps_beatmap_id_seq
    INCREMENT BY 1
    MINVALUE 1
    MAXVALUE 2147483647
    START 1
    CACHE 1
    NO CYCLE;

CREATE SEQUENCE public.scores_id_seq
    INCREMENT BY 1
    MINVALUE 1
    MAXVALUE 2147483647
    START 1
    CACHE 1
    NO CYCLE;

CREATE SEQUENCE public.scores_judgements_id_seq
    INCREMENT BY 1
    MINVALUE 1
    MAXVALUE 2147483647
    START 1
    CACHE 1
    NO CYCLE;

CREATE SEQUENCE public.scores_similarity_id_seq
    INCREMENT BY 1
    MINVALUE 1
    MAXVALUE 2147483647
    START 1
    CACHE 1
    NO CYCLE;

CREATE SEQUENCE public.users_user_id_seq
    INCREMENT BY 1
    MINVALUE 1
    MAXVALUE 9223372036854775807
    START 1
    CACHE 1
    NO CYCLE;

CREATE TABLE public.beatmaps
(
    beatmap_id serial4 NOT NULL,
    approach_rate float8 NULL,
    approved varchar NULL,
    approved_date timestamp NULL,
    artist varchar NULL,
    audio_unavailable bool NULL,
    beatmap_hash varchar NULL,
    beatmapset_id int4 NULL,
    bpm varchar NULL,
    circle_size float8 NULL,
    count_hitcircles int4 NULL,
    count_sliders int4 NULL,
    count_spinners int4 NULL,
    creator varchar NULL,
    creator_id int4 NULL,
    download_unavailable bool NULL,
    favourite_count int4 NULL,
    genre_id int4 NULL,
    health float8 NULL,
    hit_length float8 NULL,
    language_id int4 NULL,
    last_update timestamp NULL,
    max_combo int4 NULL,
    "mode" int4 NULL,
    overrall_difficulty float8 NULL,
    passcount int4 NULL,
    playcount int4 NULL,
    rating float8 NULL,
    "source" varchar NULL,
    star_rating float8 NULL,
    stars_aim float8 NULL,
    stars_speed float8 NULL,
    storyboard bool NULL,
    submit_date timestamp NULL,
    tags varchar NULL,
    title varchar NULL,
    total_length float8 NULL,
    "version" varchar NULL,
    video bool NULL,
    sys_last_update timestamp NULL,
    CONSTRAINT beatmaps_pkey PRIMARY KEY (beatmap_id)
);

CREATE TABLE public.reddit_post
(
    post_id varchar NOT NULL,
    title varchar NULL,
    created_utc float8 NULL,
    url varchar NULL,
    is_checked bool NULL DEFAULT false,
    CONSTRAINT reddit_post_pkey PRIMARY KEY (post_id)
);

CREATE TABLE public.scores
(
    id serial4 NOT NULL,
    beatmap_id int4 NULL,
    count_100 int4 NULL,
    count_300 int4 NULL,
    count_50 int4 NULL,
    count_geki int4 NULL,
    count_katu int4 NULL,
    count_miss int4 NULL,
    "date" timestamp NULL,
    max_combo int4 NULL,
    mods int4 NULL,
    perfect bool NULL,
    pp float8 NULL,
    "rank" varchar NULL,
    replay_available bool NULL,
    replay_id int8 NULL,
    score int8 NULL,
    user_id int8 NULL,
    username varchar NULL,
    replay bytea NULL,
    ur float8 NULL,
    frametime float8 NULL,
    hits int4 NULL,
    snaps int4 NULL,
    is_banned bool NULL DEFAULT false,
    CONSTRAINT scores_pkey PRIMARY KEY (id)
);

CREATE TABLE public.scores_similarity
(
    id serial4 NOT NULL,
    beatmap_id int4 NULL,
    replay_id_1 int8 NULL,
    replay_id_2 int8 NULL,
    similarity float8 NULL,
    correlation float8 NULL,
    created_at timestamp NULL,
    CONSTRAINT scores_similarity_pkey PRIMARY KEY (id)
);
CREATE INDEX idx_replay_ids ON public.scores_similarity USING btree (replay_id_1, replay_id_2);

CREATE TABLE public.users
(
    user_id bigserial NOT NULL,
    username varchar NULL,
    join_date timestamp NULL,
    country varchar NULL,
    country_rank int8 NULL,
    "rank" int8 NULL,
    pp_raw float8 NULL,
    "level" float8 NULL,
    accuracy float8 NULL,
    playcount int8 NULL,
    total_score int8 NULL,
    ranked_score int8 NULL,
    seconds_played int8 NULL,
    count_100 int8 NULL,
    count_300 int8 NULL,
    count_50 int8 NULL,
    count_rank_a int8 NULL,
    count_rank_s int8 NULL,
    count_rank_sh int8 NULL,
    count_rank_ss int8 NULL,
    count_rank_ssh int8 NULL,
    sys_last_update timestamp NULL,
    CONSTRAINT users_pkey PRIMARY KEY (user_id)
);

CREATE TABLE public.scores_judgements
(
    id serial4 NOT NULL,
    "time" float8 NULL,
    x float8 NULL,
    y float8 NULL,
    "type" public."judgement_type" NULL,
    distance_center float8 NULL,
    distance_edge float8 NULL,
    error float8 NULL,
    score_id int4 NULL,
    CONSTRAINT scores_judgements_pkey PRIMARY KEY (id),
    CONSTRAINT scores_judgements_score_id_fkey FOREIGN KEY (score_id) REFERENCES public.scores (id) ON DELETE CASCADE
);
@ -0,0 +1,7 @@
ALTER TABLE public.scores DROP COLUMN count_geki;
ALTER TABLE public.scores DROP COLUMN count_katu;
ALTER TABLE public.scores DROP COLUMN username;

ALTER TABLE public.scores RENAME COLUMN hits TO edge_hits;

ALTER TABLE public.scores ADD COLUMN adjusted_ur float8;
@ -0,0 +1,31 @@
ALTER TABLE public.beatmaps DROP COLUMN approach_rate;
ALTER TABLE public.beatmaps DROP COLUMN approved;
ALTER TABLE public.beatmaps DROP COLUMN approved_date;
ALTER TABLE public.beatmaps DROP COLUMN audio_unavailable;
ALTER TABLE public.beatmaps DROP COLUMN beatmap_hash;
ALTER TABLE public.beatmaps DROP COLUMN bpm;
ALTER TABLE public.beatmaps DROP COLUMN circle_size;
ALTER TABLE public.beatmaps DROP COLUMN count_hitcircles;
ALTER TABLE public.beatmaps DROP COLUMN count_sliders;
ALTER TABLE public.beatmaps DROP COLUMN count_spinners;
ALTER TABLE public.beatmaps DROP COLUMN creator_id;
ALTER TABLE public.beatmaps DROP COLUMN download_unavailable;
ALTER TABLE public.beatmaps DROP COLUMN favourite_count;
ALTER TABLE public.beatmaps DROP COLUMN genre_id;
ALTER TABLE public.beatmaps DROP COLUMN health;
ALTER TABLE public.beatmaps DROP COLUMN hit_length;
ALTER TABLE public.beatmaps DROP COLUMN language_id;
ALTER TABLE public.beatmaps DROP COLUMN last_update;
ALTER TABLE public.beatmaps DROP COLUMN max_combo;
ALTER TABLE public.beatmaps DROP COLUMN "mode";
ALTER TABLE public.beatmaps DROP COLUMN overrall_difficulty;
ALTER TABLE public.beatmaps DROP COLUMN passcount;
ALTER TABLE public.beatmaps DROP COLUMN playcount;
ALTER TABLE public.beatmaps DROP COLUMN rating;
ALTER TABLE public.beatmaps DROP COLUMN stars_aim;
ALTER TABLE public.beatmaps DROP COLUMN stars_speed;
ALTER TABLE public.beatmaps DROP COLUMN storyboard;
ALTER TABLE public.beatmaps DROP COLUMN submit_date;
ALTER TABLE public.beatmaps DROP COLUMN tags;
ALTER TABLE public.beatmaps DROP COLUMN total_length;
ALTER TABLE public.beatmaps DROP COLUMN video;
@ -0,0 +1,6 @@
ALTER TABLE public.users DROP COLUMN "level";
ALTER TABLE public.users DROP COLUMN count_rank_a;
ALTER TABLE public.users DROP COLUMN count_rank_s;
ALTER TABLE public.users DROP COLUMN count_rank_sh;
ALTER TABLE public.users DROP COLUMN count_rank_ss;
ALTER TABLE public.users DROP COLUMN count_rank_ssh;
@@ -0,0 +1,9 @@
ALTER TABLE public.scores ADD COLUMN mean_error float8;
ALTER TABLE public.scores ADD COLUMN error_variance float8;
ALTER TABLE public.scores ADD COLUMN error_standard_deviation float8;
ALTER TABLE public.scores ADD COLUMN minimum_error float8;
ALTER TABLE public.scores ADD COLUMN maximum_error float8;
ALTER TABLE public.scores ADD COLUMN error_range float8;
ALTER TABLE public.scores ADD COLUMN error_coefficient_of_variation float8;
ALTER TABLE public.scores ADD COLUMN error_kurtosis float8;
ALTER TABLE public.scores ADD COLUMN error_skewness float8;
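The nine columns above are summary statistics of a score's signed hit errors. A minimal sketch of how they might be computed before being written to the table, assuming hit errors arrive as a list of floats in milliseconds; the function name and the use of population (not sample) moments are assumptions, not taken from the repo:

```python
import statistics

def hit_error_stats(errors):
    """Compute the per-score statistics matching the migration's columns.

    `errors` is a non-empty list of signed hit errors (ms). Population
    moments are used throughout; this is an illustrative assumption.
    """
    n = len(errors)
    mean = statistics.fmean(errors)
    variance = statistics.pvariance(errors, mu=mean)
    stdev = variance ** 0.5
    lo, hi = min(errors), max(errors)
    # Third and fourth central moments for skewness and (non-excess) kurtosis.
    m3 = sum((x - mean) ** 3 for x in errors) / n
    m4 = sum((x - mean) ** 4 for x in errors) / n
    return {
        "mean_error": mean,
        "error_variance": variance,
        "error_standard_deviation": stdev,
        "minimum_error": lo,
        "maximum_error": hi,
        "error_range": hi - lo,
        "error_coefficient_of_variation": stdev / mean if mean else None,
        "error_skewness": m3 / stdev ** 3 if stdev else None,
        "error_kurtosis": m4 / stdev ** 4 if stdev else None,
    }
```

Each dict key matches one `float8` column, so a row update can be built directly from the result.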
@@ -0,0 +1 @@
ALTER TABLE public.scores ADD COLUMN sent_discord_notification boolean;
@@ -0,0 +1,2 @@
ALTER TABLE public.scores
    ADD COLUMN added_at timestamp with time zone default current_timestamp;
@@ -0,0 +1,18 @@
-- Index on the SCORES table for USER_ID
CREATE INDEX idx_scores_user_id ON SCORES (USER_ID);

-- Index on the SCORES table for BEATMAP_ID
CREATE INDEX idx_scores_beatmap_id ON SCORES (BEATMAP_ID);

-- Index on the SCORES table for REPLAY_ID
CREATE INDEX idx_scores_replay_id ON SCORES (REPLAY_ID);

-- Composite index on BEATMAP_ID and REPLAY_ID
CREATE INDEX idx_scores_beatmap_id_replay_id ON SCORES (BEATMAP_ID, REPLAY_ID);

-- Composite index on BEATMAP_ID, REPLAY_ID, and UR
CREATE INDEX idx_scores_beatmap_id_replay_id_ur ON SCORES (BEATMAP_ID, REPLAY_ID, UR);

-- Index on UR, useful if NULL values are common and filtering on them is frequent
CREATE INDEX idx_scores_ur ON SCORES (UR);

-- Index on the SCORES_JUDGEMENTS table for SCORE_ID
CREATE INDEX idx_scores_judgements_score_id ON SCORES_JUDGEMENTS (SCORE_ID);
@@ -0,0 +1,2 @@
ALTER TABLE public.scores
    ADD COLUMN version int default 0;
@@ -0,0 +1,2 @@
ALTER TABLE public.scores
    ADD CONSTRAINT replay_id_unique UNIQUE (replay_id);
@@ -0,0 +1,2 @@
ALTER TABLE public.scores_similarity
    ADD CONSTRAINT unique_beatmap_replay_ids UNIQUE (beatmap_id, replay_id_1, replay_id_2);
@@ -0,0 +1,3 @@
ALTER TABLE public.scores_similarity ADD COLUMN sent_discord_notification boolean;

UPDATE public.scores_similarity SET sent_discord_notification = true;
@@ -0,0 +1,2 @@
ALTER TABLE public.scores_similarity ADD COLUMN cg_similarity float8;
ALTER TABLE public.scores_similarity ADD COLUMN cg_correlation float8;
@@ -0,0 +1,7 @@
create table "public".update_user_queue(
    id serial primary key,
    user_id int8 not null,
    processed boolean not null default false,
    created_at timestamp not null default current_timestamp,
    processed_at timestamp
);
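The `update_user_queue` table is a simple work queue: producers insert a `user_id`, and a worker later claims unprocessed rows and stamps `processed_at`. A minimal in-memory sketch of that polling pattern, using stdlib `sqlite3` as a stand-in for Postgres (a real Postgres worker would typically add `FOR UPDATE SKIP LOCKED`; the function name and batch size are assumptions):

```python
import sqlite3

# sqlite3 stands in for Postgres here; "serial" becomes INTEGER PRIMARY KEY.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE update_user_queue(
        id INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL,
        processed BOOLEAN NOT NULL DEFAULT 0,
        created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
        processed_at TIMESTAMP
    )
""")
conn.executemany("INSERT INTO update_user_queue(user_id) VALUES (?)", [(11,), (22,)])

def claim_batch(conn, limit=10):
    """Fetch the oldest unprocessed user ids and mark them processed."""
    rows = conn.execute(
        "SELECT id, user_id FROM update_user_queue "
        "WHERE NOT processed ORDER BY id LIMIT ?", (limit,)
    ).fetchall()
    conn.executemany(
        "UPDATE update_user_queue SET processed = 1, "
        "processed_at = CURRENT_TIMESTAMP WHERE id = ?",
        [(row_id,) for row_id, _ in rows],
    )
    return [user_id for _, user_id in rows]

print(claim_batch(conn))  # claims both queued users: [11, 22]
print(claim_batch(conn))  # nothing left: []
```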
@@ -0,0 +1,122 @@
package com.nisemoe.nise.database

import com.nisemoe.generated.tables.references.BEATMAPS
import com.nisemoe.generated.tables.references.SCORES
import com.nisemoe.generated.tables.references.SCORES_SIMILARITY
import com.nisemoe.generated.tables.references.USERS
import com.nisemoe.nise.osu.OsuApi
import com.nisemoe.nise.osu.TokenService
import com.nisemoe.nise.scheduler.GlobalCache
import org.jooq.DSLContext
import org.junit.jupiter.api.Assertions.*
import org.junit.jupiter.api.Test
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.boot.test.mock.mockito.MockBean
import org.springframework.context.annotation.Import
import org.springframework.test.context.ActiveProfiles
import org.springframework.test.context.DynamicPropertyRegistry
import org.springframework.test.context.DynamicPropertySource
import org.testcontainers.containers.PostgreSQLContainer
import org.testcontainers.junit.jupiter.Container
import org.testcontainers.junit.jupiter.Testcontainers
import java.time.LocalDateTime

@SpringBootTest
@ActiveProfiles("postgres")
@MockBean(GlobalCache::class, OsuApi::class, TokenService::class, UserService::class)
@Testcontainers
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Import(ScoreService::class)
class ScoreServiceTest {

    companion object {
        @Container
        private val postgresContainer = PostgreSQLContainer<Nothing>("postgres:13").apply {
            withDatabaseName("testdb")
            withUsername("testuser")
            withPassword("testpass")
            start()
        }

        @DynamicPropertySource
        @JvmStatic
        fun registerDynamicProperties(registry: DynamicPropertyRegistry) {
            registry.add("spring.datasource.url", postgresContainer::getJdbcUrl)
            registry.add("spring.datasource.username", postgresContainer::getUsername)
            registry.add("spring.datasource.password", postgresContainer::getPassword)
        }
    }

    @Autowired
    lateinit var scoreService: ScoreService

    @Autowired
    lateinit var dslContext: DSLContext

    @Test
    fun `should only return replay1`() {
        // Seed a similarity pair between replay 1 (user 1) and replay 2 (user 2)
        dslContext.insertInto(SCORES_SIMILARITY)
            .set(SCORES_SIMILARITY.BEATMAP_ID, 1)
            .set(SCORES_SIMILARITY.REPLAY_ID_1, 1)
            .set(SCORES_SIMILARITY.REPLAY_ID_2, 2)
            .set(SCORES_SIMILARITY.SIMILARITY, 0.5)
            .execute()

        dslContext.insertInto(BEATMAPS)
            .set(BEATMAPS.BEATMAP_ID, 1)
            .set(BEATMAPS.BEATMAPSET_ID, 1)
            .set(BEATMAPS.TITLE, "test")
            .execute()

        dslContext.insertInto(SCORES)
            .set(SCORES.REPLAY_ID, 1)
            .set(SCORES.USER_ID, 1)
            .set(SCORES.DATE, LocalDateTime.now().minusDays(90))
            .execute()

        dslContext.insertInto(USERS)
            .set(USERS.USER_ID, 1)
            .set(USERS.USERNAME, "good boy")
            .execute()

        dslContext.insertInto(SCORES)
            .set(SCORES.REPLAY_ID, 2)
            .set(SCORES.USER_ID, 2)
            .set(SCORES.DATE, LocalDateTime.now().minusDays(180))
            .execute()

        dslContext.insertInto(USERS)
            .set(USERS.USER_ID, 2)
            .set(USERS.USERNAME, "naughty boy")
            .execute()

        // Seed a second similarity pair (replays 4 and 3) that should not be returned
        dslContext.insertInto(SCORES_SIMILARITY)
            .set(SCORES_SIMILARITY.BEATMAP_ID, 1)
            .set(SCORES_SIMILARITY.REPLAY_ID_1, 4)
            .set(SCORES_SIMILARITY.REPLAY_ID_2, 3)
            .set(SCORES_SIMILARITY.SIMILARITY, 0.5)
            .execute()

        dslContext.insertInto(SCORES)
            .set(SCORES.REPLAY_ID, 3)
            .set(SCORES.USER_ID, 1)
            .set(SCORES.DATE, LocalDateTime.now().minusDays(90))
            .execute()

        dslContext.insertInto(SCORES)
            .set(SCORES.REPLAY_ID, 4)
            .set(SCORES.USER_ID, 2)
            .set(SCORES.DATE, LocalDateTime.now().minusDays(90))
            .execute()

        // Only the pair containing replay 1 should come back for user 1
        val results = scoreService.getSimilarReplaysForUserId(1)
        assertNotNull(results)
        assertEquals(1, results.size)
        assertEquals("naughty boy", results.first().username_1)
    }

}
23
nise-circleguard/Dockerfile
Normal file
@@ -0,0 +1,23 @@
FROM python:3.11.8-slim

ENV version=2
ENV PYTHONPATH /app

WORKDIR /app

RUN apt update

COPY requirements.txt ./requirements.txt

RUN pip3 install --upgrade pip && \
    pip3 install -r requirements.txt

# This is *really* bad, but I'd rather get this working than fork the packages and re-publish them.
# It'll probably break some day.
RUN sed -i 's/events: List\[Event\] = Field(deserialize_type=List\[_Event\])/events: List\[_Event\] = Field(deserialize_type=List\[_Event\])/' /usr/local/lib/python3.11/site-packages/ossapi/models.py && \
    sed -i 's/self\._conn = sqlite3.connect(str(cache_path))/self._conn = sqlite3.connect(str(cache_path), check_same_thread=False)/' /usr/local/lib/python3.11/site-packages/circleguard/loader.py && \
    sed -i "64s|.*| self._db = db = sqlite3.connect(str(path / '.slider.db'), check_same_thread=False)|" /usr/local/lib/python3.11/site-packages/slider/library.py

COPY ./src/ ./src/

CMD ["python", "src/main.py"]
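Two of the `sed` patches above pass `check_same_thread=False` so that the SQLite caches circleguard and slider open on one thread can be queried from another. A minimal stdlib illustration of what that flag permits (the table name and values are made up for the demo):

```python
import sqlite3
import threading

# Without check_same_thread=False, using this connection from the worker
# thread below would raise sqlite3.ProgrammingError.
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE cache(k TEXT PRIMARY KEY, v TEXT)")

def worker():
    # Runs on a different thread than the one that opened the connection.
    conn.execute("INSERT INTO cache VALUES ('a', '1')")

t = threading.Thread(target=worker)
t.start()
t.join()
print(conn.execute("SELECT v FROM cache WHERE k = 'a'").fetchone()[0])  # 1
```

Note the flag only disables the thread check; the caller is still responsible for serializing access (here the `join()` guarantees the write finishes before the read).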
10
nise-circleguard/Publish.sh
Executable file
@@ -0,0 +1,10 @@
#!/usr/bin/zsh

# Git actions
git add .
git commit -m "$(date +%Y-%m-%d)"
git push origin main

# Docker actions
docker build . -t git.gengo.tech/nuff/nise-circleguard:latest
docker push git.gengo.tech/nuff/nise-circleguard:latest
22
nise-circleguard/readme.md
Normal file
@@ -0,0 +1,22 @@
# development

This module has only been tested with Python 3.11.

How to run:

1. Create a venv and install the requirements:

```bash
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

2. Run the main.py file:

```bash
source venv/bin/activate
python src/main.py
```

Make sure to set the `OSU_API_KEY` env variable to your osu!api v1 key.
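Since a missing `OSU_API_KEY` is an easy mistake, a fail-fast check at startup gives a clearer error than a failed API call later. A hypothetical helper sketching this (neither the function name nor its presence in `main.py` comes from the repo; only the `OSU_API_KEY` variable name does):

```python
import os

def require_api_key(env=os.environ):
    """Return the osu!api v1 key, or exit with a clear message if unset."""
    key = env.get("OSU_API_KEY")
    if not key:
        raise SystemExit("OSU_API_KEY is not set; export your osu!api v1 key first.")
    return key

# Demo with an explicit dict instead of the real environment.
print(require_api_key({"OSU_API_KEY": "demo-key"}))  # demo-key
```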
3
nise-circleguard/requirements.txt
Normal file
@@ -0,0 +1,3 @@
ossapi==3.4.3
circleguard==5.4.1
flask==3.0.2
Some files were not shown because too many files have changed in this diff.