Premieres sometime in 2025. Check out the ANN article for additional information. Synopsis of the source manga from AniList:

She will captivate anyone she sucks blood from!

Runa Ishikawa is a vampire. The cool and mysterious beauty is the most popular girl in class! But it turns out she’s not very good at sucking blood? A new comedy about the pampering and feeding of a vampire!

    • @[email protected]

      It would have cost you nothing to not say this. What a horrible day to be able to read…

  • asudox

    She might not be able to suck blood, but definitely something else ( ͡° ͜ʖ ͡°)

        • Elevator7009

          I do think a lot of people online tend to forget that teenagers use the internet when someone expresses attraction. I distinctly remember reading a comment on Reddit where the commenter had expressed attraction to JoJo Siwa online as a child, got called a pedophile, and was left worrying that their same-age attraction was pedophilia. So thanks for remembering.

        • @[email protected]

          Name…absolutely does not check out. Saliently enough, have you managed to try DeepSeek, or even get it set up locally?

          • @[email protected]

            > Name…absolutely does not check out.

            Uhh, oh, fair enough (゚∀゚)

            > Saliently enough, have you managed to try DeepSeek, or even get it set up locally?

            Yeah, I’ve successfully run the cut-down version of deepseek-r1 through Ollama. The model itself is the 7b (I’m VRAM-limited to 8GB). I ran it on an M1 Mac Mini; performance-wise it’s fast, and the quality of the generated output is okay.

            Depending on your hardware and OS, you may or may not be able to run an LLM locally at a reasonable speed. You might want to check Ollama’s GPU support. You don’t need a GPU, since it can run on the CPU, but it’ll certainly be slower.
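
            In case it helps, here’s a minimal sketch of querying the model through Ollama’s local HTTP API (assuming the default port 11434, and that you’ve already pulled the model with `ollama pull deepseek-r1:7b`):

            ```python
            import requests

            # Ollama serves a local HTTP API on port 11434 by default.
            # Assumes the model was pulled first: ollama pull deepseek-r1:7b
            resp = requests.post(
                "http://localhost:11434/api/generate",
                json={
                    "model": "deepseek-r1:7b",
                    "prompt": "Explain in one sentence what VRAM limits mean for local LLMs.",
                    "stream": False,  # return one JSON object instead of a token stream
                },
                timeout=300,
            )
            resp.raise_for_status()
            print(resp.json()["response"])
            ```

            The same call works whether Ollama is using the GPU or falling back to the CPU; only the response time changes.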

            • @[email protected]

              I have a very beefy PC, so I don’t think VRAM or any hardware will really be the limitation, thankfully. Thanks for the links!