Commit
Merge pull request #594 from 47degrees/map2eval-override
Add map2Eval override
sloshy authored Feb 3, 2022
2 parents 2bb545b + 4f72c19 commit 923f8bf
Showing 5 changed files with 47 additions and 68 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -1,6 +1,6 @@
# Fetch

[![Join the chat at https://gitter.im/47deg/fetch](https://badges.gitter.im/47deg/fetch.svg)](https://gitter.im/47deg/fetch?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Maven Central](https://img.shields.io/badge/maven%20central-1.2.1-green.svg)](https://oss.sonatype.org/#nexus-search;gav~com.47deg~fetch*) [![License](https://img.shields.io/badge/license-Apache%202-blue.svg)](https://raw.githubusercontent.com/47deg/fetch/master/LICENSE) [![Latest version](https://img.shields.io/badge/fetch-1.2.1-green.svg)](https://index.scala-lang.org/47deg/fetch) [![Scala.js](http://scala-js.org/assets/badges/scalajs-0.6.15.svg)](http://scala-js.org) [![GitHub Issues](https://img.shields.io/github/issues/47deg/fetch.svg)](https://github.com/47deg/fetch/issues)
[![Join the chat at https://gitter.im/47deg/fetch](https://badges.gitter.im/47deg/fetch.svg)](https://gitter.im/47deg/fetch?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Maven Central](https://img.shields.io/badge/maven%20central-3.1.0-green.svg)](https://oss.sonatype.org/#nexus-search;gav~com.47deg~fetch*) [![License](https://img.shields.io/badge/license-Apache%202-blue.svg)](https://raw.githubusercontent.com/47deg/fetch/master/LICENSE) [![Latest version](https://img.shields.io/badge/fetch-3.1.0-green.svg)](https://index.scala-lang.org/47deg/fetch) [![Scala.js](https://www.scala-js.org/assets/badges/scalajs-1.8.0.svg)](http://scala-js.org) [![GitHub Issues](https://img.shields.io/github/issues/47deg/fetch.svg)](https://github.com/47deg/fetch/issues)

A library for Simple & Efficient data access in Scala and Scala.js

@@ -24,13 +24,13 @@ Add the following dependency to your project's build file.
For Scala 2.12.x through 3.x:

```scala
"com.47deg" %% "fetch" % "3.0.0"
"com.47deg" %% "fetch" % "3.1.0"
```

Or, if using Scala.js (1.8.x):

```scala
"com.47deg" %%% "fetch" % "3.0.0"
"com.47deg" %%% "fetch" % "3.1.0"
```


25 changes: 9 additions & 16 deletions fetch/src/main/scala/fetch.scala
@@ -303,7 +303,11 @@ object `package` {
}
} yield result)

override def product[A, B](fa: Fetch[F, A], fb: Fetch[F, B]): Fetch[F, (A, B)] = {
override def map2Eval[A, B, Z](fa: Fetch[F, A], fb: Eval[Fetch[F, B]])(
f: (A, B) => Z
): Eval[Fetch[F, Z]] = Eval.now(map2(fa, fb.value)(f))

override def product[A, B](fa: Fetch[F, A], fb: Fetch[F, B]): Fetch[F, (A, B)] =
Unfetch[F, (A, B)](for {
fab <- (fa.run, fb.run).tupled
result = fab match {
@@ -321,7 +325,6 @@ object `package` {
Throw[F, (A, B)](e)
}
} yield result)
}

override def productR[A, B](fa: Fetch[F, A])(fb: Fetch[F, B]): Fetch[F, B] =
Unfetch[F, B](for {
@@ -365,21 +368,11 @@ object `package` {
* Given a number of fetches, returns all of the results in a `List`. In the event that multiple
* fetches are made to the same data source, this will attempt to batch them together.
*
* This should be used in code that previously relied on the auto-batching behavior of calling
* `traverse` on lists of `Fetch` values.
* As of 3.1.x, this is functionally equivalent to using `.sequence` syntax from Cats on any
* data structure implementing `Traverse`.
*/
def batchAll[F[_]: Monad, A](fetches: Fetch[F, A]*): Fetch[F, List[A]] = {
fetches.toList.toNel
.map { nes =>
nes
.map(_.map(Chain.one(_)))
.reduceLeft { (fa, fb) =>
fetchM[F].map2(fa, fb)((a, b) => a ++ b)
}
.map(_.toList)
}
.getOrElse(Fetch.pure[F, List[A]](List.empty))
}
def batchAll[F[_]: Monad, A](fetches: Fetch[F, A]*): Fetch[F, List[A]] =
fetches.toList.sequence

def exception[F[_]: Applicative, A](e: Log => FetchException): Fetch[F, A] =
Unfetch(Applicative[F].pure(Throw[F, A](e)))
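The `map2Eval` override added above is the crux of this change: Cats' `Traverse` machinery builds its result through `map2Eval`, so delegating it to `map2` (and therefore to Fetch's batching `product`) is what lets `.sequence` and `.traverse` batch again. A minimal sketch of the observable effect (assuming a batchable `one` data source like the one used in the test suite below):

```scala
import cats.effect.{Concurrent, IO}
import cats.syntax.all._
import fetch._

// Assumes `one: Int => Fetch[F, Int]` backed by a batchable DataSource,
// as defined in FetchTests further down this page.
def traversed[F[_]: Concurrent]: Fetch[F, List[Int]] =
  List(1, 2, 3).traverse(one[F]) // traverse -> map2Eval -> map2 -> batching product
```

With this override in place, running `Fetch.runLog[IO](traversed)` performs the three requests in a single batched round, which is exactly what the updated tests below assert.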
4 changes: 2 additions & 2 deletions fetch/src/test/scala/FetchReportingTests.scala
@@ -84,7 +84,7 @@ class FetchReportingTests extends FetchSpec {
}.unsafeToFuture()
}

"Single fetches combined with traverse are NOT run in one round" in {
"Single fetches combined with traverse are run in one round" in {
def fetch[F[_]: Concurrent] =
for {
manies <- many(3) // round 1
@@ -94,7 +94,7 @@ val io = Fetch.runLog[IO](fetch)
val io = Fetch.runLog[IO](fetch)

io.map { case (log, result) =>
log.rounds.size shouldEqual 4
log.rounds.size shouldEqual 2
}.unsafeToFuture()
}

32 changes: 16 additions & 16 deletions fetch/src/test/scala/FetchTests.scala
@@ -155,7 +155,7 @@ class FetchTests extends FetchSpec {
}.unsafeToFuture()
}

"Traversals are NOT implicitly batched" in {
"Traversals are implicitly batched" in {
def fetch[F[_]: Concurrent]: Fetch[F, List[Int]] =
for {
manies <- many(3)
@@ -166,21 +166,21 @@

io.map { case (log, result) =>
result shouldEqual List(0, 1, 2)
log.rounds.size shouldEqual 4
log.rounds.size shouldEqual 2
}.unsafeToFuture()
}

"Sequencing is NOT implicitly batched" in {
"Sequencing is implicitly batched" in {
def fetch[F[_]: Concurrent]: Fetch[F, List[Int]] =
List(one(1), one(2), one(3)).sequence

val io = Fetch.runLog[IO](fetch)

io.map { case (log, result) =>
result shouldEqual List(1, 2, 3)
log.rounds.size shouldEqual 3
log.rounds.size shouldEqual 1
totalFetched(log.rounds) shouldEqual 3
totalBatches(log.rounds) shouldEqual 0
totalBatches(log.rounds) shouldEqual 1
}.unsafeToFuture()
}

@@ -360,16 +360,16 @@ class FetchTests extends FetchSpec {
}.unsafeToFuture()
}

"Sequenced fetches are NOT run concurrently" in {
"Sequenced fetches are run concurrently" in {
def fetch[F[_]: Concurrent]: Fetch[F, List[Int]] =
List(one(1), one(2), one(3), anotherOne(4), anotherOne(5)).sequence

val io = Fetch.runLog[IO](fetch)

io.map { case (log, result) =>
result shouldEqual List(1, 2, 3, 4, 5)
log.rounds.size shouldEqual 5
totalBatches(log.rounds) shouldEqual 0
log.rounds.size shouldEqual 1
totalBatches(log.rounds) shouldEqual 2
}.unsafeToFuture()
}

@@ -386,16 +386,16 @@ class FetchTests extends FetchSpec {
}.unsafeToFuture()
}

"Sequenced fetches are NOT deduped" in {
"Sequenced fetches are deduped" in {
def fetch[F[_]: Concurrent]: Fetch[F, List[Int]] =
List(one(1), one(2), one(1)).sequence

val io = Fetch.runLog[IO](fetch)

io.map { case (log, result) =>
result shouldEqual List(1, 2, 1)
log.rounds.size shouldEqual 2
totalBatches(log.rounds) shouldEqual 0
log.rounds.size shouldEqual 1
totalBatches(log.rounds) shouldEqual 1
totalFetched(log.rounds) shouldEqual 2
}.unsafeToFuture()
}
@@ -414,16 +414,16 @@ class FetchTests extends FetchSpec {
}.unsafeToFuture()
}

"Traversals are NOT batched" in {
"Traversals are batched" in {
def fetch[F[_]: Concurrent]: Fetch[F, List[Int]] =
List(1, 2, 3).traverse(one[F])

val io = Fetch.runLog[IO](fetch)

io.map { case (log, result) =>
result shouldEqual List(1, 2, 3)
log.rounds.size shouldEqual 3
totalBatches(log.rounds) shouldEqual 0
log.rounds.size shouldEqual 1
totalBatches(log.rounds) shouldEqual 1
}.unsafeToFuture()
}

@@ -440,7 +440,7 @@ class FetchTests extends FetchSpec {
}.unsafeToFuture()
}

"Sources that can be fetched concurrently inside a for comprehension will NOT automatically be concurrent" in {
"Sources that can be fetched concurrently inside a for comprehension will automatically be concurrent" in {
def fetch[F[_]: Concurrent] =
for {
v <- Fetch.pure[F, List[Int]](List(1, 2, 1))
Expand All @@ -451,7 +451,7 @@ class FetchTests extends FetchSpec {

io.map { case (log, result) =>
result shouldEqual List(1, 2, 1)
log.rounds.size shouldEqual 2
log.rounds.size shouldEqual 1
totalFetched(log.rounds) shouldEqual 2
}.unsafeToFuture()
}
48 changes: 17 additions & 31 deletions microsite/docs/docs.md
@@ -458,37 +458,20 @@ The above example combines data from two different sources, and the library know
Fetch.run[IO](fetchConcurrent).unsafeRunTimed(5.seconds)
```

## BatchAll
## Auto-batching

In versions of Fetch before 3.0.0, calls to `sequence` or `traverse` on sequences of fetches would automatically try to batch or run fetches concurrently where possible.
This is no longer the case as of version 3.0.0, which is changed to explicitly batch requests with a new combinator.
Fetch automatically batches multiple fetch requests made in sequence using various combinators.
This means that if you make multiple requests at once using Cats combinators such as `.sequence` or `.traverse`, your requests will be batched and run as efficiently as possible, every time.

Consider the following code:
In Fetch 2.x and 3.1.x, calls to `sequence` or `traverse` on sequences of fetches automatically try to batch or run fetches concurrently where possible.
In Fetch 3.0.0, however, we briefly moved away from guaranteeing batches on sequences and introduced explicit batching combinators to compensate.
In hindsight, we felt those ideas were best explored in other projects, so we have reverted to the 2.x behavior while keeping the new syntax so that existing projects do not break.

```scala mdoc:silent
def listOfFetches[F[_]: Console: Temporal]: List[Fetch[F, Post]] = List(1, 2, 3).map(getPost[F])
val sequencedList: Fetch[IO, List[Post]] = listOfFetches[IO].sequence
```

In versions <3.0.0, you could call `.sequence` on this list and it would make a single request for IDs 1, 2, and 3 if possible in a batch.
In versions >=3.0.0, calling `.sequence` is essentially the same as this code:

```scala mdoc:silent
val sequencedListEquivalent = getPost[IO](1).flatMap { postOne =>
getPost[IO](2).flatMap { postTwo =>
getPost[IO](3).map { postThree =>
List(postOne, postTwo, postThree)
}
}
}
```

As mentioned above, you can use `.flatMap` to sequence fetches, but not batch them.
Therefore we have added a new combinator that guarantees batching and we do not support batching via `.sequence` or `.traverse` anymore.
Here is an example showing how to batch fetches using `Fetch.batchAll`

```scala mdoc:silent
val batchedList: Fetch[IO, List[Post]] = Fetch.batchAll(listOfFetches[IO]: _*)
val listOfFetches = List(1, 2, 3).map(getPost[IO])
val batchedList: Fetch[IO, List[Post]] = Fetch.batchAll(listOfFetches: _*)
```

You can also use helpful syntax by importing `fetch.syntax._` for batching sequences, like so:
@@ -497,12 +480,15 @@ You can also use helpful syntax by importing `fetch.syntax._` for batching seque
import fetch.syntax._

//Takes a sequence of fetches and batches them
val batchedListWithSyntax = listOfFetches[IO].batchAll
val batchedListWithSyntax = listOfFetches.batchAll

//Allows you to supply your own function to batch a sequence as fetches
val listToBatchWithSyntax = List(1, 2, 3).batchAllWith(id => getPost[IO](id))
```

Under the hood, `.batchAll` and its siblings are equivalent to the Cats methods `.sequence` and `.traverse`, except that they explicitly convert the final result to a `List`.
If you currently use `.sequence` or `.traverse`, your sequences of fetches will be batched automatically as before; `.batchAll` remains available for code that prefers not to rely on Cats syntax.
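To illustrate that equivalence, the following spellings should all produce a single batched round (a sketch, assuming the `getPost` source defined earlier in these docs and `cats.syntax.all._` in scope):

```scala
val viaTraverse: Fetch[IO, List[Post]] = List(1, 2, 3).traverse(getPost[IO])
val viaSequence: Fetch[IO, List[Post]] = List(1, 2, 3).map(getPost[IO]).sequence
val viaBatchAll: Fetch[IO, List[Post]] = Fetch.batchAll(List(1, 2, 3).map(getPost[IO]): _*)
```

All three go through the same batching applicative, so they differ only in which result type the combinator fixes (`batchAll` always yields a `List`).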

# Caching

As we have learned, Fetch caches intermediate results implicitly. You can
@@ -531,7 +517,7 @@ in the cache.

```scala mdoc:silent
def fetchManyUsers[F[_]: Console: Temporal]: Fetch[F, List[User]] =
List(1, 2, 3).batchAllWith(getUser[F])
List(1, 2, 3).traverse(getUser[F])
```

If only part of the data is cached, the cached data won't be asked for:
@@ -631,7 +617,7 @@ than two users:

```scala mdoc
def fetchManyBatchedUsers[F[_]: Console: Temporal]: Fetch[F, List[User]] =
List(1, 2, 3, 4).batchAllWith(getBatchedUser[F])
List(1, 2, 3, 4).traverse(getBatchedUser[F])

Fetch.run[IO](fetchManyBatchedUsers).unsafeRunTimed(5.seconds)
```
@@ -669,7 +655,7 @@ We have defined the maximum batch size to be 2 and the batch execution to be seq

```scala mdoc
def fetchManySeqBatchedUsers[F[_]: Console: Temporal]: Fetch[F, List[User]] =
List(1, 2, 3, 4).batchAllWith(getSequentialUser[F])
List(1, 2, 3, 4).traverse(getSequentialUser[F])

Fetch.run[IO](fetchManySeqBatchedUsers).unsafeRunTimed(5.seconds)
```
@@ -886,7 +872,7 @@ visualize fetch executions using the execution log.

```scala mdoc:silent
def batched[F[_]: Console: Temporal]: Fetch[F, List[User]] =
List(1, 2).batchAllWith(getUser[F])
List(1, 2).traverse(getUser[F])

def cached[F[_]: Console: Temporal]: Fetch[F, User] =
getUser(2)
@@ -895,7 +881,7 @@ def notCached[F[_]: Console: Temporal]: Fetch[F, User] =
getUser(4)

def concurrent[F[_]: Console: Temporal]: Fetch[F, (List[User], List[Post])] =
(List(1, 2, 3).batchAllWith(getUser[F]), List(1, 2, 3).batchAllWith(getPost[F])).tupled
(List(1, 2, 3).traverse(getUser[F]), List(1, 2, 3).traverse(getPost[F])).tupled

def interestingFetch[F[_]: Console: Temporal]: Fetch[F, String] =
batched >> cached >> notCached >> concurrent >> Fetch.pure("done")
