
Nuxt client #998

Open · JL-Code opened this issue Sep 2, 2024 · 10 comments · May be fixed by #1519

Labels: feature 🚀 New feature or request · prioritized 🚚 This issue has been prioritized and will be worked on soon · RSVP 👍👎 Explicit request for reactions to gauge interest in resolving this issue

Comments

JL-Code commented Sep 2, 2024

Please vote on this issue if you're interested in a Nuxt client.

This issue will serve as a to-do list during development. To help us release this feature as quickly as possible, feel free to share the minimum requirements for your use case.

JL-Code added the feature 🚀 New feature or request label Sep 2, 2024

mrlubos commented Sep 2, 2024

Hey @JL-Code, are you able to specify what is needed to enable this?

@rigtigeEmil

Hey. I'm also quite interested in this. Nuxt has a few composables that I think @JL-Code might've been referring to: useFetch and useAsyncData.
https://nuxt.com/docs/getting-started/data-fetching

mrlubos added the RSVP 👍👎 Explicit request for reactions to gauge interest in resolving this issue label Nov 9, 2024
@rigtigeEmil

@mrlubos what sort of interest are you looking for before prioritizing building new clients?
(huge fan of this tool, saves so much time, so thanks a bunch 🫶)


mrlubos commented Dec 27, 2024

Thank you @rigtigeEmil! Honestly, I think there's enough interest in Nuxt now; I will add it to the docs so I don't forget. Can you help me understand how you manage with the current setup? Are you using fetch? Something custom?

@rigtigeEmil

Yeah. With the current setup, I just use fetch. As in other SSR frameworks, it's not ideal, since it can cause hydration issues and invokes the backing API twice (once on the server and once on the client side). Most of my app is not SSR, and for that entire part I just use @hey-api/sdk. For the part that is SSR, I have just done it old-school and handwritten the few endpoints I need.

Nuxt's useFetch and useAsyncData solve this problem and provide a few more things that might be nice to be able to configure, although I don't personally have a need for them. useFetch is just a small wrapper around useAsyncData, so in terms of (my) need for an integration, just supporting useFetch would be completely fine.
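
For reference, here's a rough sketch of that handwritten SSR approach (the endpoint URL and the Pet type below are placeholders, not from a real app):

// Hand-maintained type standing in for what the generated SDK would provide.
interface Pet {
  id: number;
  name: string;
}

// Handwritten useFetch call; the URL construction and typing are manual here,
// which is exactly the part a generated Nuxt client would take over.
const { data: pet, error } = await useFetch<Pet>(
  'https://petstore3.swagger.io/api/v3/pet/8',
);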

mrlubos changed the title from "Available Clients add Nuxt.js support" to "Nuxt client" Dec 27, 2024
mrlubos added the prioritized 🚚 This issue has been prioritized and will be worked on soon label Dec 28, 2024

mrlubos commented Dec 28, 2024

I'm looking into this, @rigtigeEmil, and will let you know if I have any updates. There are like 6 different APIs that need to be handled; I'll see what we can come up with.


mrlubos linked a pull request Dec 31, 2024 that will close this issue

mrlubos commented Dec 31, 2024

Hey all, have a look at the pull request referenced above if you're interested in progress.

@rigtigeEmil et al., how does this API look to you? https://github.com/hey-api/openapi-ts/blob/d3fccb07871fd5e6d8977a45760c2c0fe0466ae3/examples/openapi-ts-nuxt/components/home.vue

I've included native Nuxt methods for comparison vs the generated client. It will look very similar to the current clients, with a few caveats:

  • we need to return the same response Nuxt APIs return, i.e. we cannot await the result inside and do fancy stuff with it
  • you'll need to specify the Nuxt API you want to use in SDKs/clients

That's what I've got so far. Here's how it would look:

const data = await getPetById({
  composable: '$fetch',
  path: {
    petId: 8,
  },
});

I tried doing away with the composable field, but we need it. Otherwise, you'd need to define data and error types on each call manually and that would defeat the whole purpose of generating SDKs.

The available composable values are:

  • $fetch
  • useAsyncData
  • useFetch
  • useLazyAsyncData
  • useLazyFetch
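
To illustrate the earlier point about types, here's an assumed sketch (not the actual generated code) of how the composable field can double as a type-level discriminant, so each call site gets the right return type without manual annotations:

import type { AsyncData } from '#app';

type Composable =
  | '$fetch'
  | 'useAsyncData'
  | 'useFetch'
  | 'useLazyAsyncData'
  | 'useLazyFetch';

// Pet stands in for the generated response type of getPetById.
interface Pet {
  id: number;
  name: string;
}

// $fetch resolves to the data directly; the other composables return
// Nuxt's AsyncData wrapper.
type ComposableResult<C extends Composable, TData, TError> =
  C extends '$fetch' ? Promise<TData> : AsyncData<TData | null, TError | null>;

// Hypothetical generated signature: choosing the composable picks both the
// runtime code path and the compile-time return type.
declare function getPetById<C extends Composable>(options: {
  composable: C;
  path: { petId: number };
}): ComposableResult<C, Pet, Error>;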

So far, this is how each API is implemented in the client. It's far from finished, but it should give you an idea:

const clientRequest: Client['clientRequest'] = ({
  asyncDataOptions,
  composable,
  fetchOptions,
  key,
  requestFetch,
  url,
  ...opts
}) => {
  const fetchFn = requestFetch ?? $fetch;
  const baseUrl = 'https://petstore3.swagger.io/api/v3';
  const finalUrl = `${baseUrl}${url}`;
  if (composable === '$fetch') {
    return fetchFn(finalUrl, opts);
  }
  if (composable === 'useFetch') {
    return useFetch(finalUrl, {
      ...opts,
      ...fetchOptions,
    });
  }
  if (composable === 'useLazyFetch') {
    return useLazyFetch(finalUrl, {
      ...opts,
      ...fetchOptions,
    });
  }
  const handler: (ctx?: NuxtApp) => Promise<any> = () =>
    fetchFn(finalUrl, opts);
  if (composable === 'useAsyncData') {
    return key
      ? useAsyncData(key, handler, asyncDataOptions)
      : useAsyncData(handler, asyncDataOptions);
  }
  if (composable === 'useLazyAsyncData') {
    return key
      ? useLazyAsyncData(key, handler, asyncDataOptions)
      : useLazyAsyncData(handler, asyncDataOptions);
  }
  return undefined as any;
};

You'll be able to pass the key argument if necessary, or configure composable APIs through fetchOptions and asyncDataOptions fields respectively. Regular $fetch will be configurable through the main call, just like the existing clients. Lastly, you can pass requestFetch if needed and that will be used instead of $fetch.
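
To make that concrete, here's a rough usage sketch based on the options described above (the exact shapes may still change; lazy and immediate are ordinary Nuxt options used here only for illustration):

// useAsyncData: pass a key and forward Nuxt options via asyncDataOptions.
const { data: pet, refresh } = await getPetById({
  asyncDataOptions: {
    lazy: true,
  },
  composable: 'useAsyncData',
  key: 'pet-8',
  path: {
    petId: 8,
  },
});

// useFetch: per-composable options travel through fetchOptions instead.
const { data: petLazy, execute } = await getPetById({
  composable: 'useFetch',
  fetchOptions: {
    immediate: false,
  },
  path: {
    petId: 8,
  },
});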

Any thoughts so far?


rigtigeEmil commented Dec 31, 2024

With this implementation, would we still have access to some of the goodies the various composables provide?

I'm not sure how familiar you are with Nuxt, but these composables react automatically to reactive variables, meaning I can provide reactive variables to useFetch (or the other composables), and whenever these variables change, the endpoint is invoked automatically:

const message = ref('foo');
const { data } = await useFetch('/echo', {
	method: 'POST',
	body: message,
});
// data will be `foo`

// useFetch will automatically re-trigger after this assignment since the ref has changed
message.value = 'bar';
// data will now be `bar`

It seems like we're returning the useFetch composable directly, which is generally not something you'd want to do in a typical Nuxt component - you generally want to set up the composable once, and then either:

  1. rely on the reactive variables changing to make new calls
  2. explicitly set immediate and trigger the invocation manually from the returned execute function

I'm not sure this is a problem though, if these functionalities are limited when used through a library.

Would it be possible to specify a default in settings? For instance, I would probably want to use useFetch in most cases, simply to avoid duplicate calls on server/client, and just be able to use some of the other composables (where a key is almost mandatory)

edit: there's a nice video going over some of the caveats here, if you're interested and haven't seen it already:
https://www.youtube.com/watch?v=njsGVmcWviY


mrlubos commented Dec 31, 2024

@rigtigeEmil

With this implementation, would we still have access to some of the goodies the various composables provide?

That's the goal! Otherwise, you could use the good old Fetch or Axios client. If this goal isn't achieved, it won't make sense to release and maintain another client.

It seems like we're returning the useFetch composable directly, which is generally not something you'd want to do in a typical Nuxt component.

What would you return instead? If we look at your example...

const message = ref('foo');
const { data } = await useFetch('/echo', {
	method: 'POST',
	body: message,
});
// data will be `foo`

// useFetch will automatically re-trigger after this assignment since the ref has changed
message.value = 'bar';
// data will now be `bar`

...this is how it would change with the Nuxt client.

const message = ref('foo');
const { data } = await postEcho({
	body: message,
	composable: 'useFetch',
});
// data will be `foo`

// postEcho will automatically re-trigger after this assignment since the ref has changed
message.value = 'bar';
// data will now be `bar`

Can you describe why this isn't recommended? In my view, this would give you the most control over the Nuxt response. Back to your points:

you generally want to setup the composable once

This has been accomplished by calling the client function.

rely on the reactive variables changing to make new calls

This will continue working, no need to use the awaited result.

explicitly set immediate and trigger the invocation manually from the returned execute function

This can be done by passing an option to the call and using the awaited result. Totally up to you though.
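
For example, something along these lines should work (assuming immediate is forwarded to useFetch through fetchOptions as in the snippet above, and execute comes back on the returned AsyncData as usual):

// Same ref as in the example above.
const message = ref('foo');

// Nothing fires on setup because immediate is false; execute() triggers
// the request manually, e.g. from a submit handler.
const { data, execute } = await postEcho({
  body: message,
  composable: 'useFetch',
  fetchOptions: {
    immediate: false,
  },
});

await execute();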

Would it be possible to specify a default in settings? For instance, I would probably want to use useFetch in most cases, simply to avoid duplicate calls on server/client, and just be able to use some of the other composables (where a key is almost mandatory)

I thought about it but didn't go there with this first iteration, so we can focus on the underlying mechanics. In theory, I have nothing against specifying a default, though I worry that hiding that detail may not be preferable to manually specifying composable. The reason is that the function name does not convey which API it uses and could be accidentally misused. Of course, you can make the reverse argument too: if you specify the default, you should be familiar with the codebase and generated client enough to know what it's calling under the hood.
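
Purely to illustrate the trade-off (none of this exists today), with a project-wide default of useFetch a call could drop the discriminant entirely:

// Hypothetical: if a default composable were configurable, omitting the
// field would behave like composable: 'useFetch'.
const { data } = await postEcho({
  body: message,
});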

What do you think?

@rigtigeEmil

Ah, I think I've been looking at it wrong; I was expecting the API to look like it does with the fetch client, but this will actually mirror the Nuxt APIs of the different composables, which is so much better!

I think my example was a bit flawed, and it's also irrelevant given what I just wrote above. What I actually meant was that you generally only want to call useFetch (and the other composables) directly in script setup, middleware, etc., and not inside a function that gets invoked whenever data changes. A more accurate example of what you don't want is below: any time username/password changes, the API would be re-invoked:

<template>
	<form @submit="onSubmit">
		<input v-model="username">
		<input v-model="password">
	</form>
</template>

<script lang="ts" setup>
const username = ref('');
const password = ref('');

const body = computed(() => ({
	username: username.value,
	password: password.value,
}));

async function onSubmit() {
	const { data, error } = await useFetch('sign-in', {
		body,
		method: 'POST',
	});
}
</script>

This would be completely fine when using the current fetch client, since there's no reactivity based on Vue's ref/reactive. So today, with the fetch client, similar onSubmit code would look something like this:

async function onSubmit() {
	const data = await postSignIn({ body: body.value });
}

Hopefully I've explained myself a bit better now, but again, I think I just completely misunderstood the direction you're going in, making this completely obsolete.

For the last point, I personally would prefer to have the possibility of supplying an overridable default, mostly since I tend to use useFetch/$fetch in most scenarios, but this is just to remove a tiny bit of duplication. I agree completely with your point on this; I just think staying agnostic here might be beneficial.
