
Commit 563d008

fix: fixed all documentation
1 parent 3b83491 commit 563d008

3 files changed: +132 −578 lines changed

services/markdownify.mdx

+34 −139
@@ -33,23 +33,18 @@ response = client.markdownify(
 ```
 
 ```javascript JavaScript
-import { Client } from 'scrapegraph-js';
+import { markdownify } from 'scrapegraph-js';
 
-// Initialize the client
-const sgai_client = new Client("your-api-key");
+const apiKey = 'your-api-key';
+const url = 'https://scrapegraphai.com/';
 
 try {
-  const response = await sgai_client.markdownify({
-    websiteUrl: "https://example.com"
-  });
-
-  console.log('Request ID:', response.requestId);
-  console.log('Result:', response.result);
+  const response = await markdownify(apiKey, url);
+  console.log(response);
 } catch (error) {
   console.error(error);
-} finally {
-  sgai_client.close();
 }
+
 ```
 
 ```bash cURL
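
For context on the hunk above: a minimal sketch of how the new function-style call might be used, assuming Node 18+ and that `markdownify(apiKey, url)` resolves to a JSON-serializable response as the added example suggests. The `SGAI_API_KEY` variable and output filename are illustrative, not part of the commit.

```javascript
// Sketch only: call the function-style API shown in the added lines and persist
// the raw response. The env var name and output path are assumptions.
import { writeFile } from 'node:fs/promises';
import { markdownify } from 'scrapegraph-js';

const apiKey = process.env.SGAI_API_KEY; // keep the key out of source control
const url = 'https://scrapegraphai.com/';

try {
  const response = await markdownify(apiKey, url);
  await writeFile('markdownify-response.json', JSON.stringify(response, null, 2));
  console.log('Response saved to markdownify-response.json');
} catch (error) {
  console.error(error);
}
```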
@@ -65,6 +60,13 @@ curl -X 'POST' \
 
 </CodeGroup>
 
+#### Parameters
+
+| Parameter | Type | Required | Description |
+|-----------|------|----------|-------------|
+| apiKey | string | Yes | The ScrapeGraph API Key. |
+| websiteUrl | string | Yes | The URL of the webpage to convert to markdown. |
+
 <Note>
 Get your API key from the [dashboard](https://dashboard.scrapegraphai.com)
 </Note>
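
Since the table added in this hunk marks both `apiKey` and `websiteUrl` as required, a small guard before the call can fail fast on missing or malformed input. A sketch under that assumption; the wrapper name and checks are illustrative and not part of the library:

```javascript
// Sketch only: validate the two required parameters from the table above before
// calling markdownify. Everything except the markdownify import is illustrative.
import { markdownify } from 'scrapegraph-js';

async function markdownifyChecked(apiKey, websiteUrl) {
  if (typeof apiKey !== 'string' || apiKey.trim() === '') {
    throw new Error('apiKey is required and must be a non-empty string');
  }
  // new URL() throws on malformed input, covering the websiteUrl requirement.
  const parsed = new URL(websiteUrl);
  return markdownify(apiKey, parsed.href);
}

markdownifyChecked(process.env.SGAI_API_KEY, 'https://scrapegraphai.com/')
  .then((response) => console.log(response))
  .catch((error) => console.error(error));
```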
@@ -128,152 +130,45 @@ The response includes:
 Want to learn more about our AI-powered scraping technology? Visit our [main website](https://scrapegraphai.com) to discover how we're revolutionizing web data extraction.
 </Note>
 
-## Advanced Usage
-
-### Request Helper Function
-
-The `getMarkdownifyRequest` function helps create properly formatted request objects for the Markdownify service:
-
-```javascript
-import { getMarkdownifyRequest } from 'scrapegraph-js';
+## Other Functionality
 
-const request = getMarkdownifyRequest({
-  websiteUrl: "https://example.com/article",
-  headers: {
-    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
-  }
-});
-```
+### Retrieve a previous request
 
-#### Parameters
+If you know the response id of a previous request you made, you can retrieve all the information.
 
-| Parameter | Type | Required | Description |
-|-----------|------|----------|-------------|
-| websiteUrl | string | Yes | The URL of the webpage to convert to markdown. |
-| headers | object | No | Custom headers for the request (e.g., User-Agent, cookies). |
-
-#### Return Value
-
-Returns an object with the following structure:
+<CodeGroup>
 
-```typescript
-{
-  request_id: string;
-  status: "queued" | "processing" | "completed" | "failed";
-  website_url: string;
-  result?: string | null;
-  error: string;
-}
+```python Python
+// TODO
 ```
 
-#### Error Handling
+```javascript JavaScript
+import { getMarkdownifyRequest } from 'scrapegraph-js';
 
-The function includes built-in error handling for common scenarios:
+const apiKey = 'your_api_key';
+const requestId = 'ID_of_previous_request';
 
-```javascript
 try {
-  const request = getMarkdownifyRequest({
-    websiteUrl: "https://example.com/article"
-  });
+  const requestInfo = await getMarkdownifyRequest(apiKey, requestId);
+  console.log(requestInfo);
 } catch (error) {
-  if (error.code === 'INVALID_URL') {
-    console.error('The provided URL is not valid');
-  } else if (error.code === 'MISSING_REQUIRED') {
-    console.error('Required parameters are missing');
-  } else {
-    console.error('An unexpected error occurred:', error);
-  }
+  console.error(error);
 }
 ```
 
-#### Advanced Examples
-
-##### Using Custom Headers
-
-```javascript
-const request = getMarkdownifyRequest({
-  websiteUrl: "https://example.com/article",
-  headers: {
-    "User-Agent": "Custom User Agent",
-    "Accept-Language": "en-US,en;q=0.9",
-    "Cookie": "session=abc123; user=john",
-    "Authorization": "Bearer your-auth-token"
-  }
-});
-```
-
-##### Handling Dynamic Content
-
-For websites with dynamic content, you might need to adjust the request:
-
-```javascript
-const request = getMarkdownifyRequest({
-  websiteUrl: "https://example.com/dynamic-content",
-  headers: {
-    // Headers to handle dynamic content
-    "X-Requested-With": "XMLHttpRequest",
-    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9",
-    // Add any required session cookies
-    "Cookie": "dynamicContent=enabled; sessionId=xyz789"
-  }
-});
-```
-
-#### Best Practices
-
-1. **URL Validation**
-   - Always validate URLs before making requests
-   - Ensure URLs use HTTPS when possible
-   - Handle URL encoding properly
-
-```javascript
-import { isValidUrl } from 'scrapegraph-js/utils';
-
-const url = "https://example.com/article with spaces";
-const encodedUrl = encodeURI(url);
-
-if (isValidUrl(encodedUrl)) {
-  const request = getMarkdownifyRequest({ websiteUrl: encodedUrl });
-}
+```bash cURL
+// TODO
 ```
 
-2. **Header Management**
-   - Use appropriate User-Agent strings
-   - Include necessary cookies for authenticated content
-   - Set proper Accept headers
-
-3. **Error Recovery**
-   - Implement retry logic for transient failures
-   - Cache successful responses when appropriate
-   - Log errors for debugging
-
-```javascript
-import { getMarkdownifyRequest, retry } from 'scrapegraph-js';
-
-const makeRequest = retry(async () => {
-  const request = await getMarkdownifyRequest({
-    websiteUrl: "https://example.com/article"
-  });
-  return request;
-}, {
-  retries: 3,
-  backoff: true
-});
-```
+</CodeGroup>
 
-4. **Performance Optimization**
-   - Batch requests when possible
-   - Use caching strategies
-   - Monitor API usage
+#### Parameters
 
-```javascript
-import { cache } from 'scrapegraph-js/utils';
+| Parameter | Type | Required | Description |
+|-----------|------|----------|-------------|
+| apiKey | string | Yes | The ScrapeGraph API Key. |
+| requestId | string | Yes | The request ID associated with the output of a previous searchScraper request. |
 
-const cachedRequest = cache(getMarkdownifyRequest, {
-  ttl: 3600, // Cache for 1 hour
-  maxSize: 100 // Cache up to 100 requests
-});
-```
 
 ### Async Support
 
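
The "Retrieve a previous request" section added in this hunk pairs naturally with the first example: submit a page, keep the request id, and fetch the stored result later with `getMarkdownifyRequest(apiKey, requestId)`. A sketch of that flow; reading the id from `response.requestId` is an assumption taken from the docs removed in this same commit, not from the new text.

```javascript
// Sketch only: submit a conversion, then retrieve it again later by id.
// `markdownify` and `getMarkdownifyRequest` are the exports used in this commit;
// the `requestId` field name on the submit response is an assumption.
import { markdownify, getMarkdownifyRequest } from 'scrapegraph-js';

const apiKey = process.env.SGAI_API_KEY;

const submitted = await markdownify(apiKey, 'https://scrapegraphai.com/');
console.log('Submitted:', submitted);

// Later (or in another process), look the request up again by its id.
const requestId = submitted.requestId; // assumed field name
const previous = await getMarkdownifyRequest(apiKey, requestId);
console.log('Retrieved:', previous);
```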