hugsbrugs / php-http
PHP HTTP Utilities
Requires
- php: >=7.4
- guzzlehttp/guzzle: ^7.0
- jeremykendall/php-domain-parser: ^6.1.1
- psr/simple-cache: ^3.0
- symfony/cache: ^6.0
- true/punycode: ~2.0
Requires (Dev)
- phpunit/phpunit: ^9
This package is auto-updated.
Last update: 2024-12-19 22:48:13 UTC
README
This library provides PHP utility functions to manage URLs. Read the PHP DOC
Install
Install package with composer
composer require hugsbrugs/php-http
In your PHP code, load the library
require_once __DIR__ . '/../vendor/autoload.php';
use Hug\Http\Http as Http;
Configuration
In order to use the cache mechanism, define the following constants
define('PDP_PDO_DSN', 'mysql:host=localhost;dbname=database');
define('PDP_PDO_USER', 'username');
define('PDP_PDO_PASS', 'password');
define('PDP_PDO_OPTIONS', [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
Alternatively, define the path to a locally stored public suffix list
define('PUBLIC_SUFFIX_LIST', realpath(__DIR__ . '/../../../cache/public_suffix_list.dat'));
This method should not be used in production since it is really slow.
Otherwise the default cache/public_suffix_list.dat file, which may be out of date, will be used.
Usage
Execute shell nslookup command
Http::nslookup($url);
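As a hypothetical sketch of the idea behind a safe nslookup wrapper (the actual `Http::nslookup()` implementation may differ), the hostname should be escaped before being passed to the shell:

```php
<?php
// Hypothetical sketch: build an nslookup command safely.
// escapeshellarg() prevents shell injection via a crafted hostname.
function build_nslookup_cmd(string $host): string
{
    return 'nslookup ' . escapeshellarg($host);
}

// The command can then be run with shell_exec() to get the raw output:
// $output = shell_exec(build_nslookup_cmd('example.com'));
```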
Check if a URL is accessible (i.e. does not return a 404)
Http::is_url_accessible($url);
Returns the HTTP code for a given URL
Http::get_http_code($url);
Cleans a URL of its query parameters
Http::url_remove_query($url);
Cleans a URL of its query parameters and path
Http::url_remove_query_and_path($url);
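The idea behind these two helpers can be sketched with PHP's built-in parse_url(); this is a hypothetical equivalent, not the library's actual code:

```php
<?php
// Hypothetical sketch: rebuild a URL without its query string (and
// optionally without its path) from the components parse_url() returns.
function strip_query(string $url): string
{
    $p = parse_url($url);
    $out = ($p['scheme'] ?? 'http') . '://' . ($p['host'] ?? '');
    if (isset($p['port'])) {
        $out .= ':' . $p['port'];
    }
    return $out . ($p['path'] ?? '');
}

function strip_query_and_path(string $url): string
{
    $p = parse_url($url);
    return ($p['scheme'] ?? 'http') . '://' . ($p['host'] ?? '');
}
```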
Quick and dirty function to save an image from the internet
Http::grab_image($url, $save_to);
Returns basic HTTP headers for a cURL request
Http::get_default_headers($host);
Extracts the suffix, TLD, domain and subdomain from a URL
Http::extract_all_from_url($url);
Extracts the extension from a URL
Http::extract_extension_from_url($url);
Extracts the scheme (ftp, http, ...) from a URL
Http::extract_scheme_from_url($url);
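For the scheme and extension, the underlying idea can be sketched with PHP's built-in parse_url() and pathinfo(); the library's own logic may differ:

```php
<?php
// Hypothetical sketch: pull the scheme and file extension out of a URL
// using only PHP built-ins.
function scheme_from_url(string $url): ?string
{
    $scheme = parse_url($url, PHP_URL_SCHEME);
    return is_string($scheme) ? strtolower($scheme) : null;
}

function extension_from_url(string $url): ?string
{
    // The query string must be excluded before looking for an extension.
    $path = (string) (parse_url($url, PHP_URL_PATH) ?: '');
    $ext = pathinfo($path, PATHINFO_EXTENSION);
    return $ext !== '' ? strtolower($ext) : null;
}
```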
Extracts the TLD (Top Level Domain) from a URL
Http::extract_tld_from_url($url);
Extracts the subdomain from a URL
Http::extract_subdomain_from_url($url);
Extracts the domain name from a URL
Http::extract_domain_from_url($url);
Separates headers from body in a cURL response
Http::extract_request_headers_body($html_with_headers);
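The mechanism can be sketched as follows: in a raw HTTP response, the first blank line (CRLF CRLF) delimits headers from body. This is a hypothetical equivalent, not the library's actual code:

```php
<?php
// Hypothetical sketch: split a raw response at the first blank line.
function split_headers_body(string $raw): array
{
    $parts = explode("\r\n\r\n", $raw, 2);
    return [
        'headers' => $parts[0],
        'body'    => $parts[1] ?? '',
    ];
}
```

Note that when cURL follows redirects, the response may contain several consecutive header blocks before the body.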
Sets the desired HTTP status code of a PHP script (useful for APIs)
Http::header_status($statusCode);
Gets the address and/or HTTP code that the provided URL redirects to. $return can be: url / code / all
Http::get_redirect_url($url, $timeout = 5, $return = 'url');
Follows and collects all redirects, in order, for the given URL.
Http::get_all_redirects($url);
Gets the address and/or HTTP code that the URL ultimately leads to. $return can be: url / code / all
Http::get_final_url($url, $return = 'url');
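The redirect-following idea can be sketched from the header side: when cURL runs with CURLOPT_FOLLOWLOCATION, the raw headers contain one block per hop, and collecting each Location header reconstructs the chain. The network call itself is left out so this hypothetical sketch stays self-contained:

```php
<?php
// Hypothetical sketch: rebuild the redirect chain from the concatenated
// header blocks of a followed request.
function redirect_chain_from_headers(string $raw_headers): array
{
    $chain = [];
    foreach (preg_split('/\r?\n/', $raw_headers) as $line) {
        if (preg_match('/^Location:\s*(\S+)/i', $line, $m)) {
            $chain[] = $m[1];
        }
    }
    return $chain;
}
```

The last entry of the chain corresponds to the final URL; an empty chain means the original URL did not redirect.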
Checks a TXT record in the domain zone file
Http::check_txt_record($domain, $txt);
Waits and tests every minute whether the domain zone has the correct IP address and TXT record set
Http::wait_for_zone_ok($domain, $ip, $txt_record, $wait_minutes = 15);
Tests if the domain zone has the correct IP address and TXT record set
Http::is_zone_ok($domain, $ip, $txt_record);
Gets the name servers of a given domain
Http::get_name_servers('maugey.fr');
Adds an escaped fragment to a URL
Http::add_escaped_fragment($url);
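This refers to the (now deprecated) Google AJAX-crawling convention of appending an `_escaped_fragment_` query parameter. A hypothetical sketch of the idea, which may differ from the library's exact behavior:

```php
<?php
// Hypothetical sketch: append the _escaped_fragment_ parameter, choosing
// '?' or '&' depending on whether the URL already has a query string.
function add_escaped_fragment_sketch(string $url): string
{
    $sep = (strpos($url, '?') === false) ? '?' : '&';
    return $url . $sep . '_escaped_fragment_=';
}
```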
To enable CORS, put this line at the top of your PHP script
Http::cors();
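For reference, a minimal sketch of what a permissive CORS setup typically sends; `Http::cors()` may emit a different set of headers:

```php
<?php
// Hypothetical sketch of a permissive CORS response configuration.
header('Access-Control-Allow-Origin: *');
header('Access-Control-Allow-Methods: GET, POST, OPTIONS');
header('Access-Control-Allow-Headers: Content-Type, Authorization');

// Preflight OPTIONS requests can be answered immediately.
if (($_SERVER['REQUEST_METHOD'] ?? '') === 'OPTIONS') {
    exit;
}
```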
Converts a URL to a filename. It does not encode URL parameters (only scheme, domain, folders, file)
Http::url_2_filename($url);
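A hypothetical sketch of the idea: drop the scheme, then replace every character that is unsafe in a filename. `Http::url_2_filename()` may use a different mapping:

```php
<?php
// Hypothetical sketch: turn a URL into a filesystem-safe name by
// stripping the scheme and replacing characters outside [A-Za-z0-9._-].
function url_to_filename_sketch(string $url): string
{
    $name = preg_replace('#^[a-z][a-z0-9+.-]*://#i', '', $url); // drop scheme
    return preg_replace('/[^A-Za-z0-9._-]/', '_', $name);
}
```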
Dependencies
https://github.com/jeremykendall/php-domain-parser
https://github.com/jeremykendall/php-domain-parser/tree/5.7.0
https://publicsuffix.org/list/public_suffix_list.dat
Unit Tests
composer exec phpunit
phpunit --configuration phpunit.xml
Author
Hugo Maugey Webmaster | SEO Consultant | Fullstack developer