Top 10 Python Functions to Automate the Steps in Data Science

Blog by Author: Swetank

- I am Centre Head at Centre For Sports Science (In association with the Department of Youth Empowerment And Sports, Government of Karnataka).

- Talks about Data Science, Machine Learning, Sports Science, Sports Analytics, Marketing Analytics, Tableau, and Python.

- I strongly believe in Sir Isaac Newton's words “What we know is a drop, what we don’t know is an ocean.”

1. Introduction

Whenever a new project lands on a data science team, the team often ends up repeating the same tedious work: writing the same code blocks over and over, a vicious cycle of monotonous tasks. Wouldn't it be better to automate these tasks and invest the time in more useful ones, such as:

  1. Hyperparameter Tuning
  2. Feature Engineering
  3. Model Selection
  4. Business Insights

In this article we will discuss Python functions that can rescue us from this repetitive work. We are going to dig into the Top 10 Python Functions to Automate the Steps in Data Science, so let's go without further ado. To automate the data science process, we will start with the basics of Python functions and machine learning and then move on to the actual automation.

2. Function in Python

A function is a reusable block of code that performs a specific task. It helps the data scientist or programmer decompose a problem into small modules. Python functions thus promote code reuse: write the logic once and call it whenever it is needed. Python also ships with a large number of built-in functions for common tasks.

Syntax: Python Function

def name_your_function(parameters):
    """Function docstring."""
    # function body: one or more statements
    return expression  # optional; omit to return None

Parameters are positional by default, so arguments must be passed in the same order in which the parameters were defined.

Making a Function Call

A function is defined by:

  1. Giving it a name,
  2. Specifying the parameters it accepts, and
  3. Structuring the code block that forms its body.

Once a function’s fundamental structure is complete, you can call it from another function or directly from the Python prompt to run it.
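For illustration, here is a tiny example of defining a function and then calling it (the function name and arguments are made up for this example):

def greet_user(name, greeting="Hello"):
    """Return a greeting for the given user name."""
    return f"{greeting}, {name}!"

# Call the function from a script or directly from the Python prompt:
message = greet_user("Swetank")
print(message)  # -> Hello, Swetank!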

3. Automation

The process of automating the tasks of applying machine learning to real-world problems is known as automated machine learning. It covers the entire pipeline, from the raw dataset to the deployable machine learning model, and is offered as an AI-based answer to the ever-growing demand for machine learning applications. Its high level of automation enables non-experts to employ machine learning models and techniques without having to become machine learning professionals.

Automating the entire machine learning process has the added benefits of:

  1. Providing simpler solutions, 
  2. Faster generation of those solutions, and models that frequently outperform hand-designed models. 

Automated machine learning has also been used, for example, to compare the relative importance of each factor in a prediction model.

4. Let’s compare to the standard approach

Data Scientists in a typical machine learning application have a collection of input data points to train with. The raw data may not be in a format that can be used by all algorithms. Data Scientists need to use the following to make the data suitable for machine learning:

  1. Proper data pre-processing, 
  2. Feature engineering – feature extraction and feature selection.

Following these steps, they must choose an algorithm and optimize its hyperparameters to improve the model's prediction performance. Each of these phases can be challenging, which makes machine learning hard to implement.

Automated Machine Learning substantially simplifies these tasks!

5. How to automate boring stuff and save time? 

There is a saying that working smart is more impactful than working hard. Suppose you are a data scientist at company XYZ and have been assigned to develop a product with certain objectives. Whenever you design the architecture of a data science project, you have to perform tasks such as importing data, data preprocessing, building machine learning models, measuring model performance, deployment, and much more.

In a typical end-to-end project there are many possible setups; two common ones are:

  1. You will be having a compressed data file.
  2. The data will be stored in a relational database having multiple tables/ documents/ files.

In both cases you have to gather the data, perform data preprocessing, and build the model, which means writing the same sorts of code over and over.

In this blog we will be discussing the following in context to function:

  1. Focusing on Data and Training
  2. Performing EDA (Exploratory Data Analysis)
  3. Building Machine Learning Models
  4. Prediction
  5. Model Deployment

5.1 Focusing on Data and Training

If you are assigned a task to build a model, the first choice that comes to mind is how the model should be trained:

(Image by author)

Let's look at both approaches, batch learning and online learning, and why the distinction is significant:

        1. Batch Learning 

  • In batch learning, the system cannot learn incrementally; it must be trained on all the available data. Because this takes a long time and a lot of computing power, it is usually done offline.
  • The system is first trained, and then it is put into production, where it operates without having to learn anything new; it simply applies what it has learned. This is referred to as “offline learning.” 
  • If you want a batch learning system to learn about new data (such as a new type of spam), you must train a new version of the system from scratch on the entire dataset (not just the new data), then retire the old one and replace it with the new one.
  • Moreover, the entire process of training, testing, and releasing a Machine Learning system can be automated, allowing even batch learning systems to adapt to changing conditions. Maintain current data and retrain a new version of the system when needed. 
  • Furthermore, training on the entire collection of data necessitates a significant amount of computer resources (CPU, memory, disc space, disc I/O, network I/O, and so on). It will cost you a lot of money if you have a lot of data and you automate your system to train from scratch every day. It may even be impossible to employ a batch learning method if the amount of data is enormous.

        2. Online Learning

  • In online learning, the system is trained incrementally by feeding it data instances sequentially, either individually or in small groups known as mini-batches. Each learning step is quick and inexpensive, so the system can learn from new data as it arrives. Online learning is ideal for systems that receive data in a continuous stream and must adapt to change quickly or autonomously.
  • Online learning methods can also be used to train systems on massive datasets that are too large to fit in the main memory of a single machine (this is called out-of-core learning). The algorithm loads a portion of the data, performs a training step on that data, and then continues the process until all of the data has been processed.

As mentioned above, a batch learning system has to be trained on the data that is already available.


Since we are working on data that is already available, no special automation is required here and training is a one-time process. However, if you want to train the model progressively, for example on stock prices, you need to feed in new data at each instance; that is the online learning setup mentioned above.
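As an illustration, here is a minimal sketch of both setups, assuming scikit-learn and a pandas DataFrame with a 'target' column; the model choices are only examples, not the original code.

import pandas as pd
from sklearn.linear_model import LogisticRegression, SGDClassifier

def train_batch(df, target_col="target"):
    """Batch learning: retrain from scratch on all available data."""
    X, y = df.drop(columns=[target_col]), df[target_col]
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y)
    return model

def train_online(model, new_df, target_col="target", classes=None):
    """Online learning: update an existing model with a new mini-batch of data."""
    X, y = new_df.drop(columns=[target_col]), new_df[target_col]
    if model is None:
        model = SGDClassifier(loss="log_loss")
        model.partial_fit(X, y, classes=classes)  # classes must be given on the first call
    else:
        model.partial_fit(X, y)
    return model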

5.2 Performing EDA (Exploratory Data Analysis)

(Image by Author)

Exploratory data analysis, or EDA for short, is a term coined by John W. Tukey for describing the act of looking at data to see what it seems to say. It assists data scientists in determining how to best manipulate data sources to obtain the answers they require, making it easier for them to find patterns, test hypotheses, and verify assumptions.

EDA is primarily used to examine what the data can reveal outside of formal modeling or hypothesis-testing tasks, and to gain a better understanding of dataset variables and their interactions. It can also help you determine whether the statistical techniques you are considering for data analysis are suitable.

Why EDA?

The primary goal of EDA is to assist in the analysis of data before making any assumptions. It can aid in the detection of obvious errors, a better understanding of patterns in the data, the detection of outliers or unusual events, and the discovery of interesting relationships between variables. Data scientists can use exploratory analysis to ensure the results they produce are accurate and relevant to the targeted business outcomes and goals. EDA also helps stakeholders by ensuring that they are asking the right questions. Questions about standard deviations, categorical variables, and confidence intervals can all be answered with EDA. Once EDA is complete and insights have been extracted, its findings can feed into more advanced data analysis or modelling, including machine learning.

Types

There are 4 primary types of EDA:

1. Univariate

This is the simplest method of data analysis where the data being investigated only has one variable. Because it is a single variable, it does not deal with causes or relationships. The basic purpose of the univariate analysis is to describe and detect patterns in the data.

2. Univariate with graphs

Univariate analysis with graphs visualises a single variable to surface additional information and patterns, so graphical methods are required. Common types of univariate graphics include:

  1. Stem-and-leaf plots
  2. Histograms
  3. Box plots

3. Multivariate

Multivariate data arises from more than one variable. Multivariate EDA techniques generally show the relationship between two or more variables of the data through cross-tabulation or statistics.

4. Multivariate with graphs

Graphics are used to demonstrate relationships between variables. Other common types of multivariate graphics include:

  1. Scatter plot
  2. Multivariate chart
  3. Run chart
  4. Bubble chart
  5. Heat map

To do so we can create a Python function. In Python, pandas has built-in methods for inspecting and understanding a dataset's attributes, but writing and running each line of code by hand quickly becomes clumsy; a helper function makes the work easier.
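A minimal sketch of such a helper, assuming only pandas (the function name is illustrative):

import pandas as pd

def quick_eda(df: pd.DataFrame):
    """Print a basic overview of a DataFrame in one call."""
    print("Shape:", df.shape)
    print("\nColumn types:\n", df.dtypes)
    print("\nMissing values per column:\n", df.isnull().sum())
    print("\nDuplicate rows:", df.duplicated().sum())
    print("\nSummary statistics:\n", df.describe(include="all").T)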

However, pandas_profiling and sweetviz are two libraries that perform exploratory data analysis with only a few lines of code. Once EDA is done, data preprocessing and cleaning begins, which will include, but is not limited to:

  • Conversion of Categorical Data into Numerical ones
  • Replacing missing values or unwanted values
  • Removing Outlier
  • Scaling or Normalization of data

NOTE: Business Context should be taken into consideration while performing the above measures.

5.3 Building Machine Learning Models

Machine learning (ML) is a form of artificial intelligence (AI) that allows applications to improve their predictive performance over time without being explicitly programmed to do so. Machine learning algorithms use historical data as input in order to forecast new output values.

(Photo by Kevin Ku on Unsplash)

Building a machine learning model is a strenuous exercise if you start from scratch every time; with a reusable function you can simply pass in whichever model you want to work with.

Why?

Every time you build a model you would otherwise write the same few lines of code. Those extra seconds add up, and they could instead be spent improving model performance in other ways.

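For illustration, here is a minimal sketch of such a model-comparison function, assuming scikit-learn estimators and an existing train/test split; the list of models is only an example.

from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

def evaluate_models(models, X_train, X_test, y_train, y_test):
    """Fit each model and return its accuracy on the test set."""
    scores = {}
    for name, model in models.items():
        model.fit(X_train, y_train)
        scores[name] = accuracy_score(y_test, model.predict(X_test))
    return scores

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(),
    "random_forest": RandomForestClassifier(),
}
# accuracies = evaluate_models(models, X_train, X_test, y_train, y_test)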

However, in the above model function I am just returning the accuracy of each model; we can also return other performance metrics, for example in the case of a classification model.

In that scenario, since it is a classification model, precision, recall and F1-score can be returned in the output instead.


5.4 Prediction

(Photo by Bud Helisson on Unsplash)

When estimating the likelihood of a given result, such as whether or not a customer would churn in 30 days, “prediction” refers to the output of an algorithm after it has been trained on a previous dataset and applied to new data. For each record in the new data, the algorithm will generate probable values for an unknown variable, allowing the model builder to determine what that value will most likely be.

The term “prediction” can be deceptive. In some circumstances, such as when utilizing machine learning to pick the next best move in a marketing campaign, it means you’re forecasting a future outcome. Other times, the “prediction” concerns, for example, whether or not a previously completed transaction was fraudulent. In that situation, the transaction has already occurred, but you’re attempting to determine whether it was legitimate, allowing you to take necessary action.

Why are Predictions Important?

Machine learning model predictions allow organizations to generate very accurate guesses about the likely outcomes of a query based on historical data, which might be about anything from customer attrition to possible fraud. These supply the company with information that has a measurable business value. 

For example, if modelling predicts that a client is likely to churn, the company can reach out to them with tailored messaging and outreach to prevent the customer from leaving.

To do so we can create a Python function.
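For example, a minimal sketch assuming a fitted scikit-learn classifier that supports predict_proba (the 0.5 threshold is an illustrative default):

def predict_with_probability(model, X_new, threshold=0.5):
    """Return predicted labels and the probability of the positive class."""
    proba = model.predict_proba(X_new)[:, 1]   # probability of class 1
    labels = (proba >= threshold).astype(int)  # apply the decision threshold
    return labels, proba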

6. Model Deployment

It’s quite difficult for laymen to understand or infer the machine learning code and output. In this section, we will be discussing the function that will enable the deployment of the machine learning model.

Deployment is the process of integrating a machine learning model into an existing production environment so it can drive data-driven business decisions. To do this, we save the trained model as your_file.pickle using the pickle (or joblib) library and save the column names as a JSON file.
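A minimal sketch of that saving step, assuming a fitted model object and the training features X_train from earlier (the file names are placeholders):

import json
import pickle

def save_artifacts(model, feature_columns, model_path="your_file.pickle", columns_path="columns.json"):
    """Persist the trained model with pickle and its feature column names as JSON."""
    with open(model_path, "wb") as f:
        pickle.dump(model, f)
    with open(columns_path, "w") as f:
        json.dump({"data_columns": list(feature_columns)}, f)

# Example: save_artifacts(model, X_train.columns)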

From there, we create two Python files: util.py (a set of simple Python functions and classes that shorten and simplify common tasks) and app.py (the application's code, where you create the app and its views), and later you build a frontend using HTML, CSS, and JS.

However, if you want to pass categorical data, you can do that using a dictionary.
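One hypothetical way to do this at prediction time is to map the category to its one-hot encoded column using the saved column list; the names and structure below are assumptions for illustration, not the original implementation.

import numpy as np

def build_feature_vector(numeric_inputs, category_value, data_columns):
    """numeric_inputs: dict of feature name -> value; category_value: e.g. a location name."""
    x = np.zeros(len(data_columns))
    for name, value in numeric_inputs.items():
        x[data_columns.index(name)] = value
    if category_value in data_columns:               # set the matching one-hot column to 1
        x[data_columns.index(category_value)] = 1
    return x.reshape(1, -1)

# Example with a hypothetical column list loaded from columns.json:
# data_columns = ["area", "perimeter", "location_a", "location_b"]
# x = build_feature_vector({"area": 1200, "perimeter": 140}, "location_a", data_columns)
# prediction = model.predict(x)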

7. Top 10 Python Functions to Automate the Steps in Data Science

Key Takeaways are

  • Basics of Reusable Python Functions
  • Automation Significance
  • Batch vs Online Learning Understanding
  • Exploratory Data Analysis Understanding

8. Benefits of employing Python functions

  • Reducing code duplication
  • Taking big problems and breaking them down into smaller chunks
  • Increasing the code’s clarity
  • Reusability of code
  • Readability of code
  • Hiding implementation details (information hiding)

9. Conclusion

With the continuous inflow of data across every sector, whether finance, sports, marketing, supply chain or social media, the workload of data scientists has been increasing drastically, and the challenges grow even more if you are the only data scientist in your firm. The development of machine learning products has become more complex than ever due to the growing volume and variety of data. Working with data is tedious, and writing the same sorts of code again and again can hurt quality.

 With automation, the following benefits can be achieved:

  • Time Management: You can decide where to invest your time and how much; with the code above you only have to pass in the models, and the other tasks can be automated.
  • Trade-off: There is always a trade-off over how much time to invest in Exploratory Data Analysis, since it is a never-ending task when the data is at a larger scale; with automation you can simply pass the features or call the function to do it.
  • Business Insights: Time is money; with more minutes in your bucket you can drill deeper into the data to extract richer business narratives and insights.

(Image by Author)

Using the above code snippets, you can develop a package for your team that all of its members can use, helping everyone automate their work.

If you liked this Blog, leave your thoughts and feedback in the comments section, See you again in the next interesting read! 

Happy Learning!

Until Next Time, Take care!

– By Swetank 

The most viewed blog on our website is "8 Most Popular Types of Activation Functions in Neural Networks".

Logistic Regression: Best Walkthrough With Python is Here Now
Blog by Author: Sayantan

- I am a Machine Learning Engineer whose major interest lies in Computer Vision.

- Skills include Python, Machine Learning and Computer vision, Pytorch, Django, and Flask.

- Works for a company called Averyx Group, based in Dubai, as an AI/ML consultant.

- At the same time, I do some freelance projects and write blogs as well.

1. Logistic Regression – Introduction

We are going to start with what logistic regression is.

  • Logistic Regression is used in statistics to estimate the probability of an event occurring given some previous data; in other words, it predicts a value based on prior observations of a data set.
  • Logistic Regression works with binary data, where either the event happens (1) or the event does not happen (0).
  • It is named 'Logistic Regression' because its underlying technique is quite similar to Linear Regression. It is a simple machine learning algorithm that is most commonly used for classification problems in industry.

Note: The main aim of this blog is to get you coding and making predictions while also understanding the theory behind it, so you don't miss out on either the theory or the practical coding.

From here, the first question that comes to mind is: what's wrong with linear regression for a classification problem, right?

Well, there are a couple of problems and let’s discuss them briefly:

1.1 A slightly far off input disturbs the entire algorithm

I will explain this with a hypothetical scenario. Imagine a problem where we are to predict whether a tumour is benign or malignant. In that case if we plot a best fit line using linear regression then it will look somewhat like this.

(Image by Author)

Clearly, the best-fit line is not a good threshold for predicting whether a tumour is benign or malignant. This is a simple way to understand why linear regression is not suitable for classification tasks, where the target values are only zero and one.

1.2 The predicted value of y is much larger than 1 or much lower than 0

Now we have established why linear regression fails for classification problems, so let’s get straight into logistic regression. Before we do anything, we have to first understand how to interpret the output from the logistic regression model.

This logistic regression model will predict an output in the range of 0 to 1, i.e. 0 < y_pred < 1.

Here, y_pred is the estimated probability of getting y = 1 if the input is x.

For example:

Suppose the value of y_pred is 0.9; the interpretation is that the probability of y being 1 is 90%.

Mathematically, h_θ(x) = P(y = 1 | x; θ), and h_θ(x) = y_pred

Symbols used: _ (underscore) means subscript and ^ (power cap) means superscript

2. Computing the Predicted Output

Now with all that being said, let’s compute the predicted output. We will start this quest by taking a look at the logistic regression equation.

z = w1*x1 + w2*x2 + b

Here x1 and x2 are the input features. For example, let's say we are predicting whether a tumour is malignant or benign; then x1 would be the area of the tumour and x2 the perimeter of the tumour. There can be many more input features as well, but for now we will continue with these two.

Next come w1 and w2.

These are the weights associated with the logistic regression model. The input value is fixed for a particular datapoint, so the only way to change the value of the output is by tweaking the weights and the biases ( bias is the term b here, but we will discuss that later ) only. 

For example,

If our model predicts the output probability to be 0.10, suggesting that the tumour is likely benign, but in reality it is malignant, then we can change the values of the weights and biases to push the output probability above 0.5.

Now, what should be the initial value of the weights? In theory we can initialise the weights to any value from negative infinity to positive infinity, but in practice we have to keep a few things in mind. What are these few things? We will discuss that a little later. For now, keep in mind that we should not initialize the weights very high or very low; a range like [-0.5, 0.5], [-1, 1] or (0, 1] works.

Now we will discuss the bias, the term b in the equation. Bias is added so that the decision boundary does not have to pass through the origin every time. It is much like the y-intercept in the equation of a straight line: if the y-intercept is absent or zero, the line always passes through the origin. The bias is generally initialized to zero, but its value changes to whatever number gives the best output after gradient descent (we will discuss gradient descent in depth in this article as well).

Now we jump to the next step of computing the predicted output. As you can imagine, values like x and w are arbitrary at this point, so the value of z will not necessarily lie in the range [0,1] or [-1,1] (taking -1 to represent the opposite case). So we pass this value of z to a function known as an activation function, which squashes z into the range [0,1] or [-1,1], and that squashed value we can accept as our predicted output. I will discuss some activation functions, and that will make things clearer.

(See also: "8 Most Popular Types of Activation Functions in Neural Networks".)

2.1 Sigmoid Activation Function

The sigmoid activation function looks somewhat like this : φ(z) = 1/(1+e^(-z))

If you compute the range of the function you will find it to be (0,1), and for z = 0 the function returns 0.5. The behaviour of the function is clearer from this graph.

(Graph of the sigmoid function; image courtesy)

The graph shows that if the value of z is very low the output is close to zero, and if it is very high the output is close to one. From here we also get an idea that if the weights are very low then z will be very low and the output will always be very close to 0, and if the weights are very high then z will be very high and the output will always be close to one. This is one of the reasons why the weights should not be initialized at extreme values.

2.2 Tanh Activation function

The tanh activation function equation looks something like this: σ(z) = (e^z - e^(-z)) / (e^z + e^(-z))

(Graph of the tanh function; image courtesy)

It is evident from the graph that the range of the tanh function is [-1, 1] and that it has a value of 0 at z = 0. The sigmoid and tanh curves are also quite similar in shape. A question then arises: if we have sigmoid, what is the need for tanh? With tanh the output is centred around zero, so negative inputs are mapped strongly negative and positive inputs are mapped strongly positive.

5 most used Activation functions are

  1. Sigmoid activation function: φ(z) = 1/(1 + e^(-z))
  2. Tanh activation function: σ(z) = (e^z - e^(-z)) / (e^z + e^(-z))
  3. ReLU activation function: y = max(0, x)
  4. Leaky ReLU activation function: y = max(0.01*x, x)
  5. Softmax activation function: softmax(x_i) = e^(x_i) / ∑_j e^(x_j)

Finally we have summarized the entire process through this image. Take a look at this for getting the complete picture.

(Image by Author: summary of the full process of computing the predicted output)
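In code, the summarized process is just a weighted sum followed by the sigmoid. A minimal NumPy sketch (the numbers are only illustrative):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_probability(x, w, b):
    """x and w are 1-D arrays of the same length; b is the bias term."""
    z = np.dot(w, x) + b        # z = w1*x1 + w2*x2 + ... + b
    return sigmoid(z)           # squash z into the range (0, 1)

x = np.array([0.7, 0.2])        # two input features, e.g. scaled area and perimeter
w = np.array([0.5, -0.3])       # small initial weights
b = 0.0
print(predict_probability(x, w, b))   # a probability between 0 and 1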

3. Understanding Loss and Cost Function

We now have the predicted output in hand, and also the original output from the data. But we realise that our model has not done a very good job and has got most of the predictions wrong. After thinking a little, we realise this is to be expected, since the weights and biases had absolutely no correlation to the original output. So we now want to convey to the model that it has performed very poorly, and for that we need something to quantify the model's mistakes. This is where the loss function comes into the picture. There are many loss functions out there, but the two most commonly used are the Mean Squared Error loss function, which you would probably know if you have studied Linear Regression, and the Cross Entropy loss function, which is used for Logistic Regression and which we will discuss here.

I will start explaining cross entropy loss function directly from the equation:

Loss = -[ y * log(y_pred) + (1 - y) * log(1 - y_pred) ]

Here, y = original output and y_pred = predicted output.

We know that original output or y can have only two values either one or zero, for example the tumour can be either benign or malignant. 

If we plug y = 0 into the above equation, it reduces to: Loss = -log(1 - y_pred), whose graph looks somewhat like this:

(Image by Author: graph of -log(1 - y_pred))

Therefore, if the model predicts 1 while the true label is 0, then according to the loss function the error is infinite; as the prediction moves from 1 towards 0 the error gradually decreases, and finally, if the model predicts 0, the error is zero since the true label is zero.

And if we plug in y = 1, the equation reduces to -log(y_pred), whose graph looks somewhat like this:

(Image by Author: graph of -log(y_pred))

Analyzing this graph, we find that when the predicted value is 0 and the true label is 1, the error is again infinite because the prediction is completely wrong; as the prediction moves from 0 towards 1 the error gradually decreases and finally becomes 0 when the predicted value is 1, which is also the true label. Mind you, the predicted output is a probability and will be a value between 0 and 1, not exactly 0 or 1.

With this we have an idea of how the loss function works and how it tells the model about its performance. But the discussion above was about one data point; now we have to extend it to all the data points and compute the overall loss to get a better picture of the model. For this we introduce the cost function: we take the sum of the losses over all the data points and divide it by the total number of data points. The equation of the cost function makes this clearer.

J = -(1/m) · Σ_(i=1)^m [ y_i·log(ŷ_i) + (1 - y_i)·log(1 - ŷ_i) ], where m is the total number of data points.

In other words, we take the average of the losses computed over all the data points. I hope this made sense; if not, I suggest reading it again or looking at other articles and lectures for clarity. This concludes our discussion of the loss and cost functions, and we can move on to the next step.
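
As a quick illustration, here is a small NumPy sketch of the per-example cross-entropy loss and the averaged cost described above; the label and prediction arrays are made up purely for demonstration.

```python
import numpy as np

def cross_entropy_loss(y, y_hat, eps=1e-12):
    # per-example loss: -[y*log(y_hat) + (1-y)*log(1-y_hat)]
    y_hat = np.clip(y_hat, eps, 1 - eps)  # avoid log(0)
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

def cost(y, y_hat):
    # cost = average of the per-example losses over all data points
    return np.mean(cross_entropy_loss(y, y_hat))

y     = np.array([1, 0, 1, 1])            # true labels (toy example)
y_hat = np.array([0.9, 0.2, 0.6, 0.99])   # predicted probabilities (toy example)
print(cost(y, y_hat))
```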

Some of my suggested lectures and blogs are:

  1. Youtube Video on logistic regression – Here.
  2. Youtube Video on cost function – Here. 
  3. Coursera course – Here.

4. Optimising Weights Using Gradient Descent

After computing the loss and the cost function, we now have to reduce the value of the cost function (unless we happened to get a very low cost on the first attempt, which is almost impossible). So now is the time to change the weights so that the predicted output gets close to the original (true) output, which means the value of the cost function is reduced, or minimised as far as possible. For this we will use the gradient descent algorithm. The aim of gradient descent is to optimise the values of all the weights, which in this case are w1 and w2, as well as the biases.
The gradient descent algorithm looks somewhat like this:

w1 := w1 - α · ∂J(y, w1)/∂w1

Here, w1 = one of the weights in the model
:= means "update to"
α = learning rate
J = cost function

Until now we treated the cost function as a function of y and y_pred, but here I have written it as a function of y and w1. This is because y_pred is itself a function of w1, w2 and b; for this step we hold w2 and b constant and take the partial derivative with respect to w1, which is why the cost is written as a function of w1.

If we plot the cost function against w1 (one of the weights of the model), we find that it is a parabolic curve, just like the one shown below; I am skipping the mathematics of this in the article. Recall that taking a derivative at a point amounts to drawing the tangent to the curve at that point. With that said, let's break down the gradient descent equation. w1 is the initial weight, the one we initialised. Let w* be the point at which the curve reaches its minimum, and assume w1 is a point on the curve not equal to w*, so that J(y, w1) > J(y, w*). Refer to the graph below if you have trouble visualising this.

Logistic - 8


Image courtesy

Now, if we take the derivative at w1 (we need the derivative of the cost function because it appears in the gradient descent equation), its value will be positive. Why? Because the derivative equals tan θ, where θ is the angle the tangent makes with the x-axis; at this point the angle is acute, so tan θ is positive. α here is the learning rate, which is positive and generally less than 1; I will discuss its significance a little later.

Now, the equation looks something like this: w1 := w1 - (positive number)·(positive number),

that is, the value of w1 decreases slightly and is updated. This process continues until the minimum is reached, where the derivative is zero and w1 stops changing. If the value of α is large, w1 might jump below w*, because we are subtracting a big number from it. So if we keep α too high, the update might overshoot the minimum, the minimum may never be reached, and we may even land at a point where the cost is higher than before.

Conversely, if we keep α very small, the number subtracted from w1 will be very small and we will take tiny steps towards the minimum. The steps get even smaller as we approach the minimum, because the gradient there is also very small, so training becomes a very long process. To tackle these problems, several modifications of the basic gradient descent algorithm exist; I will not discuss them in this article, but feel free to research them. In practice we generally fix a number of iterations, such as 500 or 2000, after which we stop gradient descent.

This entire process is repeated for all the weights and biases until we get the minima for all of them.
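
To make the update rule concrete, here is a small NumPy sketch of gradient descent for logistic regression with two weights and a bias, following the w := w - α·∂J/∂w rule described above; the toy data and the learning rate of 0.1 are just illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: 4 examples, 2 features each, binary labels
X = np.array([[0.5, 1.2], [1.0, 0.3], [1.5, 2.0], [0.2, 0.1]])
y = np.array([1, 0, 1, 0])

w = np.zeros(2)   # w1, w2 initialised to zero
b = 0.0
alpha = 0.1       # learning rate
m = len(y)

for _ in range(2000):                 # fixed number of iterations
    y_hat = sigmoid(X @ w + b)        # predicted probabilities
    dw = (X.T @ (y_hat - y)) / m      # dJ/dw for cross-entropy cost with sigmoid
    db = np.mean(y_hat - y)           # dJ/db
    w -= alpha * dw                   # w := w - alpha * gradient
    b -= alpha * db

print(w, b, sigmoid(X @ w + b))
```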

I hope I was able to show you the beauty behind logistic regression. It might be a little difficult to grasp the entire thing in one read if you have never studied any of these things before. But hopefully, if you persist and read it a few more times, it will get clearer.

5. Coding Logistic Regression from Scratch

We have understood the theory of logistic regression; now we want to use it in practice.

First, let's build a logistic regression model completely from scratch to make predictions on synthetic data. Then we will use the scikit-learn library to build a logistic regression model, make predictions on the same data, and finally compare the accuracy of the two.

In the code below you can also visualize how the value of the cost function changed as the weights and biases were updated. I have also included a decision boundary to give you an intuition about how the predictions are made.
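
The full code is linked just below; as a stand-in, here is a minimal sketch of what such a from-scratch model can look like, using the same gradient descent update derived earlier. The class and parameter names here are my own choices for illustration, not taken from the linked notebook.

```python
import numpy as np

class LogisticRegressionScratch:
    def __init__(self, lr=0.1, n_iters=2000):
        self.lr, self.n_iters = lr, n_iters

    def fit(self, X, y):
        m, n = X.shape
        self.w, self.b = np.zeros(n), 0.0
        self.costs = []                      # track cost per iteration for plotting
        for _ in range(self.n_iters):
            y_hat = self._predict_proba(X)
            cost = -np.mean(y * np.log(y_hat + 1e-12)
                            + (1 - y) * np.log(1 - y_hat + 1e-12))
            self.costs.append(cost)
            self.w -= self.lr * (X.T @ (y_hat - y)) / m   # gradient descent step
            self.b -= self.lr * np.mean(y_hat - y)
        return self

    def _predict_proba(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))

    def predict(self, X):
        # threshold the probability at 0.5 to get a class label
        return (self._predict_proba(X) >= 0.5).astype(int)
```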

Link to full code and dataset – Here.

Now, you might be wondering how to use this on a real-world dataset.

So I have added an example that makes predictions on the Pima Indians Diabetes Dataset using logistic regression.

However, I have not done any feature engineering, data visualization or parameter optimisation. If you are interested in improving on the accuracy we obtained, try some of those optimisations and feature engineering yourself.
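
For reference, a minimal scikit-learn version might look like the sketch below. The CSV file name and the "Outcome" target column are assumptions based on the commonly distributed copy of the Pima Indians Diabetes Dataset, so adjust them to wherever your copy lives; the actual code linked above may differ.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# assumed file name and target column ("Outcome") from the common CSV distribution
df = pd.read_csv("diabetes.csv")
X = df.drop(columns=["Outcome"])
y = df["Outcome"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000)  # raise max_iter so the solver converges
model.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```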

6. Theory of Neural Networks

6.1 Computing output of a single layer Neural Network

Now that we understand how a logistic regression model works, we will use that knowledge to move on to a simple neural network. A neural network is essentially many logistic regression units connected to each other in layers. In the example below, the hidden layer has 3 such units connected to each other. Let us now see how to compute the output of this network.

Logistic - 9

                                                                                      Image by Author

Each logistic regression unit is written as a_y^[x], where x is the index of the hidden layer (here x = 1, since there is only one hidden layer) and y is the index of the neuron within that layer: counting from top to bottom, y is 1 for the first neuron, 2 for the second and 3 for the third.

So the first neuron is written a_1^[1], the second a_2^[1] and the third a_3^[1].
The job of these hidden-layer neurons is to pass their values on to the neuron in the output layer, where another logistic regression unit computes the final output.

y_pred is then calculated like this:
y_pred = φ( w1·a_1^[1] + w2·a_2^[1] + w3·a_3^[1] + b ), where φ is the sigmoid function.
The weights here (w1, w2, w3 and the bias b) belong to the output layer and are not the same as the weights used earlier when computing the neuron units in the hidden layer. I hope this gives a good intuition of how a prediction is computed in an artificial neural network. The example we have taken here is a shallow neural network; as we increase the number of hidden layers, a deep neural network is formed.
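
Here is a small NumPy sketch of this forward pass for a single input with two features, one hidden layer of three units and one output unit; the weight values are random placeholders, purely to illustrate the shape of the computation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

x  = np.array([0.7, -1.2])        # one input with 2 features (made-up values)
W1 = rng.normal(size=(3, 2))      # hidden layer: 3 units, each with 2 weights
b1 = np.zeros(3)
W2 = rng.normal(size=(1, 3))      # output layer: 1 unit with 3 weights
b2 = np.zeros(1)

a1 = sigmoid(W1 @ x + b1)         # a_1^[1], a_2^[1], a_3^[1]
y_pred = sigmoid(W2 @ a1 + b2)    # final prediction of the output unit

print(a1, y_pred)
```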

6.2 Backpropagating for optimizing weights

Once we have computed the predicted values, it is again time to calculate the cost function for the neural network to see how accurate it is. A cost is calculated over all the logistic regression units in the network, and then the derivative of the cost function J(θ) is taken with respect to each weight w_i to find the best value of that w_i.

The full mathematics of backpropagation in neural nets is a little involved, so we are not discussing it here, but we will try to cover it in upcoming blogs.

And with this we end the theory of neural networks. I hope this has given some clarity about what goes on behind the scenes of a neural network.

To code a neural network we will generally use a library like TensorFlow or PyTorch. It is absolutely possible to code one from scratch using NumPy and pandas, but that is quite involved, since we would have to compute the derivatives of many functions ourselves; that is why PyTorch or TensorFlow is so useful. I hope to write a blog that discusses the matrix way of looking at neural networks and then covers the code part of it.
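
As a taste of what that looks like, here is a minimal PyTorch sketch of the same shallow architecture (2 inputs, a hidden layer of 3 sigmoid units, 1 output). The layer sizes simply mirror the diagram above, and everything else is an illustrative default rather than a recommended setup.

```python
import torch
import torch.nn as nn

# 2 input features -> 3 hidden sigmoid units -> 1 output probability
model = nn.Sequential(
    nn.Linear(2, 3),
    nn.Sigmoid(),
    nn.Linear(3, 1),
    nn.Sigmoid(),
)

x = torch.tensor([[0.7, -1.2]])                      # one example with 2 features
y_pred = model(x)                                    # forward pass
loss = nn.BCELoss()(y_pred, torch.tensor([[1.0]]))   # cross-entropy loss vs. label 1
loss.backward()                                      # backpropagation computes all gradients
print(y_pred.item(), loss.item())
```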

7. Conclusion

I hope the blog has given you a good idea about logistic regression and how it works.

The blog has taken you from the processes and calculations behind the logistic regression algorithm, through a brief demo of how to use it in practice, to the basic working of neural networks: what goes on inside one and what calculations are involved.

If you liked this blog, leave your thoughts and feedback in the comments section. See you again in the next interesting read!

Happy Learning!

Until Next Time, Take care!

– By Sayantan 

A Comprehensive History of AI is here: First AI Model to Latest Trends

Blog by Author: Kabilan

- I’m a Tech blogger, AI Engineer, and Blockchain Enthusiast.

- I have previously written blogs related to AI/ML and Blockchain technology.

- I work as a Deep Learning Intern at a firm called “The Machine Learning Company”.

Introduction

AI is nowadays seen, heard, and used everywhere in day-to-day life right from helping in problem-solving to providing personalized recommendations based on our interests.

Recently you may have started seeing AI playing a major role in security surveillance too. In this fast-moving tech world, AI is likely to play a major role in almost all domains in the future. However, the future is unpredictable, so let's look into the following:

  • What is AI ?
  • What was the history of AI ?
  • How it has evolved recently (Copilot and Codex), and a look into the future of AI!

What is Artificial Intelligence?

Artificial intelligence (AI) is the simulation of human intelligence processes by computer systems. 

TheHackWeekly

 

Image from Unsplash

 


Some applications of AI are:

  1. Natural Language Processing (NLP), 
  2. Computer Vision (CV), 
  3. Expert Systems,
  4. Speech Recognition, and 
  5. Object Detection and Classification.

Before the boom of the AI era, artificial intelligence existed mainly as science fiction.

Artificial intelligence (AI) is a set of sciences, theories, and techniques (including mathematical logic, statistics, probability, computational neurobiology, and computer science) that aims to imitate the cognitive abilities of a human being.

The early idea of machines with human-like intelligence dates back to 1872, in Samuel Butler's novel Erewhon. The concept of AI has also been a crucial part of many sci-fi movies. Director Ridley Scott, for example, gave AI an important role in most of his films, such as Prometheus, Blade Runner, and the Alien franchise; the most relatable example is James Cameron's Terminator, in which Skynet is a fictional artificial superintelligence that acts as the antagonistic force. You might also have heard stories of a future controlled by artificial intelligence. Who knows? It could even happen sooner than we think, with robots like Sophia!

The Future is here with Codex and Copilot

1. OpenAI Codex

  • OpenAI Codex is an AI-based natural language processing model developed by OpenAI. It generates programming code given a comment describing the required problem, or a partial piece of code. GitHub Copilot, one of the best use cases of Codex, is powered by it.
  • Codex is a descendant of OpenAI's GPT-3, a fine-tuned autoregressive model that can produce human-like text. On top of GPT-3, Codex is additionally trained on 159 gigabytes of Python code from 54 million public GitHub repositories.
  • To code with OpenAI Codex, users just give a command in English to the API, and Codex then completes or produces the piece of code for that command. Currently Codex is released as a free API, although OpenAI has stated in its documentation that this may change.
  • You can register for Codex by joining the waitlist.

2. GitHub Copilot  

  • Every programmer or developer loves having a co-programmer to work with, but unfortunately not everyone has a perfect pair. To solve this, here comes GitHub Copilot, which GitHub describes as "Your AI pair programmer".
  • GitHub Copilot is a VS Code extension that can autocomplete and synthesize code based on inputs such as comments and function headers. Copilot works well with Python, JavaScript, Go, Ruby and TypeScript. Beyond just suggesting code, it analyses and draws context from the code the user is working on, and it can also help create test cases.
  • Copilot can also be used to write prose, suggesting upcoming lines based on previous work. Its code sometimes works well, but it does not always produce perfect code, since it is trained mostly on public repositories, which do not always contain well-explained, well-structured code. Used in production, it can introduce bugs, as it does not always follow best practices.
  • Because GitHub Copilot can generate text as well as code, it sometimes produces odd copyright comments while creating a function, like the ones present in copyrighted code. Still, it is a great resource for fast coding; it may make mistakes, but those are fixable by the user.
  • Copilot is in its first version, and its limitations may be rectified in future versions.
  • It's a tool all programmers and developers can keep by their side. Try it yourself.

Latest Advancements

1. AlphaGo

In 2016, AlphaGo became the first computer program to beat a European Go champion (Fan Hui) and then the world champion (Lee Sedol); it was later surpassed by its successor, AlphaGo Zero. Go is an extremely complex game, with more than 10^80 possible board configurations, and it is known as one of the most challenging games for AI because of this complexity. It is also one of the oldest games requiring multiple layers of strategic thinking.

2. OpenAI

OpenAI is an artificial intelligence research laboratory under the parent company OpenAI Inc., and a competitor of DeepMind. Its goal is to develop safe and user-friendly AI that benefits humanity.

  • In 2015, OpenAI was founded by Elon Musk and Sam Altman, together with Greg Brockman, Ilya Sutskever, Wojciech Zaremba, and John Schulman; the founders pledged over a billion USD to the venture.
  • OpenAI stated that it was ready to freely collaborate with other researchers and research institutions, making its research and patents open to the public.
  • In 2016, the beta version of "OpenAI Gym" was released. OpenAI Gym is a platform for research on reinforcement learning and can only be used with the Python language.
  • In 2018, Elon Musk resigned his board seat but remained a donor.
  • In 2019, OpenAI received a 1 billion USD investment from Microsoft.
  • In 2020, OpenAI announced GPT-3, an NLP model trained on hundreds of billions of words from different languages.

3. Sophia

History of AI - 12

 

Image when Sophia visited India

 

  • Sophia is a humanoid AI robot developed by Hanson Robotics, a Hong Kong-based company. It is a social robot that uses AI to see people, understand conversation and form relationships. The robot was activated in February 2016 and made its first public appearance in mid-March 2016 at the South by Southwest festival in Texas, United States.
  • Sophia was the first humanoid robot to receive Saudi Arabian citizenship and also the first non-human to receive a United Nations title, being named the United Nations Development Programme's first Innovation Champion. Sophia imitates human-like behaviour more closely than other humanoid robots. Her architecture includes intelligence software designed by Hanson Robotics, a chat system, and OpenCog, an AI system designed for general reasoning. Sophia uses speech recognition technology from Alphabet Inc., and her speech synthesis was designed by CereProc. Sophia's AI analyses conversations and improves itself for future responses.

4. Google Duplex

  • In 2018, at Google I/O, Google showed Google Duplex, a voice assistant using natural conversation technologies. Google Duplex is a completely automated system that can make calls or book appointments for you, with a voice that sounds far more human than a generated robotic voice.
  • Duplex is designed so that it can understand more complex sentences and fast speech. Initially, this feature was launched only on Google Pixel devices.
  • The core of Google Duplex is an RNN built using TensorFlow Extended. The Duplex model was trained on a corpus of anonymized phone conversation data, and it uses features from the input data and the conversation history to optimize itself. This allows people to interact with an AI naturally, without a robotic voice.

5. Open-sourcing

  • Google's TensorFlow, a Python framework, was open-sourced in 2015, followed by Facebook's PyTorch, a Python-based deep learning platform that supports dynamic computation graphs. The competition between TensorFlow and PyTorch benefited the AI and ML communities through a steady stream of open-source updates.
  • Many custom libraries, packages, frameworks, and tools were launched, making machine learning easily accessible and understandable to all. At the same time, many researchers also open-sourced their work, enabling aspirants to learn and apply AI in many different fields. Competition platforms like Kaggle have been among the key catalysts for the growth of AI.

Back to the History of AI

In the early days of World War II, Germany used an encrypted way of sending messages to its forces, known as the Enigma code. Alan Turing, a British scientist, built a machine called the Bombe that could decode Enigma messages. This code-breaking machine became part of the initial foundation for machine learning.

1. Alan Turing – 1950

In 1950, Alan Turing, an English mathematician, computer scientist, and theoretical biologist, published the paper "Computing Machinery and Intelligence". In it he asked the question "Can machines think?", a question that became very popular in those days.

History of AI - 2

 

Image from Alan

 

According to Turing, rather than trying to determine whether a machine is thinking, we should ask whether the machine can win a game, called the "Imitation Game". This proposed Imitation Game later became known as the Turing Test.

2. First Artificial Neural Network – 1951

History of AI - 3

 

Image from Unsplash

 

  • In 1943, the American neurophysiologist Warren Sturgis McCulloch, in the paper "A Logical Calculus of the Ideas Immanent in Nervous Activity" (co-written with Walter Pitts), laid out the initial idea for neural networks. McCulloch tried to demonstrate that a Turing machine could be implemented in a finite network of formal neurons.
  • In 1951, inspired by McCulloch's paper, Marvin Minsky and his graduate student Dean Edmunds built the first artificial neural network using 3,000 vacuum tubes to simulate 40 neurons. This machine, known as SNARC (Stochastic Neural Analog Reinforcement Calculator), imitated a rat finding its way around a maze, and it is considered one of the first steps towards building machine learning capabilities.

3. First Machine Learning Program – 1952

History of AI - 4

 

Image from Unsplash

 

In 1952, Arthur Samuel wrote the first machine learning program, a checkers-playing game AI, and he was also the first to use the phrase "machine learning". Running on an IBM computer, the program improved the more it played, studying which moves made up winning strategies and incorporating those moves into its play. The algorithm used a minimax strategy for selecting the next move, and the program remembered the moves it had played, combining them with the help of a reward function to find winning sequences of moves.

4. AI name coined – 1956

History of AI - 5

 

Image from Unsplash

 

The field of AI wasn't formally founded until 1956, at a conference at Dartmouth College in New Hampshire, where the term "artificial intelligence" was coined by John McCarthy, the father of AI. He defined AI as the science and engineering of making intelligent machines. In those days the field was also referred to as computational intelligence, synthetic intelligence, or computational rationality, terms which also make sense for AI. The term artificial intelligence is used to describe a property of machines or programs: the intelligence that the system demonstrates.

5. IBM’s Deep blue – 1957

History of AI - 6

 

Image from Unsplash

 

  • In 1957, Herbert Simon, an economist and sociologist, prophesied that AI would beat a human at chess within the next 10 years. Before that happened, AI went through its first winter, and only about 40 years later was the prediction proven right. The operation of Deep Blue was based on a systematic brute-force approach, in which possible moves were evaluated and weighted.
  • In 1985, the development of Deep Blue was initiated as the ChipTest project at Carnegie Mellon University.
  • The project was later renamed: it was initially called Deep Thought and, after moving to IBM, eventually became Deep Blue.
  • In 1996, Deep Blue became the first computer to defeat chess grandmaster Garry Kasparov in a game, although Kasparov went on to win that six-game match 4–2. In the 1997 rematch, a heavily upgraded Deep Blue (sometimes called Deeper Blue) defeated the reigning world champion 3.5–2.5. The machine was later dismantled by IBM.

6. Perceptron – 1957 

  • In 1957, the perceptron was developed by Frank Rosenblatt at the Cornell Aeronautical Laboratory.
  • The perceptron was an early artificial neural network enabling pattern recognition, based on a two-layer learning network. An ANN is a machine learning model inspired by the functioning of the brain.
  • The Rosenblatt perceptron is a binary single-neuron model. It integrates its inputs by summing the weighted inputs; if the result is greater than the threshold, the neuron outputs 1, otherwise 0 (a small sketch of this rule is given after this list).
  • The Rosenblatt perceptron can solve linearly separable classification problems.
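
As an illustration of the threshold rule described above, here is a minimal NumPy sketch of a single Rosenblatt-style perceptron unit; the weights, bias, threshold and inputs are made-up values, purely for demonstration.

```python
import numpy as np

def perceptron(x, w, b, threshold=0.0):
    # sum the weighted inputs plus a bias, then apply the hard threshold
    total = np.dot(w, x) + b
    return 1 if total > threshold else 0

# made-up weights implementing an AND-like decision over two binary inputs
w, b = np.array([1.0, 1.0]), -1.5
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, "->", perceptron(np.array(x, dtype=float), w, b))
```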

7. First AI lab

History of AI - 7

 

Image from Unsplash

 

  • In 1959, the first AI lab was established at MIT. 
  • In 2003, it was merged with the MIT Laboratory for Computer Science to form CSAIL.
  • CSAIL is one of the most important labs in the world. It laid the groundwork for image-guided surgery and NLP-based web access, and developed bacterial robots and behavior-based robots used for planetary exploration, military reconnaissance, and consumer devices.
  • So far, 10 CSAIL researchers have won the Turing Award, often called the Nobel Prize of computing.

8. ELIZA – 1965

  • In 1965, ELIZA, an early natural language program, was developed by Joseph Weizenbaum at the MIT Artificial Intelligence Lab. It was one of the first programs capable of attempting the Turing test, and it was also the first chatterbot, created before the term chatterbot was even coined. ELIZA uses pattern matching and substitution to make it look as if it understands the conversation, and it was one of the first programs to create the illusion of human-machine interaction.
  • ELIZA works by recognizing keywords or phrases and matching them with pre-programmed responses, creating the illusion of understanding the conversation and responding to it (a tiny sketch of this idea follows below). However, ELIZA is incapable of learning new words through interaction alone; the words have to be fed into its script.
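
Here is a deliberately tiny Python sketch of that keyword-and-canned-response idea; the keywords and replies are invented for illustration and are far simpler than ELIZA's actual script.

```python
# invented keyword -> response script, far simpler than ELIZA's real rules
SCRIPT = {
    "mother": "Tell me more about your family.",
    "sad": "I am sorry to hear you are sad. Why do you feel that way?",
    "computer": "Do computers worry you?",
}
DEFAULT = "Please go on."

def eliza_reply(sentence: str) -> str:
    words = sentence.lower().split()
    for keyword, response in SCRIPT.items():
        if keyword in words:          # first matching keyword wins
            return response
    return DEFAULT                    # fall back when nothing matches

print(eliza_reply("I had a fight with my mother"))
print(eliza_reply("The weather is nice"))
```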

History of AI - 8

 

Image from Unsplash

 

  • In 1972, ELIZA and another natural language program, PARRY, were brought together for a computer-only conversation. In it, ELIZA played a doctor and PARRY simulated a patient with schizophrenia.

9. AI Winter

  • The AI winters were hard times for AI. The field encountered two major winters, from 1974 to 1980 and from 1987 to 1993. During these periods much major funding stopped, and the amount of AI research dropped drastically.
  • The AI winters were the result of over-hype around the technology, impossible promises by developers, and high expectations from clients and end-users; the resulting disappointment led to the winters (and much of that hype is arguably still the case in 2021).
  • The failure of machine translation triggered the first winter. This was US-government-funded research to automate the translation of Russian documents; the major difficulty was word-sense disambiguation, and the results were less accurate, slower, and more expensive than manual translation. So the NRC (National Research Council) decided to cut the funding after spending a huge amount, about 20 million dollars, and machine translation was put on halt.
  • Later, in the 21st century, many companies took it up again and produced useful results such as Google Translate and Yahoo Babel Fish.
  • During the AI winters many predicted the end of AI, but interest and enthusiasm gradually increased again in the 1990s and rose dramatically from around 2012.

10. MYCIN

  • In 1975, MYCIN, an AI program developed at Stanford University, was designed to assist physicians by recommending treatments for certain infectious diseases.
  • The Divisions of Clinical Pharmacology and Infectious Disease collaborated with members of the Department of Computer Science to develop a computer-based system (termed MYCIN) capable of using both clinical data and judgmental decisions.

11. Convolution Neural Networks

  • In 1989, Yann LeCun (AT&T Bell Labs) applied the backpropagation algorithm to create a convolutional neural network (CNN), a multilayer neural network. A convolutional neural network has one or more convolution layers, depending on its use case, and CNNs are mostly used in pattern recognition and image processing. CNNs, also called ConvNets, were first introduced by LeCun, a computer science researcher, in the 1980s. LeCun built on the work of Kunihiko Fukushima, a Japanese scientist who invented a basic image-recognition neural network called the Neocognitron. LeCun's version of the CNN, called LeNet (named after LeCun), was able to recognize handwritten digits.
  • In 1995, the scientist Richard Wallace developed ALICE (Artificial Linguistic Internet Computer Entity), a chatbot. Wallace was inspired by Weizenbaum's ELIZA, but in addition to what ELIZA did, ALICE had natural language sample data collection. ALICE won the Loebner Prize three times (2000, 2001, and 2004); however, it did not pass the Turing test.

  • In 1998, the ALICE program was rewritten in Java.
  • In 2002, Amazon replaced human editors with basic AI systems, paving the way for others to see how AI could be utilized in business.
  • Around 2010, AI boomed because of access to massive amounts of data and heavy computing resources. CNNs took on a new avatar and flourished in all subdomains of AI; until the 2010s they had remained dormant, because training a CNN needs lots of computing resources and lots of data.
  • In 2012, Alex Krizhevsky designed AlexNet, a deep convolutional neural network trained on the ImageNet dataset, which showed how complex CNNs could perform a variety of computer vision tasks.

12. Eugene Goostman, a chatbot 

  • Eugene Goostman is a chatbot that portrays a 13-year-old Ukrainian boy. It was developed in Saint Petersburg in 2001 by the programmers Vladimir Veselov (Russian), Eugene Demchenko (Ukrainian), and Sergey Ulasen (Russian), and after a long run it was claimed to have passed the Turing test. It was portrayed as a 13-year-old because, in Veselov's opinion, a thirteen-year-old is "not too old to know everything and not too young to know nothing", a clever way of excusing its grammatical mistakes.

History of AI - 9

 

Image Source 

 

  • Eugene Goostman competed in several competitions, including the Loebner Prize, where it finished second in 2005 and 2008. On the hundredth anniversary of Alan Turing's birth, Eugene Goostman participated in the largest-ever Turing test competition and convinced 29% of the judges that it was human.

History of AI - 10

 

This is a transcript of the conversation between Eugene and a judge that voted human.

 

  • In 2014, Eugene Goostman captured headlines for convincing 33% of the judges in a Turing test that it was a real human. The event organiser Kevin Warwick considered that it had passed the Turing test, fulfilling Alan Turing's prediction that by 2000 machines would be capable of fooling 30% of human judges after five minutes of questioning. The pass was soon questioned by critics, who argued that Eugene Goostman used personality quirks and humour to hide its non-human tendencies and lacked real intelligence.

13. AI Boom (Again) in  2010

AI boomed heavily after 2010, entering almost every field and making a great impact. As the computer scientist Andrew Ng put it, "AI is the new electricity". The main factors behind this boom were:

  1. Access to massive volumes of data, thanks to the spread of the internet and of IoT devices generating data.
  2. The availability of highly efficient graphics processing units (GPUs) to accelerate the calculations of learning algorithms.

During the previous decade, from 2010 to 2019, many new AI-based companies emerged, such as:

  • DeepMind was founded in 2010 by Demis Hassabis and later acquired by Google for 400 million euros in 2014.
  • OpenAI, co-founded by Elon Musk, is a non-profit organization doing a great deal of research in deep reinforcement learning.
  • Many new algorithms, frameworks and models were also developed, such as TensorFlow, ImageNet, FaceNet, NEIL (Never Ending Image Learner), Transformers, and BERT (Bidirectional Encoder Representations from Transformers), which accelerated the growth of AI.

14. DeepMind

DeepMind is an artificial intelligence company and a subsidiary of Alphabet Inc. It was started in 2010 by Demis Hassabis, Shane Legg, and Mustafa Suleyman. The start-up began with Demis Hassabis trying to teach AI to play old games with the help of neural networks, since old games are primitive and simple compared with this generation of games.

Key Points to note:

  • DeepMind uses Reinforcement learning to make AI learn the game and master it. 
  • The goal of the founders is to create a general-purpose AI that can be useful and effective for almost all use cases. 
  • The start-up attracted major venture capital firms such as Horizons Ventures and Founders Fund, and investors such as Elon Musk and Peter Thiel also invested in DeepMind.
  • Later, in 2014, Google acquired DeepMind for 400 million euros.
  • In the same year, DeepMind also received the "Company of the Year" award from the Cambridge Computer Laboratory.
  • In 2017, DeepMind created a research team to investigate AI ethics.

15. Siri

In 2011, Apple introduced Siri, the world's first substantial voice-based conversational interface. The introduction of Siri led to the use of AI in virtual personal assistants, and many companies started creating their own, such as Alexa for Amazon in 2014, Cortana for Microsoft, and Google Assistant for Android.

16. Watson, “Jeopardy” winner

In 2011, IBM's question-answering system Watson won the quiz show "Jeopardy!" by beating the reigning champions Ken Jennings and Brad Rutter. Watson was a room-sized machine, named after Thomas J. Watson, and after its success IBM turned it into a commercial product.

17. Google X

History of AI - 11

 

Image from Unsplash

 

In 2012, Google X (Google's research lab) got an AI to recognize cats in videos. Google's Jeff Dean and Andrew Ng trained a neural network spread across more than 16,000 processors to detect cats using this approach. Later, when Alphabet Inc. was created, Google X was renamed X. The lab works on many interesting and futuristic projects.

Some of the exciting works at this lab are:

  • Google Glass – A research and development program to develop an augmented reality head-mounted display (HMD).
  • Waymo – Driverless car. After the success of this project, it emerged as a new company under Alphabet Inc. 
  • Google Brain – A deep learning research project. This project was considered one of the biggest successes under Google.
  • Google’s driverless – In 2014, Google’s driverless car passed Nevada’s self-driving test.  

Conclusion

AI is now touching new heights every day, with new applications in different domains, and expectations of AI are also increasing day by day. A technology that once seemed finished has boomed again, and major credit for this advancement goes to the scientists who worked on it earlier; for every futuristic solution, we need to look to the past. Artificial intelligence acts as the main driver of emerging technologies like robotics, big data, and the Internet of Things, and it will continue to act as a technological innovator for the foreseeable future. I have taken you through a historic journey of AI; I hope you enjoyed it and gained some new insights from this blog.

If you liked this blog, leave your thoughts and feedback in the comments section. See you again in the next interesting read!

Happy Learning!

Until Next Time, Take care!

– By Kabilan

Check out this blog on "8 Most Popular Types of Activation Functions in Neural Networks" here!
