Search results for: classification algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3744

894 Satellite Derived Snow Cover Status and Trends in the Indus Basin Reservoir

Authors: Muhammad Tayyab Afzal, Muhammad Arslan, Mirza Muhammad Waqar

Abstract:

Snow constitutes an important component of the cryosphere, characterized by high temporal and spatial variability. Because of the contribution of snow melt to water availability, snow is an important focus for research on climate change and adaptation. MODIS satellite data have been used to identify spatial-temporal trends in snow cover in the upper Indus basin. For this research, MODIS 8-day composite data of medium resolution (250 m) have been analysed for 2001-2005. Pixel-based supervised classification has been performed and the extent of snow has been calculated for all the images. Results show large variation in snow cover between years, while an increasing trend from west to east is observed. Temperature data for the Upper Indus Basin (UIB) have been analysed for seasonal and annual trends over the period 2001-2005 and calibrated against the results acquired by the research. From the analysis it is concluded that there are indications that regional warming is one of the factors affecting the hydrology of the upper Indus basin through accelerated glacial melting, and that stream flow in the upper Indus basin can be predicted with a high degree of accuracy over the study period. This conclusion is also supported by research of ICIMOD, which observed that the average annual precipitation over a five-year period is less than the observed stream flow, and by positive temperature trends in all seasons.

Keywords: Indus basin, MODIS, remote sensing, snow cover

Procedia PDF Downloads 363
893 A Machine Learning Based Framework for Education Levelling in Multicultural Countries: UAE as a Case Study

Authors: Shatha Ghareeb, Rawaa Al-Jumeily, Thar Baker

Abstract:

In Abu Dhabi, there are many different education curriculums, and the private schools and quality assurance sector supervises many private schools serving many nationalities. As the curriculums are designed to meet expats’ needs, they differ in their requirements for registration and success, as well as in the age groups at which education starts. In fact, each curriculum has a different number of years, assessment techniques, reassessment rules, and exam boards. Currently, students who transfer between curriculums are not placed in the right year group: the start and end dates of each academic year differ, and the date-of-birth cut-off for each year group differs between curriculums. As a result, students end up either younger or older than their year group, which creates gaps in their learning and performance. In addition, there is no way of storing student data throughout the academic journey so that schools can track the student learning process. In this paper, we propose a computational framework applicable in multicultural countries such as the UAE, in which multiple education systems are implemented. The ultimate goal is to use cloud and fog computing technology, integrated with machine learning techniques from artificial intelligence, to aid in a smooth transition when assigning students to their year groups, and to provide levelling and differentiation information for students who relocate from one education curriculum to another, whilst also being able to store and access student data from anywhere throughout the academic journey.

Keywords: admissions, algorithms, cloud computing, differentiation, fog computing, levelling, machine learning

Procedia PDF Downloads 111
892 Review of Theories and Applications of Genetic Programming in Sediment Yield Modeling

Authors: Adesoji Tunbosun Jaiyeola, Josiah Adeyemo

Abstract:

Sediment yield can be considered the total sediment load that leaves a drainage basin. Knowledge of the quantity of sediment present in a river at a particular time can lead to better flood-capacity management in reservoirs and consequently help to control over-bank flooding. Furthermore, as sediment accumulates in a reservoir, the reservoir gradually loses its ability to store water for the purposes for which it was built. The development of hydrological models to forecast the quantity of sediment present in a reservoir helps planners and managers of water resources systems to understand the system better, in terms of both its problems and the alternative ways to address them. The application of artificial intelligence models and techniques to such real-life situations has proven to be an effective approach to solving complex problems. This paper makes an extensive review of literature relevant to the theories and applications of evolutionary algorithms, most especially genetic programming. Successful applications of genetic programming as a soft computing technique are reviewed in sediment modelling and other branches of knowledge. Some fundamental issues, such as benchmarks, generalization ability, bloat and over-fitting, and other open issues relating to the working principles of GP that need to be addressed by the GP community, are also highlighted. This review aims to give GP theoreticians, researchers and the general GP community enough research direction and a valuable guide, and to keep all stakeholders abreast of the issues which need attention during the next decade for the advancement of GP.

Keywords: benchmark, bloat, generalization, genetic programming, over-fitting, sediment yield

Procedia PDF Downloads 419
891 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and the chemical and physical phenomena of mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely the regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws, and are therefore useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, numerical average molecular weight and gravimetrical average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
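
A minimal sketch of the adaptive-sampling idea described above, in Python with scikit-learn: starting from a coarse grid, extra samples are placed in the interval where the output varies most, and several of the regressors named in the abstract are then compared. The toy process function and all parameter values are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

def adaptive_sample(x, y, n_extra, sampler):
    """Add n_extra samples, one at a time, in the interval with the
    largest local output variation (a proxy for curvature)."""
    x, y = list(x), list(y)
    for _ in range(n_extra):
        order = np.argsort(x)
        xs, ys = np.asarray(x)[order], np.asarray(y)[order]
        gaps = np.abs(np.diff(ys))           # variation between neighbours
        i = int(np.argmax(gaps))             # most "active" interval
        x_new = 0.5 * (xs[i] + xs[i + 1])    # refine in its middle
        x.append(x_new)
        y.append(sampler(x_new))
    return np.asarray(x), np.asarray(y)

# Toy "process": a sharp non-linearity standing in for the gel effect
f = lambda t: np.tanh(10 * (t - 0.6)) + 0.1 * t
x0 = np.linspace(0.0, 1.0, 15)
x, y = adaptive_sample(x0, f(x0), n_extra=25, sampler=f)

X = x.reshape(-1, 1)
x_test = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
for model in (SVR(C=10.0), KNeighborsRegressor(n_neighbors=5),
              RandomForestRegressor(n_estimators=200, random_state=0)):
    model.fit(X, y)
    mse = np.mean((model.predict(x_test) - f(x_test.ravel())) ** 2)
    print(type(model).__name__, f"MSE = {mse:.4f}")
```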

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 282
890 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics

Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo

Abstract:

Communication signal modulation recognition technology is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods for communication signals fall into two major categories: maximum likelihood hypothesis testing methods based on decision theory, and statistical pattern recognition methods based on feature extraction. The statistical pattern recognition method, which comprises feature extraction and classifier design, is now the most commonly used. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on an improved Holder cloud feature, and an extreme learning machine (ELM) is used to classify the extracted features, addressing the real-time demands of modern warfare. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, improving the performance of the algorithm. The algorithm addresses the difficulty that a simple feature extraction algorithm based on the Holder coefficient feature has in recognizing signals at low SNR, and it also achieves better recognition accuracy. Simulation results show that the approach still classifies well at low SNR; even at an SNR of -15 dB, the recognition accuracy reaches 76%.
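
The extreme learning machine used for classification trains in one shot, which is what makes it attractive for real-time use. A minimal sketch of the idea in Python follows; the two-dimensional stand-in features and all sizes are illustrative assumptions, not the paper's Holder cloud features.

```python
import numpy as np

class ELMClassifier:
    """Minimal extreme learning machine: a random hidden layer, with the
    output weights solved in one shot by least squares (hence the speed)."""
    def __init__(self, n_hidden=100, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(rng)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # random feature map
        Y = np.eye(y.max() + 1)[y]                # one-hot targets
        self.beta = np.linalg.pinv(H) @ Y         # least-squares solve
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

# Toy usage with stand-in features (2-D, 3 signal classes)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.5, size=(100, 2)) for c in (0, 2, 4)])
y = np.repeat([0, 1, 2], 100)
clf = ELMClassifier(n_hidden=50, rng=0).fit(X, y)
print("train accuracy:", (clf.predict(X) == y).mean())
```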

Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model

Procedia PDF Downloads 124
889 Machine Learning Models for the Prediction of Heating and Cooling Loads of a Residential Building

Authors: Aaditya U. Jhamb

Abstract:

Due to the current energy crisis that many countries are battling, energy-efficient buildings are the subject of extensive research in the modern technological era because of growing worries about energy consumption and its effects on the environment. The paper explores 8 factors that help determine energy efficiency for a building: relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area, and glazing area distribution, with Tsanas and Xifara providing the dataset. The dataset comprises 768 different residential building models used to predict heating and cooling loads with a low mean squared error. By optimizing these characteristics, machine learning algorithms may assess and properly forecast a building's heating and cooling loads, lowering energy usage while increasing the quality of people's lives. The paper therefore studied the magnitude of the correlation between these input factors and the two output variables using various statistical methods of analysis, after determining which input variable was most closely associated with the output loads. The most conclusive model was the Decision Tree Regressor, with a mean squared error of 0.258, whilst the least definitive model was the Isotonic Regressor, with a mean squared error of 21.68. This paper also investigated the KNN Regressor and Linear Regression, which had mean squared errors of 3.349 and 18.141, respectively. In conclusion, the model, given the 8 input variables, was able to predict the heating and cooling loads of a residential building accurately and precisely.
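
The comparison in the abstract can be reproduced in outline with scikit-learn. The sketch below assumes a local copy of the public Tsanas-Xifara energy efficiency spreadsheet with its usual X1..X8/Y1/Y2 column names; the train/test split and hyperparameters are illustrative, so the error values will not match the paper's exactly.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

# Assumed local copy of the Tsanas & Xifara dataset; X1..X8 are the 8
# inputs (compactness .. glazing distribution), Y1/Y2 the two loads.
df = pd.read_excel("ENB2012_data.xlsx")
X = df[[f"X{i}" for i in range(1, 9)]].values
y = df[["Y1", "Y2"]].values                      # heating, cooling loads

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          random_state=0)
for model in (DecisionTreeRegressor(random_state=0),
              KNeighborsRegressor(n_neighbors=5),
              LinearRegression()):
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(type(model).__name__, f"test MSE = {mse:.3f}")
```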

Keywords: energy efficient buildings, heating load, cooling load, machine learning models

Procedia PDF Downloads 73
888 Designing and Prototyping Permanent Magnet Generators for Wind Energy

Authors: T. Asefi, J. Faiz, M. A. Khan

Abstract:

This paper introduces dual rotor axial flux machines with surface-mounted and spoke-type ferrite permanent magnets with concentrated windings; they are introduced as alternatives to a generator with surface-mounted Nd-Fe-B magnets. The output power, voltage, speed and air gap clearance for all the generators are identical. The machine designs are optimized for minimum mass using a population-based algorithm, assuming the same efficiency as the Nd-Fe-B machine. A finite element analysis (FEA) is applied to predict the performance, emf, developed torque, cogging torque, no-load losses, leakage flux and efficiency of both ferrite generators and of the Nd-Fe-B generator. To minimize cogging torque, different rotor pole topologies and different pole arc to pole pitch ratios are investigated by means of 3D FEA. It was found that the surface-mounted ferrite generator topology is unable to develop the nominal electromagnetic torque, has higher torque ripple and is heavier than the spoke-type machine. Furthermore, it was shown that the spoke-type ferrite permanent magnet generator has favorable performance and could be an alternative to rare-earth permanent magnet generators, particularly in wind energy applications. Finally, the analytical and numerical results are verified using experimental results.

Keywords: axial flux, permanent magnet generator, dual rotor, ferrite permanent magnet generator, finite element analysis, wind turbines, cogging torque, population-based algorithms

Procedia PDF Downloads 123
887 Sequential Pattern Mining from Medical Record Data with the Sequential Pattern Discovery Using Equivalence Classes (SPADE) Algorithm (A Case Study: Bolo Primary Health Care, Bima)

Authors: Rezky Rifaini, Raden Bagus Fajriya Hakim

Abstract:

This research was conducted at the Bolo Primary Health Care (PHC) in Bima Regency. The purpose of the research is to find the association patterns formed in the medical records database of Bolo Primary Health Care's patients. The data used are secondary data from the PHC medical records database. Sequential pattern mining is the technique used for the analysis. Transaction data were generated from Patient_ID, Check_Date and diagnosis. Sequential Pattern Discovery Using Equivalence Classes (SPADE) is one of the algorithms in sequential pattern mining; it finds frequent sequences in the transaction data using a vertical database layout and a sequence join process. The result of the SPADE algorithm is a set of frequent sequences that are then used to form rules. The technique is used to find association patterns between item combinations. Based on sequential association rule analysis with the SPADE algorithm, for a minimum support of 0.03 and a minimum confidence of 0.75, 3 sequential association patterns were obtained from the patient_ID, check_Date and diagnosis data in the Bolo PHC.
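
A toy sketch of SPADE's core machinery in Python: medical-record sequences are turned into vertical id-lists, frequent single diagnoses are kept, and a temporal join produces frequent 2-sequences. The diagnoses and the absolute support threshold are invented placeholders, not values from the Bolo PHC data.

```python
from collections import defaultdict

# Toy medical-record sequences: patient -> diagnoses ordered by check date.
sequences = {
    1: ["cough", "typhoid", "dyspepsia"],
    2: ["cough", "dyspepsia"],
    3: ["typhoid", "dyspepsia"],
    4: ["cough", "typhoid"],
}
min_support = 2  # absolute count of supporting patients

# Vertical format: item -> {patient id: set of positions (eids)}
idlists = defaultdict(lambda: defaultdict(set))
for sid, seq in sequences.items():
    for eid, item in enumerate(seq):
        idlists[item][sid].add(eid)

frequent1 = {i: l for i, l in idlists.items() if len(l) >= min_support}
print("frequent 1-sequences:", sorted(frequent1))

# Temporal join of two id-lists: a 2-sequence <a, b> is supported by a
# patient if some occurrence of b comes strictly after an occurrence of a.
def seq_join(la, lb):
    out = {}
    for sid in la.keys() & lb.keys():
        eids = {eb for eb in lb[sid] if any(ea < eb for ea in la[sid])}
        if eids:
            out[sid] = eids
    return out

for a in frequent1:
    for b in frequent1:
        joined = seq_join(frequent1[a], frequent1[b])
        if len(joined) >= min_support:
            print(f"<{a} -> {b}> support = {len(joined)}")
```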

Keywords: diagnosis, primary health care, medical record, data mining, sequential pattern mining, SPADE algorithm

Procedia PDF Downloads 376
886 Using Cyclic Structure to Improve Inference on Network Community Structure

Authors: Behnaz Moradijamei, Michael Higgins

Abstract:

Identifying community structure is a critical task in analyzing social media data sets often modeled by networks. Statistical models such as the stochastic block model have proven to explain the structure of communities in real-world network data. In this work, we develop a goodness-of-fit test that examines the existence of community structure by using a distinguishing property of networks: cyclic structures are more prevalent within communities than across them. To better understand how communities are shaped by the cyclic structure of the network rather than just the number of edges, we introduce a novel method for deciding on the existence of communities. We utilize these structures by incorporating the renewal non-backtracking random walk (RNBRW) into the existing goodness-of-fit test. RNBRW is an important variant of the random walk in which the walk is prohibited from returning to a node in exactly two steps, and terminates and restarts once it completes a cycle. We investigate the use of RNBRW to improve the performance of existing goodness-of-fit tests for community detection algorithms based on the spectral properties of the adjacency matrix. Our proposed test on community structure is based on the probability distribution of eigenvalues of the normalized retracing probability matrix derived from RNBRW. We attempt to make the best use of asymptotic results on such a distribution when there is no community structure, i.e., the asymptotic distribution under the null hypothesis. Moreover, we provide a theoretical foundation for our statistic by obtaining the true mean and a tight lower bound for the variance of RNBRW edge weights.
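
A sketch of the RNBRW edge-weighting step in Python with networkx, following the abstract's description: walks never backtrack, and the edge that completes a cycle is rewarded before the walk restarts. The planted-partition test graph and all counts are illustrative assumptions.

```python
import random
import networkx as nx

def rnbrw_weights(G, n_walks=20000, seed=0):
    """Renewal non-backtracking random walks: never return to the node
    just left; when the walk revisits any node on its path (a cycle is
    completed), reward the cycle-closing edge and restart."""
    rng = random.Random(seed)
    weight = {frozenset(e): 0 for e in G.edges()}
    nodes = list(G.nodes())
    for _ in range(n_walks):
        prev, cur = None, rng.choice(nodes)
        visited = {cur}
        while True:
            nbrs = [v for v in G[cur] if v != prev]   # non-backtracking
            if not nbrs:
                break                                  # dead end: restart
            nxt = rng.choice(nbrs)
            if nxt in visited:
                weight[frozenset((cur, nxt))] += 1     # cycle completed
                break
            visited.add(nxt)
            prev, cur = cur, nxt
    return weight

# Two planted communities: intra-community edges should close far more
# cycles than the sparse edges running between the communities.
G = nx.planted_partition_graph(2, 30, 0.3, 0.02, seed=1)
block = {v: i for i, part in enumerate(G.graph["partition"]) for v in part}
w = rnbrw_weights(G)
intra = [c for e, c in w.items() if len({block[v] for v in e}) == 1]
inter = [c for e, c in w.items() if len({block[v] for v in e}) == 2]
print("mean cycle count, intra-community edges:", sum(intra) / len(intra))
print("mean cycle count, inter-community edges:", sum(inter) / len(inter))
```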

Keywords: hypothesis testing, RNBRW, network inference, community structure

Procedia PDF Downloads 124
885 Heat Waves and Hospital Admissions for Mental Disorders in Hanoi, Vietnam

Authors: Phan Minh Trang, Joacim Rocklöv, Kim Bao Giang, Gunnar Kullgren, Maria Nilsson

Abstract:

There are recent studies from high-income countries reporting an association between heat waves and hospital admissions for mental health disorders. Whether such relations exist in sub-tropical and tropical low- and middle-income countries has not previously been studied. In this study from Vietnam, the assumption was that hospital admissions for mental disorders may be triggered, or exacerbated, by heat exposure and heat waves. A database from Hanoi Mental Hospital, with mental disorders diagnosed by the International Classification of Diseases 10 and spanning five years, was used to estimate heatwave-related impacts on admissions for mental disorders. The relationship was analysed by a negative binomial regression model accounting for year, month, and day of week. The focus of the study was heat-wave events of three or seven consecutive days above a threshold of 35°C daily maximum temperature. The preliminary results indicated that heat waves of three and seven days increased the risk of hospital admission for mental disorders (F00-79), with relative risks (RRs) of 1.16 (1.01-1.33) and 1.42 (1.02-1.99) respectively, compared with non-heat-wave periods. Heatwave-related admissions for mental disorders increased statistically significantly among men, among residents of rural communities and among the elderly. Moreover, organic mental disorders including symptomatic illnesses (F0-9) and mental retardation (F70-79) showed high risks during heat waves. The findings are novel in studying a sub-tropical middle-income city facing rapid urbanisation and epidemiological and demographic transitions.
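
The negative binomial model used here can be sketched with statsmodels. Since the hospital database is not public, the script below simulates a daily admissions series with a known heat-wave effect and recovers the relative risk with calendar adjustment for year, month and day of week; every number in it is a made-up stand-in.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for the admissions series: daily counts, a
# heat-wave flag, and calendar terms for adjustment.
rng = np.random.default_rng(0)
n = 5 * 365
df = pd.DataFrame({
    "date": pd.date_range("2008-01-01", periods=n, freq="D"),
    "heatwave": rng.binomial(1, 0.05, n),
})
df["year"] = df.date.dt.year
df["month"] = df.date.dt.month
df["dow"] = df.date.dt.dayofweek
mu = np.exp(1.0 + 0.15 * df.heatwave)            # true RR = exp(0.15)
df["admissions"] = rng.negative_binomial(5, 5 / (5 + mu))

model = smf.glm("admissions ~ heatwave + C(year) + C(month) + C(dow)",
                data=df, family=sm.families.NegativeBinomial()).fit()
rr = np.exp(model.params["heatwave"])
lo, hi = np.exp(model.conf_int().loc["heatwave"])
print(f"heat-wave RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```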

Keywords: mental disorders, admissions for F0-9 or F70-79, maximum temperature, heat waves

Procedia PDF Downloads 218
884 Forest Risk and Vulnerability Assessment: A Case Study from East Bokaro Coal Mining Area in India

Authors: Sujata Upgupta, Prasoon Kumar Singh

Abstract:

The expansion of large-scale coal mining into forest areas is a potential hazard for the local biodiversity and wildlife. The objective of this study is to provide a picture of the threat that coal mining poses to the forests of the East Bokaro landscape. The vulnerable forest areas at risk have been assessed and the priority areas for conservation have been presented. The forested areas at risk in the current scenario have been assessed and compared with past conditions using a classification and buffer-based overlay approach. Forest vulnerability has been assessed using an analytical framework based on systematic indicators and composite vulnerability index values. The results indicate that more than 4 km2 of forest has been lost from 1973 to 2016. Large patches of forest have been diverted for coal mining projects. Forests in the northern part of the coalfield, within a 1-3 km radius around the coal mines, are at immediate risk. The original contiguous forests have been converted into fragmented and degraded forest patches. Most of the collieries are located within or very close to the forests, thus threatening the biodiversity and hydrology of the surrounding regions. Based on the vulnerability values estimated, it was concluded that more than 90% of the forested grids in East Bokaro are highly vulnerable to mining. The forests in the sub-districts of Bermo and Chandrapura have been identified as the most vulnerable to coal mining activities. This case study would add to the capacity of forest managers and mine managers to address the risk and vulnerability of forests at a small landscape level in order to achieve sustainable development.

Keywords: forest, coal mining, indicators, vulnerability

Procedia PDF Downloads 370
883 Philippine Film Industry and Cultural Policy: A Critical Analysis and Case Study

Authors: Michael Kho Lim

Abstract:

This paper examines the status of the film industry as an industry in the Philippines: where and how it is classified in the Philippine industrial classification system, how this positioning gives the film industry an identity (or not), and how it affects (film) policy development and impacts the larger national economy. It is important to look at how the national government recognises Philippine cinema officially, as this has a direct and indirect impact on the industry in terms of its representation, conduct of business, international relations, and most especially its implications for policy development and implementation. Therefore, it is imperative that the 'identity' of Philippine cinema be clearly established and defined in the overall industrial landscape. Having a clear understanding of Philippine cinema's industry status provides a better view of the bigger picture and helps us determine cinema's position in the national agenda in terms of priority setting, future direction, and how the state perceives and thereby values the film industry as an industry. This will then serve as a frame of reference anchoring the succeeding discussion. Once the Philippine film industry's status is identified, the paper clarifies how cultural policy is defined, understood, and applied in the Philippines in relation to Philippine cinema, by reviewing and analyzing existing policy documents and pending bills in the Philippine Congress and Senate. Lastly, the paper delves into the roles that (national) cultural institutions and industry organisations play as primary drivers or support mechanisms, and how they become platforms (or not) for the upliftment of the independent film sector and the sustainability of the film industry. The paper concludes by arguing that the role of the government, and how government officials perceive and treat culture, is far more important than cultural policy itself, as these policies emanate from them.

Keywords: cultural and creative industries, cultural policy, film industry, Philippine cinema

Procedia PDF Downloads 397
882 Programming without Code: An Approach and Environment to Conditions-On-Data Programming

Authors: Philippe Larvet

Abstract:

This paper presents the concept of an object-based programming language where tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. According to the object paradigm, data are still embedded inside objects, as variable-value couples, but object methods are expressed in the form of logical propositions ('conditions on data', or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine loops over the objects, examining them one after another and collecting all the CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (left side of the '=>' sign) is the premise and the right part is the conclusion. Premises are evaluated and conclusions are fired. Conclusions modify the variable-value couples of the object, and the engine goes on to examine the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through examples, the paper also presents several hints for implementing a simple mechanism able to process this 'COD language'. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical and long-to-learn languages, engineers and specialists can easily simulate and validate the functioning of complex systems.
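
A minimal sketch of such an inference engine in Python, under the abstract's description: objects hold variable-value couples, CODs are (premise, conclusion) couples, and the engine cycles over objects, firing conclusions until nothing changes. The tank example and class names are invented for illustration.

```python
class CodObject:
    """An object whose behaviour is a list of conditions on data (CODs):
    (premise, conclusion) couples over its variable-value pairs."""
    def __init__(self, data, cods):
        self.data = data          # variable -> value couples
        self.cods = cods          # list of (premise, conclusion) functions

class CodEngine:
    """Central inference engine: examine objects one after another,
    evaluate each COD's premise, fire its conclusion on a match."""
    def run(self, objects, max_cycles=10):
        for _ in range(max_cycles):
            fired = False
            for obj in objects:
                for premise, conclusion in obj.cods:
                    if premise(obj.data):
                        before = dict(obj.data)
                        conclusion(obj.data)
                        fired = fired or (obj.data != before)
            if not fired:          # stable state: nothing changed
                break

# Example COD, in the spirit of the abstract:
#   level > 90 AND valve == "closed"  =>  alarm = "on"
tank = CodObject(
    {"level": 95, "valve": "closed", "alarm": "off"},
    [(lambda d: d["level"] > 90 and d["valve"] == "closed",
      lambda d: d.__setitem__("alarm", "on"))],
)
CodEngine().run([tank])
print(tank.data)   # {'level': 95, 'valve': 'closed', 'alarm': 'on'}
```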

Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation

Procedia PDF Downloads 200
881 A Prediction Model Using the Price Cyclicality Function Optimized for Algorithmic Trading in Financial Market

Authors: Cristian Păuna

Abstract:

After the widespread adoption of electronic trading, automated trading systems have become a significant part of the business intelligence system of any modern financial investment company. An important share of trades is today made completely automatically by computers using mathematical algorithms: trading decisions are taken almost instantly by logical models and orders are sent by low-latency automatic systems. This paper presents a real-time price prediction methodology designed especially for algorithmic trading. Based on the price cyclicality function, the methodology generates price cyclicality bands that predict the optimal levels for entries and exits; to automate the trading decisions, the cyclicality bands generate automated trading signals. We have found that the model can be used with good results to predict changes in market behavior. Using these predictions, the model can automatically adapt the trading signals in real time to maximize the trading results. The paper reveals the methodology to optimize and implement this model in automated trading systems. Tests prove that this methodology can be applied with good efficiency in different timeframes. Real trading results are also displayed and analyzed in order to qualify the methodology and to compare it with other models. In conclusion, it was found that the price prediction model using the price cyclicality function is a reliable trading methodology for algorithmic trading in the financial market.
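
The price cyclicality function itself is defined in the author's work; as a stand-in, the sketch below builds a generic smoothed oscillator with symmetric bands and turns band crossings into entry/exit signals, which is the signal-generation pattern the abstract describes. All window lengths and the synthetic price series are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

def cyclicality_bands(close, fast=25, slow=50, band=1.0):
    """Stand-in cyclicality oscillator: normalized spread between a fast
    and a slow moving average, smoothed, with +/- band levels."""
    f = close.rolling(fast).mean()
    s = close.rolling(slow).mean()
    osc = ((f - s) / s).rolling(fast).mean()
    sigma = osc.rolling(slow).std()
    return osc, band * sigma, -band * sigma

def signals(close):
    osc, upper, lower = cyclicality_bands(close)
    sig = pd.Series(0, index=close.index)
    sig[(osc < lower) & (osc.shift(1) >= lower.shift(1))] = +1  # entry
    sig[(osc > upper) & (osc.shift(1) <= upper.shift(1))] = -1  # exit
    return sig

# Toy usage on a synthetic cyclical price series
rng = np.random.default_rng(1)
t = np.arange(1000)
close = pd.Series(100 + 5 * np.sin(t / 40)
                  + 0.1 * rng.normal(0, 0.5, t.size).cumsum())
print(signals(close).value_counts())
```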

Keywords: algorithmic trading, automated trading systems, financial markets, high-frequency trading, price prediction

Procedia PDF Downloads 159
880 A Multi-Layer Based Architecture for the Development of an Open Source CAD/CAM Integration Virtual Platform

Authors: Alvaro Aguinaga, Carlos Avila, Edgar Cando

Abstract:

This article proposes an n-layer architecture, with a web client as a front-end, for the development of a virtual platform for process simulation on CNC machines. This open-source platform includes a CAD-CAM interface for drawing primitives, which is then used to furnish a CNC program that drives a touch-screen virtual simulator. The objectives of this project are twofold. The first is an educational component that fosters new alternatives for the CAD-CAM/CNC learning process in undergraduate and graduate schools and technical and technological institutes, emphasizing the development of critical skills, discussion and collaborative work. The second objective puts together a research and technological component that will take the state of the art in CAD-CAM integration to a new level with the development of optimal algorithms and virtual platforms, available online, paving the way for the long-term goal of this project: to have a visible and active graduate school in Ecuador and a worldwide open-innovation community in the area of CAD-CAM integration and operation of CNC machinery. The virtual platform developed as a part of this study: (1) delivers an improved training process for students, (2) creates a multidisciplinary team and a collaborative work space that will push the new generation of students to face future technological challenges, (3) implements industry standards for CAD/CAM, and (4) presents a platform for the development of industrial applications. A prototype of this system was developed and implemented in a network of universities and technological institutes in Ecuador.

Keywords: CAD-CAM integration, virtual platforms, CNC machines, multi-layer based architecture

Procedia PDF Downloads 397
879 Characteristics and Flight Test Analysis of a Fixed-Wing UAV with Hover Capability

Authors: Ferit Çakıcı, M. Kemal Leblebicioğlu

Abstract:

In this study, the characteristics and flight test analysis of a fixed-wing unmanned aerial vehicle (UAV) with hover capability are presented. The base platform is a conventional airplane with throttle, aileron, elevator and rudder control surfaces, which inherently allows level flight. This aircraft is then mechanically modified by the integration of vertical propellers, as in multirotors, to provide hover capability. The aircraft is modeled using basic aerodynamic principles, and linear models are constructed utilizing small perturbation theory for trim conditions. Flight characteristics are analyzed through the state space approach of linear control theory. Distinctive features of the aircraft are discussed based on the analysis results, with comparison to conventional aircraft platform types. A hybrid control system is proposed in order to exploit these unique flight characteristics. The main approach includes the design of different controllers for different modes of operation and a hand-over logic that makes flight in an enlarged flight envelope viable. Simulation tests performed on the mathematical models verify the asserted algorithms. Flight tests conducted in the real world revealed the applicability of the proposed methods in exploiting the fixed-wing and rotary-wing characteristics of the aircraft, which provide agility, survivability and functionality.
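
The hand-over logic can be illustrated with a toy airspeed-based mode switch with hysteresis, so the aircraft commits to wing-borne flight only above a safe speed and returns lift to the rotors near stall. The speeds and the profile below are invented placeholders, not the paper's gains or thresholds.

```python
def handover_control(airspeed, v_stall=12.0, v_safe=15.0, mode="hover"):
    """Toy hand-over logic between hover and fixed-wing controllers with
    hysteresis: switch to wing-borne flight only above a safe airspeed,
    fall back to the vertical propellers below stall speed."""
    if mode == "hover" and airspeed >= v_safe:
        mode = "fixed-wing"          # wings generate enough lift
    elif mode == "fixed-wing" and airspeed <= v_stall:
        mode = "hover"               # hand lift back to the rotors
    return mode

mode = "hover"
for v in [0, 5, 13, 16, 18, 14, 11, 3]:      # m/s, a toy speed profile
    mode = handover_control(v, mode=mode)
    print(f"airspeed {v:5.1f} m/s -> {mode} controller active")
```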

Keywords: flight test, flight characteristics, hybrid aircraft, unmanned aerial vehicle

Procedia PDF Downloads 304
878 Social Media Resignation: The Only Way to Protect User Data and Restore Cognitive Balance? A Literature Review

Authors: Rajarshi Motilal

Abstract:

The birth of the Internet and the rise of social media marked an important chapter in the history of humankind. Often termed the fourth scientific revolution, the Internet has changed human lives and cognition. The birth of Web 2.0, followed by the launch of social media and social networking sites, added another milestone to these technological advancements, in which connectivity and the influx of information became dominant. With billions of individuals using the Internet and social media sites in the 21st century, 'users' became 'consumers', and orthodox marketing reshaped itself into digital marketing. Furthermore, organisations started using sophisticated algorithms to predict consumer purchase behaviour and manipulate it to sustain themselves in such a competitive environment. The rampant storage and analysis of individual data became the new normal, raising many questions about data privacy. Excessive Internet usage brought further problems: individuals becoming addicted to it, scavenging for societal approval and instant gratification, subsequently leading to a collective dualism, isolation, and finally depression. This study aims to determine the relationship between social media usage in the modern age and the rise of psychological and cognitive imbalances in human minds. The literature review is a timely addition to the existing work, at a moment when the world is constantly debating whether social media resignation is the only way to protect user data and restore the decaying cognitive balance.

Keywords: social media, digital marketing, consumer behaviour, internet addiction, data privacy

Procedia PDF Downloads 55
877 Innovation and Employment in Sub-Saharan Africa: Evidence from Uganda Microdata

Authors: Milton Ayoki, Edward Bbaale

Abstract:

This paper analyses the relationship between innovation and employment at the firm level, with the objective of understanding the contribution of different innovation strategies to fostering employment growth in Uganda. We use the National Innovation Survey (micro-data on 705 Ugandan firms) for the period 2011-2014, closely follow the structured approach of Harrison et al. (2014), and relate employment growth to process innovations and to the growth of sales separately due to innovative and unchanged products. We find positive effects of product innovation on employment at the firm level, while process innovation has no discernible impact on employment. Although there is evidence to suggest displacement of labour in some cases where firms only introduce new processes, this effect is compensated by growth in employment from new products, which most firms introduce simultaneously with new processes. Results suggest that the source of innovation, as well as the size of innovating firms or the end users of innovation, matters for job growth. Innovation that develops from within the firm itself (the user) and involves larger firms has a greater impact on employment than innovation developed outside or within smaller firms. In addition, innovative firms are one and a half times more likely to survive in an innovation-driven economy than those that do not innovate. These results have important implications for policymakers and stakeholders in the innovation ecosystem. Supporting policies need to be correctly tailored, since the impacts depend on the innovation strategy (type) and on the characteristics and sector of the innovative firms (small, large, industry, etc.). Policies to spur investment, particularly in innovative sectors and firms with high growth potential, would have long-lasting effects on job creation. JEL Classification: D24, J0, J20, L20, O30.
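
The Harrison et al. (2014) approach relates employment growth, net of the sales growth of unchanged products, to a process-innovation dummy and the sales growth due to new products. A schematic version with statsmodels on simulated firm data is sketched below; the variable names, sample values and coefficients are assumptions for illustration, not survey results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm-level data standing in for the survey micro-data.
rng = np.random.default_rng(0)
n = 705
df = pd.DataFrame({
    "proc_only": rng.binomial(1, 0.3, n),      # process-innovation dummy
    "g_new": rng.exponential(0.1, n),          # sales growth, new products
})
# Assumed data-generating process: unit effect of g_new on employment
df["l_minus_g1"] = (-0.02 + 0.01 * df.proc_only + 1.0 * df.g_new
                    + rng.normal(0, 0.05, n))

# Schematic employment-growth equation: l - g1 = a0 + a1*d + b*g2 + u
res = smf.ols("l_minus_g1 ~ proc_only + g_new", data=df).fit()
print(res.params)   # b near 1 => new-product sales translate into jobs
```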

Keywords: employment, process innovation, product innovation, Sub-Saharan Africa

Procedia PDF Downloads 142
876 Enhancer: An Effective Transformer Architecture for Single Image Super Resolution

Authors: Pitigalage Chamath Chandira Peiris

Abstract:

A widely researched domain in the field of image processing in recent times has been single image super-resolution, which tries to restore a high-resolution image from a single low-resolution image. Many single image super-resolution efforts have been completed utilizing both traditional and deep learning methodologies, among a variety of others. Deep learning-based super-resolution methods, in particular, have received significant interest. As of now, the most advanced image restoration approaches are based on convolutional neural networks; nevertheless, only a few efforts have been made using Transformers, which have demonstrated excellent performance on high-level vision tasks. The effectiveness of CNN-based algorithms in image super-resolution has been impressive; however, these methods cannot completely capture the non-local features of the data. Enhancer is a simple yet powerful Transformer-based approach for enhancing the resolution of images. A method for single image super-resolution was developed in this study, utilizing an efficient and effective transformer design. The proposed architecture makes use of a locally enhanced window transformer block, using non-overlapping window-based self-attention to alleviate the enormous computational load of global self-attention. Additionally, it incorporates depth-wise convolution in the feed-forward network to enhance its ability to capture local context. The study is assessed by comparing the results obtained on popular datasets to those obtained by other techniques in the domain.
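
A compact PyTorch sketch of the two ingredients named above: self-attention restricted to non-overlapping windows, and a feed-forward network with a depth-wise convolution. Layer sizes, normalization choices and the block layout are assumptions for illustration, not the Enhancer architecture itself.

```python
import torch
import torch.nn as nn

class WindowAttention(nn.Module):
    """Multi-head self-attention inside non-overlapping windows, keeping
    attention cost linear in image size instead of quadratic."""
    def __init__(self, dim, window, heads):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                       # x: (B, C, H, W)
        B, C, H, W = x.shape
        w = self.window
        # partition into (B * n_windows, w*w, C) token groups
        t = x.view(B, C, H // w, w, W // w, w)
        t = t.permute(0, 2, 4, 3, 5, 1).reshape(-1, w * w, C)
        t, _ = self.attn(t, t, t)
        # reverse the partition back to (B, C, H, W)
        t = t.view(B, H // w, W // w, w, w, C)
        return t.permute(0, 5, 1, 3, 2, 4).reshape(B, C, H, W)

class DWConvFFN(nn.Module):
    """Feed-forward network with a depth-wise convolution in the middle,
    restoring local context lost by the window partitioning."""
    def __init__(self, dim, expand=2):
        super().__init__()
        hidden = dim * expand
        self.net = nn.Sequential(
            nn.Conv2d(dim, hidden, 1), nn.GELU(),
            nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden),
            nn.GELU(), nn.Conv2d(hidden, dim, 1))

    def forward(self, x):
        return self.net(x)

class WindowTransformerBlock(nn.Module):
    def __init__(self, dim=32, window=8, heads=4):
        super().__init__()
        self.norm1, self.norm2 = nn.GroupNorm(1, dim), nn.GroupNorm(1, dim)
        self.attn = WindowAttention(dim, window, heads)
        self.ffn = DWConvFFN(dim)

    def forward(self, x):                       # residual transformer block
        x = x + self.attn(self.norm1(x))
        return x + self.ffn(self.norm2(x))

x = torch.randn(1, 32, 64, 64)                  # H, W divisible by window
print(WindowTransformerBlock()(x).shape)        # torch.Size([1, 32, 64, 64])
```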

Keywords: single image super resolution, computer vision, vision transformers, image restoration

Procedia PDF Downloads 82
875 From Colonial Outpost to Cultural India: Folk Epics of India

Authors: Jyoti Brahma

Abstract:

Folk epics of India are found in various Indian languages. The study of folk epics and their importance to folkloristic study in India came into prominence only during the nineteenth century, when British administrators and missionaries collected and documented folk epics from various parts of the country. The paper is an attempt to investigate how the colonial outpost penetrated the interior of Indian land and society and triggered the Indian Renaissance. It takes into account the composition of the epics of India and the attention they received during the nineteenth century, which in turn gave rise to the national consciousness shaping the culture of India. Composed as oral traditions, these folk epics are now seen as repositories of historical consciousness, whereas in earlier times societies without literacy were said to be without history. So there is an urgent need to re-examine the British impact on Indian literary traditions. The Bhakti poets, through their nuanced responses in their efforts to change the behavior of Indian society, give us the perfect example of the deferment of any clear-cut distinction between the folk and the classical in the context of India: their work evades pure categorization and classification as classical, and constitutes part of the folk traditions of the cultural heritage of India. Therefore, the ethical question of what is ontologically known as ordinary discourse in the case of 'folk' forms, metaphors and folk language gains importance once more. The paper thus also seeks to outline the significant factors responsible for shaping the destiny of folklore in South India, particularly in the four political states of the Indian Union, Andhra Pradesh, Karnataka, Kerala and Tamil Nadu, which could be termed the South Indian 'cultural zones'.

Keywords: colonial, folk, folklore, tradition

Procedia PDF Downloads 293
874 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method

Authors: Arwa Alzughaibi

Abstract:

Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first need of such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. With richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that are able to detect human motion accurately under varying illumination levels and backgrounds. Different sets of features are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Locally Decorrelated Channel Features (LDCF) and Aggregate Channel Features (ACF). We propose an efficient and reliable human motion detection approach that combines histograms of oriented gradients (HOG) and local phase quantization (LPQ) as the feature set, and implements a search pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor and the histogram of oriented gradients performs well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The area under the ROC curve (AUC) of the proposed method reached 0.781 for the UCF dataset and 0.826 for the CDW dataset, which indicates that it performs comparably better than the HOG, DPM, LDCF and ACF methods.
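
The LPQ half of the feature set can be sketched compactly: a short-time Fourier transform at four low frequencies over a local window, sign quantization of the real and imaginary parts into an 8-bit code per pixel, and a 256-bin histogram. The sketch below omits the decorrelation step of the full descriptor; the window size and test image are illustrative assumptions.

```python
import numpy as np
from scipy.signal import convolve2d

def lpq_descriptor(img, win=7):
    """Local phase quantization sketch: STFT phase at four low
    frequencies, sign-quantized into an 8-bit code per pixel, then
    histogrammed (blur-insensitive texture features)."""
    r = (win - 1) // 2
    x = np.arange(-r, r + 1)
    w0 = np.ones_like(x, dtype=complex)
    w1 = np.exp(-2j * np.pi * x / win)       # frequency a = 1/win
    w2 = np.conj(w1)
    # Four frequency responses via separable (outer-product) kernels
    kernels = [np.outer(w0, w1), np.outer(w1, w0),
               np.outer(w1, w1), np.outer(w1, w2)]
    code = np.zeros((img.shape[0] - 2 * r, img.shape[1] - 2 * r), dtype=int)
    bit = 0
    for k in kernels:
        resp = convolve2d(img, k, mode="valid")
        code |= (resp.real > 0).astype(int) << bit
        code |= (resp.imag > 0).astype(int) << (bit + 1)
        bit += 2
    hist = np.bincount(code.ravel(), minlength=256).astype(float)
    return hist / hist.sum()                  # 256-bin LPQ histogram

img = np.random.default_rng(0).random((64, 64))
print(lpq_descriptor(img).shape)              # (256,)
```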

Keywords: human motion detection, histograms of oriented gradient, local phase quantization

Procedia PDF Downloads 234
873 Arithmetic Operations Based on Double Base Number Systems

Authors: K. Sanjayani, C. Saraswathy, S. Sreenivasan, S. Sudhahar, D. Suganya, K. S. Neelukumari, N. Vijayarangan

Abstract:

The Double Base Number System (DBNS) is an emerging system for representing a number using two bases, namely 2 and 3, and has applications in Elliptic Curve Cryptography (ECC) and the Digital Signature Algorithm (DSA). The previous binary representation method included only base 2. DBNS uses an approximation algorithm, namely a greedy algorithm. With this algorithm, the number of digits required to represent a large number is smaller than with the standard base-2 binary method, so computational speed is increased and time is reduced. The standard binary method uses the binary digits 0 and 1 to represent a number, whereas the DBNS method uses the digit 1 alone to represent any number (canonical form). The greedy algorithm can represent the number in two ways: one using only positive summands, and the other using both positive and negative summands. In this paper, these arithmetic operations are used for elliptic curve cryptography. The elliptic curve discrete logarithm problem is the foundation for most day-to-day elliptic curve cryptography, and it appears to be significantly harder than the discrete logarithm problem. In the elliptic curve digital signature algorithm, key generation requires 160 bits of data using the standard binary representation, whereas the number of bits required to generate the key can be reduced with the help of the double base number representation. In this paper, a new technique is proposed to generate the key during encryption and extract the key during decryption.
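
A minimal sketch of the greedy double-base expansion with positive summands: at each step, the largest number of the form 2^a * 3^b not exceeding the remainder is subtracted. The test value is arbitrary; the final line shows how few double-base terms are needed compared with the binary digit count.

```python
def greedy_dbns(n):
    """Greedy double-base expansion: repeatedly subtract the largest
    number of the form 2^a * 3^b not exceeding the remainder."""
    terms = []
    while n > 0:
        best, best_ab = 0, None
        p3, b = 1, 0
        while p3 <= n:
            a = (n // p3).bit_length() - 1   # largest a with 2^a*3^b <= n
            if (p3 << a) > best:
                best, best_ab = p3 << a, (a, b)
            p3 *= 3
            b += 1
        terms.append(best_ab)
        n -= best
    return terms

n = 841232
terms = greedy_dbns(n)
print(terms)                                 # [(a1, b1), (a2, b2), ...]
print(sum((2 ** a) * (3 ** b) for a, b in terms) == n)      # True
print(f"{len(terms)} double-base terms vs {n.bit_length()} binary bits")
```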

Keywords: cryptography, double base number system, elliptic curve cryptography, elliptic curve digital signature algorithm

Procedia PDF Downloads 375
872 Control of Base Isolated Benchmark Using Combined Control Strategy with Fuzzy Algorithm Subjected to Near-Field Earthquakes

Authors: Hashem Shariatmadar, Mozhgansadat Momtazdargahi

Abstract:

The purpose of controlling a structure against earthquakes is to dissipate the earthquake input energy to the structure and reduce the plastic deformation of structural members. There are different methods of structural control against earthquakes to reduce the structural response: active, semi-active, passive and hybrid. In this paper, two different combined control systems are used: the first comprises a base isolator and multiple tuned mass dampers (BI & MTMD), and the second a hybrid base isolator and multiple tuned mass dampers (HBI & MTMD), for controlling an eight-story isolated benchmark steel structure. The active control force of the hybrid isolator is estimated by fuzzy logic algorithms. The influences of the combined systems on the responses of the benchmark structure under two near-field earthquakes (Newhall & El Centro) are evaluated by nonlinear dynamic time history analysis. Combined control systems consisting of passive or active systems installed in parallel to base-isolation bearings have the capability of reducing the response quantities (relative and absolute displacement) of base-isolated structures significantly. Therefore, in the design and control of irregular isolated structures using the proposed control systems, the structural demands (relative and absolute displacement, etc.) in each direction must be considered separately.

Keywords: base-isolated benchmark structure, multi-tuned mass dampers, hybrid isolators, near-field earthquake, fuzzy algorithm

Procedia PDF Downloads 276
871 An Application of Integrated Multi-Objective Particle Swarm Optimization and Genetic Algorithm Metaheuristic through Fuzzy Logic for Optimization of Vehicle Routing Problems in Sugar Industry

Authors: Mukhtiar Singh, Sumeet Nagar

Abstract:

The vehicle routing problem (VRP) is a combinatorial optimization and nonlinear programming problem that aims to optimize decisions regarding a given set of routes for a fleet of vehicles in order to provide cost-effective and efficient delivery of both services and goods to the intended customers. This paper proposes the application of integrated particle swarm optimization (PSO) and genetic algorithm (GA) optimization to address the vehicle routing problem in the sugarcane industry in India. The sugar industry is a very prominent agro-based industry in India due to its impact on rural livelihoods, and it is estimated to employ around 5 lakh workers directly in sugar mills. Due to various inadequacies, inefficiencies and inappropriateness associated with the current vehicle routing model, the industry suffers heavy monetary losses, which need to be addressed in the proper context. The proposed algorithm utilizes the crossover operation that originally appears in the genetic algorithm (GA) to improve its flexibility and manipulation more readily and to avoid being trapped in a local optimum; simultaneously, to improve the convergence speed of the algorithm, level set theory is also added to it. We apply the hybrid approach to an example VRP and compare its results with those generated by the PSO, GA, and parallel PSO algorithms. The experimental comparison indicates that the performance of the hybrid algorithm is superior to the others, and that it will become an effective approach for solving discrete combinatorial problems.
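
A toy sketch of the PSO-GA hybrid on a single-vehicle route: particles are customer permutations, and the PSO "velocity" step is replaced by GA order crossover toward the personal and global bests, plus a swap mutation. Capacity constraints, the fuzzy logic layer and level set theory are omitted; the distance matrix is invented.

```python
import random

def route_cost(route, dist):
    pts = [0] + route + [0]                    # node 0 is the depot
    return sum(dist[a][b] for a, b in zip(pts, pts[1:]))

def ox_crossover(p1, p2, rng):
    """Order crossover (OX) from GA, used here as the PSO 'move' that
    pulls a particle toward its personal/global best."""
    i, j = sorted(rng.sample(range(len(p1)), 2))
    hole = set(p1[i:j])
    rest = [c for c in p2 if c not in hole]
    return rest[:i] + p1[i:j] + rest[i:]

def hybrid_pso_ga(dist, n_particles=30, iters=200, seed=0):
    rng = random.Random(seed)
    customers = list(range(1, len(dist)))
    swarm = [rng.sample(customers, len(customers))
             for _ in range(n_particles)]
    pbest = list(swarm)
    gbest = min(swarm, key=lambda r: route_cost(r, dist))
    for _ in range(iters):
        for k, x in enumerate(swarm):
            x = ox_crossover(pbest[k], x, rng)   # cognitive component
            x = ox_crossover(gbest, x, rng)      # social component
            i, j = rng.sample(range(len(x)), 2)  # mutation for diversity
            x[i], x[j] = x[j], x[i]
            swarm[k] = x
            if route_cost(x, dist) < route_cost(pbest[k], dist):
                pbest[k] = x
        gbest = min(pbest, key=lambda r: route_cost(r, dist))
    return gbest, route_cost(gbest, dist)

# Toy symmetric distance matrix: depot + 5 delivery points
D = [[0, 2, 9, 10, 7, 3], [2, 0, 6, 4, 8, 5], [9, 6, 0, 8, 6, 9],
     [10, 4, 8, 0, 5, 7], [7, 8, 6, 5, 0, 4], [3, 5, 9, 7, 4, 0]]
print(hybrid_pso_ga(D))
```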

Keywords: fuzzy logic, genetic algorithm, particle swarm optimization, vehicle routing problem

Procedia PDF Downloads 371
870 Analysis of Unmanned Aerial Vehicles' Incidents and Accidents: The Role of Human Factors

Authors: Jacob J. Shila, Xiaoyu O. Wu

Abstract:

As the applications of unmanned aerial vehicles (UAVs) continue to increase across the world, it is critical to understand the factors that contribute to incidents and accidents associated with these systems. Given the variety of daily applications that utilize UAV operations (e.g., medical, security operations, construction activities, landscape activities), the main discussion has been how to safely incorporate the UAV into the national airspace system. The types of UAV incidents being reported range from near sightings by other pilots to actual collisions with aircraft or other UAVs. These incidents have the potential to impact the rest of aviation operations in a variety of ways, including human lives, liability costs, and delay costs. One of the largest cited causes of these incidents is the human factor; other cited causes include maintenance, the aircraft, and others. This work investigates the key human factors associated with UAV incidents. To that end, data on UAV incidents that have occurred in the United States are reviewed and analyzed to identify key human factors related to UAV incidents. The data utilized in this work are gathered from the Federal Aviation Administration (FAA) drone database. This study adopts the human factors analysis and classification system (HFACS) to identify key human factors that have contributed to some of the UAV failures to date. The uniqueness of this work is its incorporation of UAV incident data from a variety of applications, not just military data. In addition, identifying the specific human factors is crucial for developing safety operational models and human factors guidelines for the UAV. The common human factors found are also compared with those of similar studies in other countries to determine whether these factors are common internationally.

Keywords: human factors, incidents and accidents, safety, UAS, UAV

Procedia PDF Downloads 219
869 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit

Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana

Abstract:

Internet of Things (IoT) devices and edge computing have become one of the biggest developments in innovation and among the most discussed for their potential to improve and disrupt traditional business and industry alike. New challenges have arisen, such as the COVID-19 pandemic, which endangered workforces and business processes; a drastically changed business landscape left in the aftermath of the global pandemic; a looming global energy crisis; global warming; and an overheating global politics that threatens to become a new Cold War. Emerging technologies such as edge computing, together with the use of specially designed visual processing units, present great opportunities for business. The literature is reviewed on how the Internet of Things and this disruptive wave will affect business: how these events affect current business, how businesses need to adapt to changes in the market and the world, and how benchmark testing of newer consumer-marketed devices, such as IoT devices equipped with edge computing hardware, can increase efficiency and reduce the risks posed by current and looming crises. Throughout the paper, we explain the technologies leading the current situation and why they will be innovations that change traditional practice, through brief introductions to cloud computing, edge computing and the Internet of Things, and how they will lead into the future.

Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification

Procedia PDF Downloads 129
868 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records

Authors: Sara ElElimy, Samir Moustafa

Abstract:

Mobile network operators have started to face many challenges in the digital era, especially with the high demands of customers. Since mobile network operators are considered a source of big data, traditional techniques are not effective in the new era of big data, the Internet of Things (IoT) and 5G; as a result, handling different big datasets effectively becomes a vital task for operators, given the continuous growth of data and the move from long term evolution (LTE) to 5G. So, there is an urgent need for effective big data analytics to predict future demands, traffic, and network performance to fulfil the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and the recurrent neural network (RNN) are employed in a data-driven application for mobile network operators. The main framework of these models includes identification of each model's parameters, estimation, prediction, and a final data-driven application of the prediction to business and network performance applications. These models are applied to the Telecom Italia Big Data Challenge call detail records (CDRs) datasets. The performance of these models, assessed using specific well-known evaluation criteria, shows that ARIMA (the machine learning-based model) is more accurate as a predictive model on such a dataset than the RNN (the deep learning model).
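
An outline of the ARIMA leg of the framework (identification, estimation, prediction) with statsmodels. The Telecom Italia CDRs are not bundled here, so the script simulates an hourly traffic series with daily seasonality; the order (2, 0, 1) and all other settings are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic hourly traffic standing in for a CDR time series (calls per
# cell per hour), with daily seasonality plus noise.
rng = np.random.default_rng(0)
idx = pd.date_range("2013-11-01", periods=24 * 28, freq="h")
y = pd.Series(100 + 30 * np.sin(2 * np.pi * idx.hour / 24)
              + rng.normal(0, 5, idx.size), index=idx)

train, test = y.iloc[:-24], y.iloc[-24:]       # hold out the last day
model = ARIMA(train, order=(2, 0, 1)).fit()    # identification -> estimation
forecast = model.forecast(steps=24)            # prediction step

mape = (np.abs(forecast.values - test.values) / test.values).mean() * 100
print(f"24-hour-ahead MAPE: {mape:.1f}%")
```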

Keywords: big data analytics, machine learning, CDRs, 5G

Procedia PDF Downloads 117
867 Artificial Intelligence in Melanoma Prognosis: A Narrative Review

Authors: Shohreh Ghasemi

Abstract:

Introduction: Melanoma is a complex disease with various clinical and histopathological features that impact prognosis and treatment decisions. Traditional methods of melanoma prognosis involve manual examination and interpretation of clinical and histopathological data by dermatologists and pathologists. However, the subjective nature of these assessments can lead to inter-observer variability and suboptimal prognostic accuracy. AI, with its ability to analyze vast amounts of data and identify patterns, has emerged as a promising tool for improving melanoma prognosis. Methods: A comprehensive literature search was conducted to identify studies that employed AI techniques for melanoma prognosis. The search included databases such as PubMed and Google Scholar, using keywords such as "artificial intelligence," "melanoma," and "prognosis." Studies published between 2010 and 2022 were considered. The selected articles were critically reviewed, and relevant information was extracted. Results: The review identified various AI methodologies utilized in melanoma prognosis, including machine learning algorithms, deep learning techniques, and computer vision. These techniques have been applied to diverse data sources, such as clinical images, dermoscopy images, histopathological slides, and genetic data. Studies have demonstrated the potential of AI in accurately predicting melanoma prognosis, including survival outcomes, recurrence risk, and response to therapy. AI-based prognostic models have shown comparable or even superior performance compared to traditional methods.

Keywords: artificial intelligence, melanoma, accuracy, prognosis prediction, image analysis, personalized medicine

Procedia PDF Downloads 53
866 An Approach for Association Rules Ranking

Authors: Rihab Idoudi, Karim Saheb Ettabaa, Basel Solaiman, Kamel Hamrouni

Abstract:

Medical association rule induction is used to discover useful correlations between pertinent concepts from large medical databases. Nevertheless, AR algorithms produce huge numbers of rules and do not guarantee the usefulness and interestingness of the generated knowledge. To overcome this drawback, we propose an ontology-based interestingness measure for AR ranking. According to domain experts, the goal of using ARs is to discover implicit relationships between items of different categories, such as 'clinical features and disorders', 'clinical features and radiological observations', etc. That is to say, itemsets composed of 'similar' items are uninteresting. Therefore, the dissimilarity between a rule's items can be used to judge the interestingness of association rules: the more different the items, the more interesting the rule. In this paper, we design a distinct approach for ranking semantically interesting association rules, involving the use of an ontology knowledge mining approach. The basic idea is to organize the ontology's concepts into a hierarchical structure of conceptual clusters of targeted subjects, where each cluster encapsulates 'similar' concepts suggesting a specific category of the domain knowledge. The interestingness of an association rule is then defined as the dissimilarity between the corresponding clusters: the further apart the clusters of the items in the AR, the more interesting the rule. We apply the method in our domain of interest, the mammographic domain, using an existing mammographic ontology called Mammo, with the goal of deriving interesting rules from past experiences and discovering implicit relationships between concepts modeling the domain.

Keywords: association rule, conceptual clusters, interestingness measures, ontology knowledge mining, ranking

Procedia PDF Downloads 301
865 Design of an Improved Distributed Framework for Intrusion Detection System Based on Artificial Immune System and Neural Network

Authors: Yulin Rao, Zhixuan Li, Burra Venkata Durga Kumar

Abstract:

Intrusion detection refers to monitoring the actions of internal and external intruders on a system and detecting behaviours that violate security policies in real time. In intrusion detection, there has been much discussion about the application of neural network technology and artificial immune systems (AIS). However, many solutions use static methods (signature-based detection and stateful protocol analysis) or centralized intrusion detection systems (CIDS), which are unsuitable for real-time intrusion detection systems that need to process large amounts of data and detect unknown intrusions. This article proposes a framework for a distributed intrusion detection system (DIDS) with multiple agents, based on the concepts of AIS and neural network technology, to detect anomalies and intrusions. In this framework, multiple agents are assigned to each host and work together, improving the system's detection efficiency and robustness. The trainer agent in the central server of the framework uses an artificial neural network (ANN), rather than the negative selection algorithm of AIS, to generate mature detectors. Mature detectors can distinguish between self-files and non-self-files after learning. The analyzer agents use genetic algorithms to generate memory cell detectors. This kind of detector effectively reduces false positive and false negative errors and acts quickly on known intrusions.
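
A toy sketch of the trainer agent's detector-generation step, with a small neural network standing in for negative selection: it learns to separate "self" from "non-self" feature vectors and then flags unseen activity. The feature distributions, sizes and library choice (scikit-learn) are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy feature vectors standing in for system-call / traffic features:
# "self" behaviour clusters near the origin, "non-self" away from it.
rng = np.random.default_rng(0)
self_data = rng.normal(0.0, 0.5, size=(500, 8))
nonself_data = rng.normal(2.0, 0.7, size=(500, 8))
X = np.vstack([self_data, nonself_data])
y = np.array([0] * 500 + [1] * 500)            # 0 = self, 1 = non-self

# Trainer agent: an ANN takes the place of AIS negative selection,
# producing a "mature detector" that separates self from non-self.
detector = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000,
                         random_state=0).fit(X, y)

# Analyzer-agent side: flag unseen activity vectors as intrusions.
new_activity = rng.normal(1.8, 0.7, size=(5, 8))
print("intrusion flags:", detector.predict(new_activity))
```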

Keywords: artificial immune system, distributed artificial intelligence, multi-agent, intrusion detection system, neural network

Procedia PDF Downloads 88