Search results for: processing schemes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4168

3148 Optimization of Strategies and Models Review for Optimal Technologies-Based on Fuzzy Schemes for Green Architecture

Authors: Ghada Elshafei, A. Elazim Negm

Abstract:

Green architecture has recently become a significant pathway to a sustainable future. Green building design involves finding the balance between comfortable homebuilding and a sustainable environment. Moreover, new technologies such as artificial intelligence techniques are used to complement current practices in creating greener structures that keep the built environment more sustainable. The most common objective is that green buildings should be designed to minimize the overall impact of the built environment on ecosystems in general, and particularly on human health and the natural environment. This leads to protecting occupant health, improving employee productivity, reducing pollution, and sustaining the environment. In green building design, multiple parameters, which may be interrelated, contradictory, vague, and of a qualitative/quantitative nature, are broadly used. This paper presents a comprehensive, critical state-of-the-art review of current practices based on fuzzy techniques and their combinations. It also presents how green architecture/buildings can be improved using the technologies that have been applied to seek optimal green strategies and models, assisting in making the best possible decision among different alternatives.

Keywords: green architecture/building, technologies, optimization, strategies, fuzzy techniques, models

Procedia PDF Downloads 457
3147 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction

Authors: Qais M. Yousef, Yasmeen A. Alshaer

Abstract:

Over the last few years, the amount of data available around the globe has increased rapidly. This coincided with the emergence of concepts such as big data and the Internet of Things, which have made data available all over the world. However, managing this massive amount of data remains a challenge due to its large variety of types and its distribution. Therefore, locating a required file, particularly on the first attempt, is not an easy task, owing to the large similarity of names among different files distributed on the web. Consequently, the accuracy and speed of search have been negatively affected. This work presents a method that uses electroencephalography (EEG) signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work analyzes the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and to retrieve them as the first choice for the user.
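
A minimal sketch of the kind of pipeline the abstract describes, feature extraction from EEG epochs followed by a neural-network classifier, is shown below; the sampling rate, frequency bands, and synthetic data are assumptions, and the authors' multi-objective metaheuristic feature selection is not reproduced.

```python
# Minimal sketch (not the authors' pipeline): band-power features from EEG
# epochs followed by a neural-network classifier. Data here are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

FS = 256  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

def band_powers(epoch):
    """Return mean spectral power in each band for one EEG epoch."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / FS)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Synthetic stand-in: 200 one-second epochs, two target "file" classes.
rng = np.random.default_rng(0)
X = np.array([band_powers(rng.standard_normal(FS)) for _ in range(200)])
y = rng.integers(0, 2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```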

Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization

Procedia PDF Downloads 164
3146 Economic Growth through Quality in Higher Education

Authors: Mohammad Mushir Khan, C. Satyanarayana

Abstract:

Education is considered one of the prime bottlenecks in the economic growth of India. The Ministry of Human Resource Development, Government of India, has therefore given special attention to this issue, and the Gross Enrollment Ratio (GER) in higher education has increased marginally during the last five years through these efforts and various policy decisions, such as the Right to Education (RTE) and other fee reimbursement schemes initiated by the state governments. But it is still one of the lowest when assessed at the global level. It is true that the GER has improved, but the survey reveals that quality has been badly affected. This paper tries to assess the impact of the lack of quality education on the various sectors that affect the Indian economy and thereby signifies the need for immediate policy decisions at the government level. It is to be noted that in higher education, science, management, engineering, and technology play a vital role as far as shaping the country's economy is concerned, and as such, quality needs to be addressed particularly in these streams. The paper, after carefully studying numerous survey reports and other government/non-government documents, recommends measures to be initiated by the Central Government, on priority, for improving the quality of education. Quality upgradation in higher education can single-handedly provide real fuel to India's growth engine, as it has the potential to touch each and every sector that strengthens the country's economy.

Keywords: higher education, economy, accreditation, industry, technology

Procedia PDF Downloads 412
3145 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform

Authors: David Jurado, Carlos Ávila

Abstract:

Detection of early signs of breast cancer development is crucial to quickly diagnose the disease and to define an adequate treatment that increases the survival probability of the patient. Computer-Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NN), have shown an overall improvement in digital mammography cancer diagnosis, reducing false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML- and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in the early stages. Along with image processing, automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design, which extracts statistical features of the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images due to its low sensitivity to noisy images and low-contrast mammograms.
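
A sketch of the circle-detection step only (not the authors' full CAD pipeline) is given below using OpenCV's circle Hough transform; the file name and all parameter values are assumptions for illustration.

```python
# Illustrative sketch: edge extraction followed by the circle Hough
# transform to flag small, roughly circular, high-contrast objects.
# File name and parameters are placeholders, not the study's settings.
import cv2
import numpy as np

img = cv2.imread("mammogram_roi.png", cv2.IMREAD_GRAYSCALE)
img = cv2.medianBlur(img, 5)                      # suppress speckle noise
edges = cv2.Canny(img, 50, 150)                   # edge map (for inspection)

circles = cv2.HoughCircles(
    img, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
    param1=150, param2=20, minRadius=1, maxRadius=15)  # small bright objects

mask = np.zeros_like(img)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(mask, (x, y), r, 255, thickness=-1)  # candidate regions

# Statistical features of each candidate region could then be extracted
# from img[mask == 255] for further classification.
```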

Keywords: breast cancer, segmentation, X-ray imaging, hough transform, image analysis

Procedia PDF Downloads 66
3144 The Value of Online News: Addressing the Problem of Online Investment Fraud Crimes in Thailand

Authors: Thapthep Paprach, Benya Lertsuwan

Abstract:

Investment fraud is not a new crime, but it continues to claim more victims in the Internet of Things era. This kind of crime has been classified as a national and transnational financial crime problem all over the world. Thailand has also been affected by this kind of crime. This research examines whether the mass media, which are expected to cover news about online investment scams, recognized this crime and warned Thais about it. Thus, this study explores the value of news about investment fraud in terms of frequency. The methodology uses web crawling of the five most-visited news agency websites to pull out all reports about investment fraud. The findings revealed that the 'Khaosod' news agency ranked first in reporting on investment crime, while 'Matichon' reported on it the least. The 'Thairat' news agency frequently reported such crimes from midnight to very early in the morning, while other news agencies reported during the daytime. The frequency of news reports about investment fraud and the monthly number of victim reports are not correlated: although the most cases were reported to the Thai police in February 2023, the most news reports appeared in January 2023. In conclusion, there might even be a negative relationship between the amount of investment fraud news reported and the number of victims.

Keywords: investment fraud, news value, online news report, Ponzi schemes, Romance scam

Procedia PDF Downloads 63
3143 Contribution of Remote Sensing and GIS to the Study of the Impact of the Salinity of Sebkhas on the Quality of Groundwater: Case of Sebkhet Halk El Menjel (Sousse)

Authors: Gannouni Sonia, Hammami Asma, Saidi Salwa, Rebai Noamen

Abstract:

Water resources in Tunisia have experienced quantitative and qualitative degradation, especially in wetlands and sebkhas. The objective of this work is to study the spatio-temporal evolution of salinity over 29 years (from 1987 to 2016). A study of the connection between surface water and groundwater is necessary to know the degree of influence of the sebkha brines on the water table. The evolution of surface salinity is determined by remote sensing based on Landsat TM and OLI/TIRS satellite images from 1987, 2007, 2010, and 2016. The processing of these images allowed us to determine the NDVI (Normalized Difference Vegetation Index), the salinity index, and the surface temperature around the sebkha. In addition, through a geographic information system (GIS), we could establish a map of the distribution of salinity in the subsurface of the water table of Chott Mariem and Hergla/Sidi Bou Ali/Kondar. The results of the image processing and the calculation of the indices and surface temperature show an increase in salinity downstream of the sebkha and the development of vegetation cover upstream and in the western part of the sebkha. This may be due both to contamination by seawater infiltrating from the barrier beach of Hergla and to the passage of groundwater towards the sebkha.
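
A minimal sketch of the index computations is shown below; the band files are assumed Landsat 8 reflectance rasters read with rasterio, and since the abstract does not state which salinity index was used, SI = sqrt(green × red) is only an illustrative choice.

```python
# Minimal sketch of NDVI and one common salinity index from Landsat bands.
# File names, the rasterio reader, and the choice of salinity index are
# assumptions for illustration only.
import numpy as np
import rasterio

def read_band(path):
    with rasterio.open(path) as src:
        return src.read(1).astype("float32")

red = read_band("LC08_B4.tif")      # Landsat 8 OLI red band (assumed file)
nir = read_band("LC08_B5.tif")      # near-infrared band
green = read_band("LC08_B3.tif")    # green band

ndvi = (nir - red) / np.where((nir + red) == 0, np.nan, nir + red)
salinity_index = np.sqrt(green * red)

print("NDVI range:", np.nanmin(ndvi), np.nanmax(ndvi))
```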

Keywords: spatio-temporal monitoring, salinity, satellite images, NDVI, sebkha

Procedia PDF Downloads 116
3142 A Unified Approach for Naval Telecommunication Architectures

Authors: Y. Lacroix, J.-F. Malbranque

Abstract:

We present a chronological evolution of naval telecommunication networks. We distinguish several periods: with or without multiplexers, with switch systems, with federative systems, with medium switching, and with medium switching plus wireless networks. This highlights the introduction of new layers and technologies in the architecture. These architectures are presented using layer models of transmission, in a unified way, which enables us to integrate pre-existing models. A ship of a naval fleet has internal communications (i.e., the applications' networks of the edge) and external communications (i.e., the use of the means of transmission between edges). We propose architectures, deduced from the layer model, which are the point of convergence between the networks on board and the HF, UHF radio, and satellite resources. This modelling allows us to consider end-to-end naval communications and, more globally, communications from the user on board to the user on shore, including transmission and networks on the shore side. The new architectures need to take care of quality of service for end-to-end communications, all the more so as remote control is developing rapidly and will continue to do so in the future. Naval telecommunications will become more and more complex and will use more and more advanced technologies; it will thus be necessary to establish clear global communication schemes to ensure consistency of the architectures. Our latest model has been implemented in a military naval situation and serves as the basic architecture for the RIFAN2 network.

Keywords: naval telecommunications, network architecture, layer model, HF/UHF/satellite transmission, RIFAN2

Procedia PDF Downloads 281
3141 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is "Alphabet Recognition Using Pixel Probability Distribution". The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten, or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. One popular mobile application reads a visiting card and directly stores it in the contacts. OCRs are also known to be used in radar systems for reading speeding vehicles' license plates, among many other uses. The implementation of our project has been done using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: this module covers database generation. The database was generated using two methods: (a) run-time generation, in which the database is generated at compile time using the built-in fonts of the OpenCV library; human intervention is not necessary for generating this database; (b) contour detection, in which a JPEG template containing different fonts of an alphabet is converted to a weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to be stored (119 KB precisely). (2) Preprocessing: the input image is pre-processed using image processing concepts such as adaptive thresholding, binarization, and dilation, and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
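
A sketch of the pre-processing and segmentation stage (adaptive thresholding, dilation, contour extraction) is shown below; the file name and parameter values are assumptions, and the neural-network stage is omitted.

```python
# Illustrative sketch of pre-processing and letter segmentation with OpenCV.
# The input file and thresholds are placeholders; classification is omitted.
import cv2
import numpy as np

img = cv2.imread("text_page.png", cv2.IMREAD_GRAYSCALE)
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 31, 10)
binary = cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=1)

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
letters = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w * h > 50:                       # ignore tiny specks
        glyph = cv2.resize(binary[y:y + h, x:x + w], (16, 16))
        letters.append(glyph / 255.0)    # normalized "weight matrix" input

print("segmented candidate letters:", len(letters))
```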

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 373
3140 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Based on Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran

Authors: M. Ahmadi, M. Kafil, H. Ebrahimi

Abstract:

Nowadays, induction motors play a significant role in industry. Condition monitoring (CM) of this equipment has gained remarkable importance during recent years due to huge production losses, substantial imposed costs, and increases in vulnerability, risk, and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM. This method can be used for the detection of broken rotor bars. Signal processing methods such as the Fast Fourier Transform (FFT), wavelet transform, and Empirical Mode Decomposition (EMD) are used for analyzing MCSA output data. In this study, these signal processing methods are used for broken bar detection in Mobarakeh Steel Company induction motors. Based on the wavelet transform, an index for fault detection, CF, is introduced, which is the variation of the maximum relative to the mean of the wavelet transform coefficients. We find that, in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and we find that when motor bars become broken, the energy of the IMFs increases.
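
A minimal sketch of such an index is given below, interpreting CF as the ratio of the maximum to the mean of the absolute wavelet detail coefficients; the wavelet family, decomposition level, and test signals are assumptions, not the study's data.

```python
# Minimal sketch of a wavelet-based fault index; 'db8' at level 4 and the
# synthetic current signals are illustrative assumptions.
import numpy as np
import pywt

fs = 1000                                   # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
current = np.sin(2 * np.pi * 50 * t)        # stand-in for a stator current

def cf_index(signal, wavelet="db8", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    detail = np.abs(np.concatenate(coeffs[1:]))   # detail coefficients only
    return detail.max() / detail.mean()

print("CF (healthy stand-in):", cf_index(current))
# Stand-in with sidebands around the supply frequency, as broken bars produce:
faulty = current + 0.05 * np.sin(2 * np.pi * 44 * t) + 0.05 * np.sin(2 * np.pi * 56 * t)
print("CF (faulty stand-in): ", cf_index(faulty))
```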

Keywords: broken bar, condition monitoring, diagnostics, empirical mode decomposition, fourier transform, wavelet transform

Procedia PDF Downloads 140
3139 Modernization of Garri-Frying Technologies with Respect to Women's Anthropometric Qualities in Nigeria

Authors: Adegbite Bashiru Adeniyi, Olaniyi Akeem Olawale, Ayobamidele Sinatu Juliet

Abstract:

The study was carried out in six South Western states of Nigeria to analyze the socio-economic characteristics of garri processors and their anthropometric qualities with respect to the modern technologies used in garri processing. About 20 respondents were randomly selected from each of the six workstations purposively chosen for the study because their daily processing activities already attracted high patronage from customers. These are Oguntolu village (Ogun State), Igoba-Akure (Ondo State), Imo-Ilesa (Osun State), Odo Oba-Ileri (Oyo State), Irasa village (Ekiti State), and Epe (Lagos State). An interview schedule was administered to 120 respondents to elicit information. Data were analyzed using descriptive statistical tools. The findings show that respondents were in their most productive age range (36-45 years), except in Ogun State, where the majority (45%) were older than 45 years. Fewer processors were younger than 26 years old. The findings further revealed that not less than 55% had a body weight greater than 50.0 kilograms, and not less than 70% were taller than 1.5 meters. Likewise, the hand length and hand thickness of the majority were long and bulky, which is considered suitable for operating some modern and improved technologies in the garri-frying process. This information could be used by technology developers to enhance the production of modern equipment and tools for greater efficiency.

Keywords: agro-business, anthropometric, modernization, proficiency

Procedia PDF Downloads 496
3138 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

A practical and efficient approach is suggested for estimating the instantaneous bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of estimation lag. In other words, reliable estimation of the coordinates of a monitored object requires time to collect observations by means of the C-OTDR system, and only once the required sample volume has been collected can the final decision be issued. But this is contrary to the requirements of many real applications. For example, in rail traffic management systems, we need data on the localization of dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of the C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called 'signaling parameters' (SP). There are several SPs that carry information about the instantaneous localization of dynamic objects for each C-OTDR channel. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are unstable; on the other hand, when an SP is very stable, it tends, as a rule, to become insensitive. This report describes a method for co-processing the SPs that is designed to obtain the most effective estimates of dynamic object localization within the C-OTDR monitoring system framework.

Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems

Procedia PDF Downloads 456
3137 Drivers of Farmers' Contract Compliance Behaviour: Evidence from a Case Study of Dangote Tomato Processing Plant in Northern Nigeria

Authors: Umar Shehu Umar

Abstract:

Contract farming is a viable strategy that agribusinesses rely on to strengthen vertical coordination. However, low contract compliance remains a significant setback to agribusinesses' contract performance. The present study aims to understand what drives smallholder farmers' contract compliance behaviour. Qualitative information was collected through focus group discussions to enrich the design of the survey questionnaire administered to a sample of 300 randomly selected farmers contracted by the Dangote Tomato Processing Plant (DTPP) in four regions of northern Nigeria. Novel transaction-level data on tomato sales covering one season were collected, in addition to socio-economic information on the sampled farmers. Binary logistic model results revealed that open fresh-market tomato prices and payment delays negatively affect farmers' compliance behaviour, while quantity harvested, education level, and input provision correlate positively with compliance. The study suggests that contract compliance will increase if contracting firms devise a reliable and timely payment plan (e.g., digital payment) and continue input and service provisions (e.g., improved seeds, extension services) and incentives (e.g., loyalty rewards, bonuses) in the contract.
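
An illustrative sketch of such a binary logit is given below; the variable names mirror the abstract, but the data frame is a hypothetical stand-in rather than the survey data, and regularized logistic regression from scikit-learn is used instead of the authors' estimation procedure.

```python
# Illustrative sketch of a binary compliance model; the rows are invented
# placeholders, one per contracted farmer, not the study's observations.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "complied":       [1, 0, 1, 1, 0, 1, 0, 1],
    "open_mkt_price": [55, 80, 50, 60, 90, 52, 85, 58],            # spot price
    "payment_delay":  [3, 21, 5, 2, 30, 4, 25, 6],                 # days
    "qty_harvested":  [4.0, 2.5, 5.1, 4.4, 2.0, 4.8, 2.2, 5.0],    # tonnes
    "education_yrs":  [9, 4, 12, 10, 3, 11, 5, 12],
    "inputs_given":   [1, 0, 1, 1, 0, 1, 0, 1],
})

X, y = df.drop(columns="complied"), df["complied"]
clf = LogisticRegression(max_iter=1000).fit(X, y)

# The sign of each coefficient indicates the direction of the effect.
for name, coef in zip(X.columns, clf.coef_[0]):
    print(f"{name:>14}: {coef:+.3f}")
```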

Keywords: contract farming, compliance, farmers and processors, smallholder

Procedia PDF Downloads 36
3136 Tracking and Classifying Client Interactions with Personal Coaches

Authors: Kartik Thakore, Anna-Roza Tamas, Adam Cole

Abstract:

The World Health Organization (WHO) reports that by 2030 more than 23.7 million deaths annually will be caused by cardiovascular diseases (CVDs), with a 2008 economic impact of $3.76 T. Metabolic syndrome is a disorder of multiple metabolic risk factors strongly implicated in the development of cardiovascular diseases. Guided lifestyle intervention driven by live coaching has been shown to have a positive impact on metabolic risk factors. Individuals' paths to improved (decreased) metabolic risk factors are driven by personal motivation and by personalized messages delivered by coaches and augmented by technology. Using interactions captured between 400 individuals and 3 coaches over a program period of 500 days, a preliminary model was designed. A novel real-time event tracking system was created to track and classify clients based on their genetic profile, baseline questionnaires, and usage of a mobile application with live coaching sessions. Classification of clients and coaches was done using a support vector machine application built on Apache Spark, the Stanford Natural Language Processing library, and decision modeling.
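
A minimal sketch of the classification stage is shown below, using TF-IDF features from coaching messages fed to a linear SVM; the example messages and class labels are hypothetical, and the authors' system additionally runs on Apache Spark with genetic and questionnaire features.

```python
# Minimal sketch: text classification of client interactions with a linear
# SVM. Messages and labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

messages = [
    "Logged a 30 minute walk and a healthy lunch today",
    "Skipped the gym again, feeling unmotivated",
    "Blood pressure reading was lower this week",
    "I keep forgetting to track my meals",
]
labels = ["engaged", "at_risk", "engaged", "at_risk"]   # hypothetical classes

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(messages, labels)
print(clf.predict(["Went for a run and logged breakfast"]))
```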

Keywords: guided lifestyle intervention, metabolic risk factors, personal coaching, support vector machines application, Apache Spark, natural language processing

Procedia PDF Downloads 421
3135 Evaluation of Different Cowpea Genotypes Using Grain Yield and Canning Quality Traits

Authors: Magdeline Pakeng Mohlala, R. L. Molatudi, M. A. Mofokeng

Abstract:

Cowpea (Vigna unguiculata (L.) Walp) is an important annual leguminous crop in semi-arid and tropical regions. Most cowpea grain production in South Africa is used for domestic consumption and seed, and little or none is used in industrial processing; thus, there is a need to expand the utilization of cowpea through industrial processing. Agronomic traits contribute to the understanding of the association between yield and its component traits, facilitating effective selection for yield improvement. The aim of this study was to evaluate cowpea genotypes using grain yield and canning quality traits. The field experiment was conducted at two locations in Limpopo Province, namely the Syferkuil Agricultural Experimental Farm and Ga-Molepo village, during the 2017/2018 growing season, and canning took place at ARC-Grain Crops, Potchefstroom. The experiment comprised 100 cowpea genotypes laid out in a Randomized Complete Block Design (RCBD). The grain yield, yield components, and canning quality traits were analysed using Genstat software. About 62 genotypes were suitable for canning; 38 were not, owing to their seed coat texture and a water uptake of less than 80%, which resulted in overly soft (mushy) seeds. Grain yield for RV115, 99k-494-6, ITOOK1263, RV111, RV353, and 53 other genotypes showed a high positive association with number of branches, pods per plant, number of seeds per pod, unshelled weight, and shelled weight at Syferkuil compared with Ga-Molepo; these genotypes are therefore recommended for canning quality.

Keywords: agronomic traits, canning quality, genotypes, yield

Procedia PDF Downloads 136
3134 Socialist Ideology in Africa: A Comparative Study of Pre and Post Socialism

Authors: Haymanot Gebre-Amlak, Selamawit Gebre-Amlak

Abstract:

Since its original publication in the 19th century, Karl Marx and Friedrich Engels's Communist Manifesto has become one of the most influential political tracts. Socialism is a political path that leads towards communism by fostering a cooperative economy through the creation of cooperative enterprises, common ownership, state ownership, or shared equity. The ultimate objective of communism is to bring everyone to work toward the same collective goal of a healthy, happy, and free society. The European establishment of 19th-century colonial rule over the continent of Africa reinforced inflows of European investment and forced a profound change in the operation of labor and land markets. The colonial era and forced labor schemes in Africa lasted for several decades. When exiting colonialism, many African countries were attracted to socialist ideology to bridge the social gap and bring freedom to their societies. In this paper, we compare pre- and post-socialist periods and the ideology's impact in various African countries. We analyze the different aspects that led to inconsistent outcomes. Our findings indicate that, while they have some facets in common, each African country had a unique interpretation of, and was influenced differently by, socialist ideology.

Keywords: African politics, socialism in Africa, African history, Africa

Procedia PDF Downloads 155
3133 Genetic Evaluation of a Local Sheep Flock in Gabaraka Village

Authors: Salim Omar Raoof

Abstract:

This study was conducted in a private local sheep herd at Gabaraka village, Kirkuk, Iraq. Records of 77 ewes and 7 rams of local sheep kept at the Gabaraka village farm were analyzed; the age of the ewes ranged between 2 and 4 years. The aim of this study is to investigate the genetic and non-genetic factors (type of birth, sex, and age of dam) affecting daily milk yield (DMY), birth weight (BW), weaning weight (WW), and gain of local sheep raised under Iraqi conditions, and to estimate heritabilities and breeding values (BLUP). The overall means of daily milk yield, BW, WW, and gain were 444.15 g, 4.92 kg, 43.08 kg, and 38.16 kg, respectively. The results showed a significant effect of type of birth and sex on BW and WW. Also, the age of the dam had a significant effect on daily milk yield, BW, WW, and gain. Generally, the heritability estimates of DMY, BW, WW, and gain were 0.22, 0.17, 0.27, and 0.22, respectively. The breeding values (BLUP) of rams ranged from -0.1684 to 0.188, from -0.205 to 0.310, and from -0.0171 to 0.029 for the lamb growth traits BW, WW, and gain, respectively. It is concluded that the selection of ewes and rams at the population level in planned selection schemes should be based on BLUP values and heritability.

Keywords: local sheep, milk yield, genetic parameters, BLUP value

Procedia PDF Downloads 65
3132 Restoration of Digital Design Using Row and Column Major Parsing Technique from the Old/Used Jacquard Punched Cards

Authors: R. Kumaravelu, S. Poornima, Sunil Kumar Kashyap

Abstract:

The optimized and digitized restoration of information from old, used, manually prepared jacquard punched cards in the textile industry is performed by a Jacquard Punch Card (JPC) reader. In this paper, we present the design and development of a novel photoelectronics-based system for reading old and used punched cards and storing their binary information before transforming it into an effective image file format. In the textile industry, the jacquard punched card holes have diameters of 3 mm and 5 mm with a 5.5 mm pitch. Before the adoption of computing systems in the textile industry, these punched cards were prepared manually without a digital design source, yet they hold rich woven designs. The idea is to retrieve the binary information from the jacquard punched cards and store it in a digital (non-graphics) format before processing. After processing, the digital (non-graphics) format is converted into an effective image file format through either the row-major or the column-major parsing technique. To accomplish these activities, an embedded-system-based device with integrated software was developed. As part of the test and trial activity, the device was tested and installed for industrial service at the Weavers' Service Centre, Kanchipuram, Tamil Nadu, India.
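
A minimal sketch of the row-major versus column-major parsing step is shown below; the hole pattern here is a hypothetical 0/1 matrix, and the photoelectronic reading stage and card geometry handling are omitted.

```python
# Minimal sketch: serialize a punched-card hole matrix in row-major or
# column-major order, then rebuild and save the design as an image.
import numpy as np
from PIL import Image

card = np.array([[1, 0, 1, 1],
                 [0, 1, 0, 0],
                 [1, 1, 0, 1]], dtype=np.uint8)   # hypothetical hole pattern

row_major = card.flatten(order="C")      # read row by row
col_major = card.flatten(order="F")      # read column by column
print(row_major, col_major, sep="\n")

# Reshape the chosen stream back to the card geometry and write an image file.
design = (row_major.reshape(card.shape) * 255).astype(np.uint8)
Image.fromarray(design).save("restored_design.png")
```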

Keywords: file system, SPI, UART, ARM controller, jacquard, punched card, photo LED, photodiode

Procedia PDF Downloads 154
3131 Identification of Lipo-Alkaloids and Fatty Acids in Aconitum carmichaelii Using Liquid Chromatography–Mass Spectrometry and Gas Chromatography–Mass Spectrometry

Authors: Ying Liang, Na Li

Abstract:

Lipo-alkaloids are a class of C19-norditerpenoid alkaloids found in Aconitum species, usually containing an aconitane skeleton and one or two fatty acid residues. Their structures are very similar to those of diester-type alkaloids, which are considered the main bioactive components in Aconitum carmichaelii. They have anti-inflammatory, anti-nociceptive, and anti-proliferative activities. So far, more than 200 lipo-alkaloids have been reported from plants, semisynthesis, and biotransformations. In our research, by combining ultra-high performance liquid chromatography-quadrupole-time of flight mass spectrometry (UHPLC-Q-TOF-MS) with an in-house database, 148 lipo-alkaloids were identified from A. carmichaelii, including 93 potential new compounds and 38 compounds with oxygenated fatty acid moieties. To our knowledge, this is the first report of oxygenated fatty acids as side chains in naturally occurring lipo-alkaloids. Considering that the fatty acid residues in lipo-alkaloids should come from the free acids in the plant, the fatty acids and their relationship with the lipo-alkaloids were further investigated by GC-MS and LC-MS. Among the 17 fatty acids identified by GC-MS, 12 were detected as side chains of lipo-alkaloids; these accounted for about 1/3 of the total lipo-alkaloids, while the corresponding fatty acid residues made up less than 1/4 of the total fatty acid residues. In total, 37 fatty acids were determined by UHPLC-Q-TOF-MS, including 18 oxidized fatty acids identified for the first time from A. carmichaelii. These fatty acids were observed as side chains of lipo-alkaloids. In addition, although over 140 lipo-alkaloids were identified, six lipo-alkaloids, 8-O-linoleoyl-14-benzoylmesaconine (1), 8-O-linoleoyl-14-benzoylaconine (2), 8-O-palmitoyl-14-benzoylmesaconine (3), 8-O-oleoyl-14-benzoylmesaconine (4), 8-O-pal-benzoylaconine (5), and 8-O-ole-benzoylaconine (6), were found to be the main components, accounting for over 90% of the total lipo-alkaloid content. Therefore, using these six components as standards, a UHPLC-Triple Quadrupole-MS (UHPLC-QQQ-MS) approach was established to investigate the influence of processing on the contents of lipo-alkaloids. Although it is commonly supposed that the contents of lipo-alkaloids increase after processing, our research showed no significant change before and after processing. Using the same methods, the lipo-alkaloids in the lateral roots of A. carmichaelii and the roots of A. kusnezoffii were determined and quantified. The contents of lipo-alkaloids in A. kusnezoffii were close to those of the parent roots of A. carmichaelii, while the lateral roots had fewer lipo-alkaloids than the parent roots. This work was supported by the Macao Science and Technology Development Fund (086/2013/A3 and 003/2016/A1).

Keywords: Aconitum carmichaelii, fatty acids, GC-MS, LC-MS, lipo-alkaloids

Procedia PDF Downloads 289
3130 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is made manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an artificial neural network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
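
An illustrative sketch of the sliding-window feature extraction and multilayer perceptron classification described above is given below; the EEG trace, window length, features, and labels are synthetic stand-ins, not the Training Builder application or the public dataset used by the authors.

```python
# Minimal sketch: per-window features from a synthetic EEG trace fed to an
# MLP classifier. All data and parameters are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

fs, win = 256, 256                       # assumed sampling rate, 1 s window
rng = np.random.default_rng(1)
eeg = rng.standard_normal(fs * 60)       # one minute of synthetic signal
labels = rng.integers(0, 2, size=eeg.size // win)   # per-window labels

def window_features(w):
    # A few simple per-window features: mean, std, line length, peak frequency.
    freqs = np.fft.rfftfreq(w.size, 1.0 / fs)
    psd = np.abs(np.fft.rfft(w)) ** 2
    return [w.mean(), w.std(), np.abs(np.diff(w)).sum(), freqs[psd.argmax()]]

X = np.array([window_features(eeg[i * win:(i + 1) * win])
              for i in range(eeg.size // win)])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```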

Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing

Procedia PDF Downloads 172
3129 Application to Monitor the Citizens for Corona and Get Medical Aids or Assistance from Hospitals

Authors: Vathsala Kaluarachchi, Oshani Wimalarathna, Charith Vandebona, Gayani Chandrarathna, Lakmal Rupasinghe, Windhya Rankothge

Abstract:

It is the fundamental function of a monitoring system to allow users to collect and process data. A worldwide threat, the corona outbreak has wreaked havoc in Sri Lanka, and the situation has gotten out of hand. Since the epidemic began, the Sri Lankan government has been unable to establish a systematic system for monitoring corona patients and providing emergency care in the event of an outbreak. Most patients have been kept at home because of the high number of patients reported in the nation, but they do not yet have access to a functioning medical system. This has resulted in an increase in the number of patients left untreated because of a lack of medical care. According to our survey, the absence of competent medical monitoring is nowadays the biggest cause of mortality for many people. As a result, a smartphone app will be developed for analyzing a patient's state and determining whether they should be hospitalized. Using the data supplied, we aim to send an alert letter or SMS to the hospital once the system identifies such a patient. Since we know what those patients need and when they need it, we will set up a desktop program at the hospital to monitor their progress. Deep learning, image processing and application development, natural language processing, and blockchain management are some of the components of the research solution. The purpose of this research paper is to introduce a mechanism to connect hospitals and patients even when they are physically apart. Data security and user-friendliness are further enhanced through blockchain and NLP.

Keywords: blockchain, deep learning, NLP, monitoring system

Procedia PDF Downloads 126
3128 Supergrid Modeling and Operation and Control of Multi Terminal DC Grids for the Deployment of a Meshed HVDC Grid in South Asia

Authors: Farhan Beg, Raymond Moberly

Abstract:

The Indian subcontinent is facing a massive challenge with regard to energy security in its member countries: to provide reliable electricity that facilitates development across various sectors of the economy and consequently achieves the developmental targets. The instability of the current precarious situation is observable in the frequent system failures and blackouts. The deployment of an interconnected electricity 'Supergrid', designed to carry huge quantities of power across the Indian subcontinent, is proposed in this paper. Besides enabling energy security in the subcontinent, it will also provide a platform for the integration of Renewable Energy Sources (RES). This paper assesses the need and conditions for a Supergrid deployment and consequently proposes a meshed topology based on voltage source converter high voltage direct current (VSC-HVDC) converters for the Supergrid modeling. Various control schemes for the control of voltage and power are utilized for the regulation of the network parameters. A three-terminal multi-terminal direct current (MTDC) network is used for the simulations.
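
A minimal sketch of one common voltage-power droop rule used to share power among converter terminals in an MTDC grid is given below; the droop gain and set-points are assumed illustrative values, not the control parameters tuned in the paper.

```python
# Minimal sketch of DC-voltage droop control for one VSC-HVDC terminal.
# Reference voltage, power set-point, and droop gain are assumed values.
def droop_power_order(v_dc, v_ref=400.0, p_ref=0.0, k_droop=500.0):
    """Return the active-power order (MW) for one converter terminal.

    P = P_ref - k * (V_dc - V_ref): when the DC voltage rises above its
    reference, the terminal reduces injection (or absorbs more power),
    so several terminals jointly regulate the grid voltage.
    """
    return p_ref - k_droop * (v_dc - v_ref)

for v in (398.0, 400.0, 402.0):          # DC voltage operating points (kV)
    print(f"V_dc = {v:.1f} kV -> P order = {droop_power_order(v):.1f} MW")
```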

Keywords: super grid, wind and solar energy, high voltage direct current, electricity management, load flow analysis

Procedia PDF Downloads 418
3127 Learner's Difficulties Acquiring English: The Case of Native Speakers of Rio de La Plata Spanish Towards Justifying the Need for Corpora

Authors: Maria Zinnia Bardas Hoffmann

Abstract:

Contrastive Analysis (CA) is the systematic comparison between two languages. It stems from the notion that errors are caused by interference of the L1 system in the acquisition process of an L2. CA represents a useful tool to understand the nature of learning and acquisition. This particular method also promises a path to understanding the nature of underlying cognitive processes, even when other factors such as intrinsic motivation and teaching strategies were found to best explain students' problems in acquisition. The study of CA is justified not only by the need to gain a deeper understanding of the nature of SLA but also as an invaluable source of clues, at a cognitive level, about the general processes involved in rule formation and abstract thought. It is relevant for cross-disciplinary studies and the fields of computational thought, natural language processing, applied linguistics, cognitive linguistics, and math theory. That being said, this paper also intends to address the method's own set of constraints and limitations. Finally, this paper (a) aims at identifying some of the difficulties students may find in their learning process due to the nature of their specific variety of L1, Rio de la Plata Spanish (RPS), and (b) represents an attempt to discuss the necessity for specific models to approach CA.

Keywords: second language acquisition, applied linguistics, contrastive analysis, applied contrastive analysis English language department, meta-linguistic rules, cross-linguistics studies, computational thought, natural language processing

Procedia PDF Downloads 133
3126 Multi-Criteria Decision-Making Evaluations for Oily Waste Management of Marine Oil Spill

Authors: Naznin Sultana Daisy, Mohammad Hesam Hafezi, Lei Liu

Abstract:

Nowadays, oily solid waste management has become an important issue for many countries due to frequent oil spill accidents and the increase of industrial oily wastewater. Historical oil spill data show that marine oil spills that affect the shoreline can, in extreme cases, produce up to 30 or 40 times more waste than the volume of oil initially released. Hence, response authorities aim to develop the most effective oily waste management solution in a timely manner to manage and minimize the waste generated. In this study, we initially developed a roadmap of oily waste management for three-tiered spill scenarios for Atlantic Canada. For that purpose, three oily waste disposal scenarios are evaluated against six criteria, which are determined according to the opinions of experts in the field. Consequently, through sustainable response strategies, the most appropriate and feasible scenario is determined. The results of this study will assist in developing an integrated oily waste management system for identifying the optimal waste-generation-allocation-disposal schemes and generating the optimal management alternatives based on the holistic consideration of environmental, technological, economic, social, and regulatory factors.
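
An illustrative sketch of a simple weighted-sum evaluation of three disposal scenarios against six criteria is shown below; the criterion names, scores, and weights are hypothetical placeholders, not the experts' elicited values or the multi-criteria method actually applied in the study.

```python
# Minimal weighted-sum MCDM sketch; all numbers are invented placeholders.
import numpy as np

criteria = ["environmental", "technological", "economic",
            "social", "regulatory", "timeliness"]
weights = np.array([0.25, 0.15, 0.20, 0.15, 0.15, 0.10])   # sum to 1

# Rows: scenarios A, B, C; columns follow `criteria`; higher is better (1-10).
scores = np.array([[7, 6, 5, 6, 8, 7],
                   [5, 8, 7, 5, 6, 6],
                   [8, 5, 6, 7, 7, 5]], dtype=float)

normalized = scores / scores.max(axis=0)        # simple linear normalization
overall = normalized @ weights
for name, val in zip("ABC", overall):
    print(f"scenario {name}: {val:.3f}")
print("preferred scenario:", "ABC"[overall.argmax()])
```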

Keywords: oily waste management, marine oil spill, multi-criteria decision making, oil spill response

Procedia PDF Downloads 122
3125 A Real-World Roadmap and Exploration of Quantum Computers Capacity to Trivialise Internet Security

Authors: James Andrew Fitzjohn

Abstract:

This paper intends to discuss and explore the practical aspects of cracking encrypted messages with quantum computers. The theory of this process has been shown and well described both in academic papers and headline-grabbing news articles, but with all theory and hyperbole, we must be careful to assess the practicalities of these claims. Therefore, we will use real-world devices and proof of concept code to prove or disprove the notion that quantum computers will render the encryption technologies used by many websites unfit for purpose. It is time to discuss and implement the practical aspects of the process as many advances in quantum computing hardware/software have recently been made. This paper will set expectations regarding the useful lifespan of RSA and cipher lengths and propose alternative encryption technologies. We will set out comprehensive roadmaps describing when and how encryption schemes can be used, including when they can no longer be trusted. The cost will also be factored into our investigation; for example, it would make little financial sense to spend millions of dollars on a quantum computer to factor a private key in seconds when a commodity GPU could perform the same task in hours. It is hoped that the real-world results depicted in this paper will help influence the owners of websites who can take appropriate actions to improve the security of their provisions.
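
As a toy illustration of the underlying threat (not a quantum attack itself), the sketch below shows that once the modulus of an RSA key is factored, the private exponent and hence the plaintext follow immediately; the 16-bit primes and exponent are for demonstration only.

```python
# Toy illustration of why factoring breaks RSA: once n = p*q is known in
# factored form, the private exponent follows. Real keys use 2048-bit or
# larger moduli; these tiny primes are purely for demonstration.
from math import gcd

p, q, e = 65371, 65537, 17            # toy primes and public exponent
n, phi = p * q, (p - 1) * (q - 1)
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                   # private exponent from the factors

message = 42
ciphertext = pow(message, e, n)       # "public" encryption
recovered = pow(ciphertext, d, n)     # decryption using the derived key
print(recovered == message)           # True: factoring n defeats the cipher
```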

Keywords: quantum computing, encryption, RSA, roadmap, real world

Procedia PDF Downloads 115
3124 A Review on Higher-Order Spline Techniques for Solving Burgers Equation Using B-Spline Methods and Variation of B-Spline Techniques

Authors: Maryam Khazaei Pool, Lori Lewis

Abstract:

This is a summary of articles based on higher-order B-spline methods and variations of B-spline methods, such as the quadratic B-spline finite element method, the exponential cubic B-spline method, the septic B-spline technique, the quintic B-spline Galerkin method, and the B-spline Galerkin method based on the quadratic B-spline Galerkin method (QBGM) and the cubic B-spline Galerkin method (CBGM). In this paper, we study B-spline methods and variations of B-spline techniques for finding a numerical solution to Burgers' equation. A set of fundamental definitions, including Burgers' equation, spline functions, and B-spline functions, is provided. For each method, the main technique is discussed, as well as the discretization and stability analysis. A summary of the numerical results is provided, and the efficiency of each presented method is discussed. A general conclusion is provided, in which we compare the computational results of all the presented schemes and describe the effectiveness and advantages of these methods.
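
A minimal sketch of the building block shared by these schemes, evaluating cubic B-spline basis functions on a uniform knot vector with SciPy, is given below; the element size is an assumed value, and the Galerkin assembly and time stepping for Burgers' equation are omitted.

```python
# Minimal sketch: cubic B-spline basis functions on a uniform knot vector.
# Element size h is assumed; assembly for Burgers' equation is not shown.
import numpy as np
from scipy.interpolate import BSpline

h = 0.25                                  # assumed uniform element size
knots = np.arange(-3 * h, 1 + 4 * h, h)   # extended uniform knots over [0, 1]
degree = 3

# One cubic basis function per knot span; each spans four elements.
basis = [BSpline.basis_element(knots[i:i + degree + 2], extrapolate=False)
         for i in range(len(knots) - degree - 1)]

x = np.linspace(0, 1, 5)
for j, b in enumerate(basis[:3]):
    print(f"B_{j}(x) =", np.nan_to_num(b(x)))   # NaN outside support -> 0
```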

Keywords: Burgers’ equation, Septic B-spline, modified cubic B-spline differential quadrature method, exponential cubic B-spline technique, B-spline Galerkin method, quintic B-spline Galerkin method

Procedia PDF Downloads 109
3123 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms

Authors: Hai L. Tran

Abstract:

When covering events and issues, the news media often employ both personal accounts and facts and figures. However, the process of using numbers and narratives in the newsroom mostly operates through trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner. Existing research tends to treat pertinent effects as though the use of one form precludes the other and as if a tradeoff is required. Meanwhile, narratives and statistical facts are often combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and the collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system requires the processing of logical evidence through effortful, analytical cognition, which is affect-free. Meanwhile, the experiential system is intuitive, rapid, automatic, and holistic, thereby demanding minimal cognitive resources and relating to the experience of affect. In certain situations, one system might dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts impact audience response differently, and a combination of data and narratives is more effective than either form of evidence alone. In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) impact selective exposure, which in turn activates pertinent modes of processing and thereby induces corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand the specific effects and the conditions under which the use of anecdotal and/or statistical evidence enhances or undermines information processing. In addition to theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers.

Keywords: data, narrative, number, anecdote, storytelling, news

Procedia PDF Downloads 72
3122 Conduction Accompanied With Transient Radiative Heat Transfer Using Finite Volume Method

Authors: A. Ashok, K. Satapathy, B. Prerana Nashine

Abstract:

The objective of this research work is to investigate the one-dimensional transient radiative transfer equation coupled with conduction using the finite volume method. Within the finite-volume framework, we obtain a conservative discretization of the terms in order to preserve the overall conservative property of finite-volume schemes. The coupling of the conductive and radiative equations, and the resulting fluxes, is governed by the magnitude of the emissivity, the extinction coefficient, and the temperature of the medium, as well as the geometry of the problem. The problem under consideration has been solved for a radiation-dominated slab coupled with transient conduction, based on the finite volume method. The boundary conditions are also chosen so as to give a good model of the discretized form of the radiative transfer equation. An important feature of the present method is its flexibility in specifying the control angles in the FVM while keeping the solution procedure simple. The effects of various model parameters on the distributions of temperature, radiative and conductive heat fluxes, and incident radiation energy are examined. The finite volume method is considered to effectively evaluate the propagation of radiation intensity through a participating medium.
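
A minimal sketch of the finite-volume machinery for the conduction part only (explicit time stepping on a 1D slab with fixed boundary temperatures) is shown below; the slab size and material properties are assumed values, and the radiative source term, control-angle discretization, and coupling are omitted.

```python
# Minimal 1D transient-conduction FVM sketch (conduction only); the
# radiative coupling described in the abstract is not included here.
import numpy as np

L, N = 0.1, 50                    # slab thickness (m), number of control volumes
alpha = 1e-5                      # assumed thermal diffusivity (m^2/s)
dx = L / N
dt = 0.4 * dx**2 / alpha          # explicit stability limit with safety factor

T = np.full(N, 300.0)             # initial temperature field (K)
T_left, T_right = 500.0, 300.0    # fixed boundary temperatures (K)

for _ in range(2000):
    # Ghost cells enforce the boundary temperatures at the slab faces.
    Tg = np.concatenate(([2 * T_left - T[0]], T, [2 * T_right - T[-1]]))
    flux = -alpha * np.diff(Tg) / dx          # diffusive flux at each face
    T -= dt * np.diff(flux) / dx              # conservative cell update

print("centre-line temperature after 2000 steps:", T[N // 2])
```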

Keywords: participating media, finite volume method, radiation coupled with conduction, transient radiative heat transfer

Procedia PDF Downloads 376
3121 A Palmprint Identification System Based on Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used for human identification systems based on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such human identification applications. However, such systems have so far been based on image processing techniques only, which may decrease their efficiency. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using the backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing chain is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase is to feed those processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images. Therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves a high accuracy of 100% and can be implemented in real-life applications.
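
An illustrative sketch of part of the processing chain (median filtering, a simple image adjustment, Canny edge extraction) followed by a multi-layer perceptron classifier is given below; the file names and labels are placeholders rather than the CASIA database, and the skeletonizing step is omitted.

```python
# Illustrative palmprint sketch: filter, adjust, extract Canny edges,
# then classify with an MLP. File names and labels are placeholders;
# skeletonizing is not reproduced here.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def palm_features(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.medianBlur(img, 5)                 # noise suppression
    img = cv2.equalizeHist(img)                  # simple image adjustment
    edges = cv2.Canny(img, 50, 150)              # principal-line edges
    small = cv2.resize(edges, (32, 32))          # fixed-length feature vector
    return small.flatten() / 255.0

# Hypothetical file list: several images per enrolled person.
paths = ["p1_a.png", "p1_b.png", "p2_a.png", "p2_b.png"]
labels = ["person1", "person1", "person2", "person2"]

X = np.array([palm_features(p) for p in paths])
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=3000, random_state=0)
clf.fit(X, labels)
print(clf.predict([palm_features("p1_a.png")]))   # identification query
```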

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 360
3120 Multi-Granularity Feature Extraction and Optimization for Pathological Speech Intelligibility Evaluation

Authors: Chunying Fang, Haifeng Li, Lin Ma, Mancai Zhang

Abstract:

Speech intelligibility assessment is an important measure for evaluating the functional outcomes of surgical and non-surgical treatment, speech therapy, and rehabilitation. The assessment of pathological speech plays an important role in assisting the experts. Pathological speech is usually non-stationary and mutational; in this paper, we describe a multi-granularity combined feature scheme, which is optimized by a hierarchical visual method. First, pathological features at different granularity levels are extracted: the basic acoustic feature set (BAFS), local spectral characteristics (Mel s-transform cepstrum coefficients, MSCC), and nonlinear dynamic characteristics based on chaotic analysis. Then, a radar chart and the F-score are proposed to optimize the features through hierarchical visual fusion. The feature set is reduced from 526 to 96 dimensions. The experimental results show that the new features, classified with a support vector machine (SVM), give the best performance, with a recognition rate of 84.4% on the NKI-CCRT corpus. The proposed method is thus shown to be effective and reliable for pathological speech intelligibility evaluation.
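
A minimal sketch of F-score-based feature ranking (the two-class Fisher-criterion form) is shown below; the feature matrix and labels are random stand-ins for the 526-dimensional pathological feature set, and the radar-chart fusion step is not reproduced.

```python
# Minimal sketch of F-score feature ranking for a binary problem.
# X and y are synthetic stand-ins, not the pathological speech corpus.
import numpy as np

def f_scores(X, y):
    """Return one F-score per column of X for binary labels y (0/1)."""
    pos, neg = X[y == 1], X[y == 0]
    num = (pos.mean(0) - X.mean(0)) ** 2 + (neg.mean(0) - X.mean(0)) ** 2
    den = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return num / den

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 526))          # 526 candidate features
y = rng.integers(0, 2, size=100)
scores = f_scores(X, y)
keep = np.argsort(scores)[::-1][:96]         # retain the 96 best-ranked features
print("selected feature indices:", keep[:10], "...")
```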

Keywords: pathological speech, multi-granularity feature, MSCC (Mel s-transform cepstrum coefficients), F-score, radar chart

Procedia PDF Downloads 270
3119 Scalable Systolic Multiplier over Binary Extension Fields Based on Two-Level Karatsuba Decomposition

Authors: Chiou-Yng Lee, Wen-Yo Lee, Chieh-Tsai Wu, Cheng-Chen Yang

Abstract:

Shifted polynomial basis (SPB) is a variation of the polynomial basis representation. SPB has potential for efficient bit-level and digit-level implementations of multiplication over binary extension fields with subquadratic space complexity. For efficient implementation of pairing computation with large finite fields, this paper presents a new SPB multiplication algorithm based on Karatsuba schemes and uses it to derive a novel scalable multiplier architecture. Analytical results show that the proposed multiplier provides a trade-off between space and time complexities. Our proposed multiplier is modular, regular, and suitable for very-large-scale integration (VLSI) implementations. It involves less area complexity compared to multipliers based on traditional decomposition methods. It is therefore more suitable for efficient hardware implementation of pairing-based cryptography and elliptic curve cryptography (ECC) in constraint-driven applications.
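
A minimal sketch of one Karatsuba level for carry-less (GF(2)[x]) multiplication is given below, with polynomials encoded as Python integers (bit i is the coefficient of x^i); the shifted-basis bookkeeping, field reduction, and the systolic hardware mapping are beyond this illustration.

```python
# Minimal sketch: one-level Karatsuba for polynomial multiplication over
# GF(2), i.e. carry-less multiplication with XOR in place of addition.
def clmul(a, b):
    """Schoolbook carry-less multiplication of two GF(2)[x] polynomials."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def karatsuba_gf2(a, b, half=32):
    """One Karatsuba split at `half` bits: three half-size multiplications."""
    mask = (1 << half) - 1
    a0, a1 = a & mask, a >> half
    b0, b1 = b & mask, b >> half
    low = clmul(a0, b0)
    high = clmul(a1, b1)
    mid = clmul(a0 ^ a1, b0 ^ b1) ^ low ^ high   # middle term over GF(2)
    return low ^ (mid << half) ^ (high << (2 * half))

a, b = 0x1B3F9C2D5E7A8341, 0xC0FFEE1234567891    # two degree-<64 polynomials
assert karatsuba_gf2(a, b) == clmul(a, b)        # agrees with schoolbook
print(hex(karatsuba_gf2(a, b)))
```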

Keywords: digit-serial systolic multiplier, elliptic curve cryptography (ECC), Karatsuba algorithm (KA), shifted polynomial basis (SPB), pairing computation

Procedia PDF Downloads 352