Search results for: fuzzy page identification
2878 SPR Immunosensor for the Detection of Staphylococcus aureus
Authors: Muhammad Ali Syed, Arshad Saleem Bhatti, Chen-zhong Li, Habib Ali Bokhari
Abstract:
Surface plasmon resonance (SPR) biosensors have emerged as a promising technique for bioanalysis as well as microbial detection and identification. Real-time, sensitive, cost-effective, and label-free detection of biomolecules from complex samples is required for early and accurate diagnosis of infectious diseases. Like many other optical techniques, SPR biosensors may also be successfully utilized for microbial detection with accurate, point-of-care, and rapid results. In the present study, we have utilized a commercially available automated SPR biosensor from BI company to study microbial detection from water samples spiked with different concentrations of Staphylococcus aureus bacterial cells. The gold thin film sensor surface was functionalized to react with proteins such as protein G, which was used for directed immobilization of monoclonal antibodies against Staphylococcus aureus. The results of our work reveal that this immunosensor can be used to detect a very small number of bacterial cells with high sensitivity and specificity. In our case, 10^3 cells/ml of water have been successfully detected. Therefore, it may be concluded that this technique has a strong potential to be used in microbial detection and identification.
Keywords: surface plasmon resonance (SPR), Staphylococcus aureus, biosensors, microbial detection
Procedia PDF Downloads 476
2877 Off-Line Text-Independent Arabic Writer Identification Using Optimum Codebooks
Authors: Ahmed Abdullah Ahmed
Abstract:
The task of recognizing the writer of a handwritten text has been an attractive research problem in the document analysis and recognition community, with applications in handwriting forensics, paleography, document examination and handwriting recognition. This research presents an automatic method for writer recognition from digitized images of unconstrained writings. Although a great effort has been made by previous studies to come up with various methods, their performances, especially in terms of accuracy, fall short, and room for improvement is still wide open. The proposed technique employs optimal codebook-based writer characterization, where each writing sample is represented by a set of features computed from two codebooks, beginning and ending. Unlike most classical codebook-based approaches, which segment the writing into graphemes, this study is based on fragmenting particular areas of writing, namely the beginning and ending strokes. The proposed method starts with contour detection to extract significant information from the handwriting; curve fragmentation is then employed to split the Beginning and Ending zones of the handwriting into small fragments. Similar fragments of beginning strokes are grouped together to create the Beginning cluster, and similarly, the ending strokes are grouped to create the Ending cluster. These two clusters lead to the development of two codebooks (beginning and ending) by choosing the center of every group of similar fragments. Writings under study are then represented by computing the probability of occurrence of codebook patterns. The probability distribution is used to characterize each writer. Two writings are then compared by computing distances between their respective probability distributions. The evaluation was carried out on the standard ICFHR dataset of 206 writers, using the Beginning and Ending codebooks separately. The Ending codebook achieved the highest identification rate of 98.23%, which is the best result so far on the ICFHR dataset.
Keywords: off-line text-independent writer identification, feature extraction, codebook, fragments
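As a rough illustration of the representation step described above (not the authors' implementation), the Python sketch below builds an occurrence histogram over an already-computed codebook and compares two writings with a chi-square distance; the fragment features, codebook size and distance choice are illustrative assumptions.

```python
import numpy as np

def codebook_histogram(fragments, codebook):
    """Assign each fragment feature vector to its nearest codebook pattern
    and return the normalized histogram (probability of occurrence)."""
    # fragments: (n_fragments, d) array; codebook: (k, d) array of cluster centres
    dists = np.linalg.norm(fragments[:, None, :] - codebook[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def writer_distance(hist_a, hist_b, eps=1e-12):
    """Chi-square distance between two occurrence distributions (smaller = more similar)."""
    return 0.5 * np.sum((hist_a - hist_b) ** 2 / (hist_a + hist_b + eps))

# Toy usage: two writings described by random "ending stroke" fragment features
rng = np.random.default_rng(0)
codebook = rng.normal(size=(32, 16))            # 32 hypothetical ending-codebook patterns
writing_1 = rng.normal(size=(200, 16))
writing_2 = rng.normal(size=(180, 16))
d = writer_distance(codebook_histogram(writing_1, codebook),
                    codebook_histogram(writing_2, codebook))
print(f"distance between writings: {d:.4f}")
```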
Procedia PDF Downloads 513
2876 A Cognitive Semantic Analysis of the Metaphorical Extensions of Come out and Take Over
Authors: Raquel Rossini, Edelvais Caldeira
Abstract:
The aim of this work is to investigate the motivation for the metaphorical uses of two verb combinations: come out and take over. Drawing from cognitive semantics theories, image schemas and metaphors, it was attempted to demonstrate that: a) the metaphorical senses of both 'come out' and 'take over' extend from the central (spatial) senses of both the verbs and the particles in such verb combinations; and b) the particles 'out' and 'over' also contribute to the whole meaning of the verb combinations. In order to do so, a random selection of 579 concordance lines for come out and 1,412 for take over was obtained from the Corpus of Contemporary American English (COCA). One of the main procedures adopted in the present work was the establishment of verb and particle central senses. The research questions addressed in this study are as follows: a) how does the identification of trajector and landmark help reveal patterns that contribute to the identification of the semantic network of these two verb combinations?; b) what is the relationship between the schematic structures attributed to the particles and the metaphorical uses found in empirical data?; and c) what conceptual metaphors underlie the mappings from the source to the target domains? The results demonstrated that not only the lexical verbs come and take, but also the particles out and over, play an important role in the different meanings of come out and take over. Besides, image schemas and conceptual metaphors were found to be helpful in establishing the motivations for the metaphorical uses of these linguistic structures.
Keywords: cognitive linguistics, English syntax, multi-word verbs, prepositions
Procedia PDF Downloads 156
2875 Comparative Study of Water Quality Parameters in the Proximity of Various Landfills Sites in India
Authors: Abhishek N. Srivastava, Rahul Singh, Sumedha Chakma
Abstract:
The rapid urbanization in developing countries is generating an enormous amount of waste, leading to the creation of unregulated landfill sites at various disposal locations. The liquid waste, known as leachate, produced from these landfill sites severely affects the surrounding water quality. The water quality in the proximity of landfills is affected by various physico-chemical parameters of leachate such as pH, alkalinity, total hardness, conductivity, chloride, total dissolved solids (TDS), total suspended solids (TSS), sulphate, nitrate, phosphate, fluoride, sodium and potassium; biological parameters such as biochemical oxygen demand (BOD), chemical oxygen demand (COD) and faecal coliform; and heavy metals such as cadmium (Cd), lead (Pb), iron (Fe), mercury (Hg), arsenic (As), cobalt (Co), manganese (Mn), zinc (Zn), copper (Cu), chromium (Cr) and nickel (Ni). However, all these parameters are distributed in the leachate according to the nature of the waste dumped at each landfill site; it therefore becomes very difficult to predict which leachate parameter is mainly responsible for water quality contamination. The present study undertakes a comparative analysis of the physical, chemical and biological parameters of various landfills in India, viz. the Okhla, Ghazipur and Bhalswa landfills in NCR Delhi, the Deonar landfill in Mumbai, the Dhapa landfill in Kolkata, and the Kodungayaiyur and Perungudi landfills in Chennai. The statistical analysis of the parameters was carried out using the Statistical Package for the Social Sciences (SPSS) and the LandSim 2.5 model to simulate the long-term effect of various parameters on different time scales. Further, the uncertainties of the various input parameters have also been characterized using the fuzzy alpha cut (FAC) technique to check the sensitivity of the water quality parameters in the proximity of the landfill sites. Finally, the study would help to suggest the best method for the prevention of pollution migration from landfill sites on a priority basis.
Keywords: landfill leachate, water quality, LandSim, fuzzy alpha cut
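To illustrate the fuzzy alpha cut idea mentioned above, the following minimal Python sketch propagates alpha-cut intervals of triangular fuzzy inputs through a toy leachate-impact function; the parameter ranges and the assumption that the model is monotonically increasing in each input are purely illustrative and are not taken from the study.

```python
def alpha_cut_triangular(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha (0..1)."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def propagate(model, fuzzy_params, alpha):
    """Propagate the alpha-cut intervals of each input through a model assumed
    to be monotonically increasing in every input.
    fuzzy_params: dict name -> (low, mode, high). Returns (min, max) of the output."""
    cuts = {k: alpha_cut_triangular(*v, alpha) for k, v in fuzzy_params.items()}
    lo = model(**{k: v[0] for k, v in cuts.items()})
    hi = model(**{k: v[1] for k, v in cuts.items()})
    return min(lo, hi), max(lo, hi)

# Toy leachate-impact index built from two hypothetical parameters (mg/L ranges assumed)
impact = lambda cod, chloride: 0.6 * cod + 0.4 * chloride
params = {"cod": (1500.0, 2500.0, 4000.0), "chloride": (800.0, 1200.0, 2000.0)}
for alpha in (0.0, 0.5, 1.0):
    print(alpha, propagate(impact, params, alpha))   # interval narrows as alpha -> 1
```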
Procedia PDF Downloads 125
2874 Impact of Chimerism on Y-STR DNA Determination: Sex Mismatch Analysis
Authors: Anupuma Raina, Ajay P. Balayan, Prateek Pandya, Pankaj Shrivastava, Uma Kanga, Tulika Seth
Abstract:
DNA fingerprinting analysis aids in personal identification for forensic purposes and has been a driving motivation for law enforcement agencies in almost all countries since its inception. The introduction of DNA markers (Y-STR) has allowed for greater precision and higher discriminatory power in forensic testing. A person committing a crime after bone marrow transplantation is a rare situation, but not an impossible one. Keeping such a situation in mind, a study was carried out to find the best biological sample to be used for personal identification, especially in a forensic situation. We chose a female patient (recipient) and a male donor. The pre-transplant sample (blood) and post-transplant samples (blood, buccal swab, hair roots) were collected from the recipient. These were compared with the blood sample of the donor using the DNA fingerprinting technique. Post-transplant samples were collected at different intervals of time (15, 30, 60, and 90 days). The study was carried out using a Y-STR kit at 23 loci. The results are discussed in terms of the phenomenon of chimerism and its impact on Y-STR analysis. The hair root sample was found to be the most suitable sample, showing no donor DNA profile up to 90 days.
Keywords: bone marrow transplantation, chimerism, DNA profiling, Y-STR
Procedia PDF Downloads 149
2873 A Constructionist View of Projects, Social Media and Tacit Knowledge in a College Classroom: An Exploratory Study
Authors: John Zanetich
Abstract:
Designing an educational activity that encourages inquiry and collaboration is key to engaging students in meaningful learning. Educational Information and Communications Technology (EICT) plays an important role in facilitating cooperative and collaborative learning in the classroom. EICT also facilitates students' learning and their development of the critical thinking skills needed to solve real-world problems. Projects and activities based on constructivism encourage students to embrace complexity as well as find relevance and joy in their learning. It also enhances the students' capacity for creative and responsible real-world problem solving. Classroom activities based on constructivism offer students an opportunity to develop the higher-order-thinking skills of defining problems and identifying solutions. Participating in a classroom project is an activity for both acquiring experiential knowledge and applying new knowledge to practical situations. It also provides an opportunity for students to integrate new knowledge into a skill set using reflection. Classroom projects can be developed around a variety of learning objects including social media, knowledge management and learning communities. The construction of meaning through project-based learning is an approach that encourages interaction and problem-solving activities. Projects require active participation, collaboration and interaction to reach the agreed upon outcomes. Projects also serve to externalize the invisible cognitive and social processes taking place in the activity itself and in the student experience. This paper describes a classroom project designed to elicit interactions by helping students to unfreeze existing knowledge, to create new learning experiences, and then refreeze the new knowledge. Since constructivists believe that students construct their own meaning through active engagement, participation and interactions with others, knowledge management can be used to guide the exchange of both tacit and explicit knowledge in interpersonal interactions between students and to guide the construction of meaning. This paper uses an action research approach to the development of a classroom project and describes the use of technology, social media and the active use of tacit knowledge in the college classroom. In this project, a closed-group Facebook page becomes the virtual classroom where interaction is captured and measured using engagement analytics. In the virtual learning community, the principles of knowledge management are used to identify the process and components of the infrastructure of the learning process. The project identifies class member interests and measures student engagement in a learning community by analyzing regular posting on the Facebook page. These posts are used to foster and encourage interactions, reflect a student's interest and serve as reaction points from which viewers of the post convert the explicit information in the post to implicit knowledge. The data was collected over an academic year and was provided, in part, by the analytics reports on Facebook and self-reports of posts by members. The results support the use of active tacit knowledge activities, knowledge management and social media to enhance the student learning experience and help create the knowledge that will be used by students to construct meaning.
Keywords: constructivism, knowledge management, tacit knowledge, social media
Procedia PDF Downloads 215
2872 Exploring Socio-Economic Barriers of Green Entrepreneurship in Iran and Their Interactions Using Interpretive Structural Modeling
Authors: Younis Jabarzadeh, Rahim Sarvari, Negar Ahmadi Alghalandis
Abstract:
Entrepreneurship at both the individual and organizational level is one of the main driving forces in economic development and leads to growth and competition, job generation and social development. Especially in developing countries, the role of entrepreneurship in economic and social prosperity is more emphasized. But the effect of global economic development on the environment is undeniable, especially in negative ways, and there is a need to rethink current business models and the way entrepreneurs act to introduce new businesses that address and embed environmental issues in order to achieve sustainable development. In this paper, green or sustainable entrepreneurship is addressed in Iran to identify the challenges and barriers entrepreneurs in the economic and social sectors face in developing green business solutions. Sustainable or green entrepreneurship has been gaining interest among scholars in recent years, and addressing its challenges and barriers needs much more attention to fill the gap in the literature and to facilitate the path those entrepreneurs are pursuing. This research comprised two main phases: qualitative and quantitative. In the qualitative phase, after a thorough literature review, the fuzzy Delphi method was utilized to verify those challenges and barriers by gathering a panel of experts and surveying them. In this phase, several other contextually related factors were added to the list of identified barriers and challenges mentioned in the literature. Then, in the quantitative phase, Interpretive Structural Modeling was applied to construct a network of interactions among the barriers identified in the previous phase. Again, a panel of subject matter experts comprised of academic and industry experts was surveyed. The results of this study can be used by policymakers in both the public and industry sectors to introduce more systematic solutions to eliminate those barriers and help entrepreneurs overcome the challenges of sustainable entrepreneurship. It also contributes to the literature as the first research of this type that deals with the barriers of sustainable entrepreneurship and explores their interaction.
Keywords: green entrepreneurship, barriers, fuzzy Delphi method, interpretive structural modeling
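The core ISM computation can be sketched as follows; this is a generic illustration rather than the authors' data: a binary matrix of expert-judged direct influences among hypothetical barriers is closed under transitivity to obtain the reachability matrix, which is then partitioned into levels.

```python
import numpy as np

def reachability(adj):
    """Final reachability matrix: direct relations plus transitivity (Warshall-style
    closure), with 1s on the diagonal."""
    n = len(adj)
    r = np.array(adj, dtype=bool) | np.eye(n, dtype=bool)
    for k in range(n):
        r |= np.outer(r[:, k], r[k, :])
    return r.astype(int)

def level_partition(r):
    """Iteratively assign ISM levels: a barrier is at the current level when its
    reachability set is contained in its antecedent set (within the remaining items)."""
    n = len(r)
    remaining, levels = set(range(n)), []
    while remaining:
        level = [i for i in remaining
                 if {j for j in remaining if r[i][j]} <= {j for j in remaining if r[j][i]}]
        levels.append(level)
        remaining -= set(level)
    return levels

# Toy example with 4 hypothetical barriers (barrier 0 directly influences 1 and 2, etc.)
adj = [[0, 1, 1, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
r = reachability(adj)
print(r)
print(level_partition(r))   # e.g. [[3], [1, 2], [0]]: barrier 0 is the deepest driver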
Procedia PDF Downloads 167
2871 Gold Nanoprobes Assay for the Identification of Foodborne Pathogens Such as Staphylococcus aureus, Listeria monocytogenes and Salmonella enteritis
Authors: D. P. Houhoula, J. Papaparaskevas, S. Konteles, A. Dargenta, A. Farka, C. Spyrou, M. Ziaka, S. Koussisis, E. Charvalos
Abstract:
Objectives: Nanotechnology is providing revolutionary opportunities for the rapid and simple diagnosis of many infectious diseases. Staphylococcus aureus, Listeria monocytogenes and Salmonella enteritis are important human pathogens. Diagnostic assays based on bacterial culture and identification are time consuming and laborious, so there is an urgent need to develop rapid, sensitive, and inexpensive diagnostic tests. In this study, a gold nanoprobe strategy was developed that relies on the colorimetric differentiation of specific DNA sequences, based on differential aggregation profiles in the presence or absence of specific target hybridization. Method: Gold nanoparticles (AuNPs) were purchased from Nanopartz. They were conjugated with thiolated oligonucleotides specific for the femA gene for the identification of members of Staphylococcus aureus, the mecA gene for the differentiation of Staphylococcus aureus and MRSA Staphylococcus aureus, the hly gene encoding the pore-forming cytolysin listeriolysin for the identification of Listeria monocytogenes, and the invA sequence for the identification of Salmonella enteritis. DNA isolation from Staphylococcus aureus, Listeria monocytogenes and Salmonella enteritis cultures was performed using the commercial kit Nucleospin Tissue (Macherey Nagel). Specifically, 20 μl of DNA was diluted in 10 mM PBS (pH 5). After denaturation for 10 min, 20 μl of AuNPs was added, followed by an annealing step at 58 °C. The presence of a complementary target prevents aggregation upon the addition of acid and the solution remains pink, whereas in the opposite case it turns purple. The color could be detected visually and was confirmed with an absorption spectrum. Results: Specifically, 0.123 μg/μl DNA of St. aureus, L. monocytogenes and Salmonella enteritis was serially diluted from 1:10 to 1:100. Blanks containing PBS buffer instead of DNA were used. The application of the proposed method on isolated bacteria produced positive results with all the species of St. aureus, L. monocytogenes and Salmonella enteritis using the femA, mecA, hly and invA genes, respectively. The minimum detection limit of the assay was defined at 0.2 ng/μL of DNA; below 0.2 ng/μL of bacterial DNA, the solution turned purple after the addition of HCl. None of the blank samples was positive, and the specificity was 100%. The proposed method produced exactly the same results every time (n = 4) the evaluation was repeated (100% repeatability) using the femA, hly and invA genes. Using the mecA gene for the differentiation of Staphylococcus aureus and MRSA Staphylococcus aureus, the method had a repeatability of 50%. Conclusion: The proposed method could be used as a highly specific and sensitive screening tool for the detection and differentiation of Staphylococcus aureus, Listeria monocytogenes and Salmonella enteritis. The use of AuNPs for the colorimetric detection of DNA targets represents an inexpensive and easy-to-perform alternative to common molecular assays. The technology described here may develop into a platform that could accommodate the detection of many bacterial species.
Keywords: gold nanoparticles, pathogens, nanotechnology, bacteria
Procedia PDF Downloads 341
2870 An Approach of Computer Modalities for Exploration of Hieroglyphics Substantial in an Investigation
Authors: Aditi Chauhan, Neethu S. Mohan
Abstract:
In the modern era, the advancement and digitalization of technology have transformed crime scene investigation, and rapidly improving investigative techniques have changed the means of identifying a suspect. Identification of the person is one of the most significant aspects, and personal authentication is the key to security and reliability in society. Since the early 90s, people have relied on comparing handwriting through its class and individual characteristics, but in today's 21st century we need more reliable means to identify individuals through handwriting. An approach employing computational modalities has lately proved auspicious in the exploration of hieroglyphics substantial in investigating a case. Software systems such as FISH, WRITEON, PIKASO and the CEDAR-FOX SYSTEM identify and verify the associated quantitative measure of similarity between two samples. Research to date has been confined to identifying the authorship of the samples concerned, but the prospects associated with the use of computational modalities might help to identify disguised writing, forged handwriting, or altered and modified writing. Considering the applications of such models, similar work is sure to attract a plethora of research in the immediate future. It also has a promising role in national security: documents exchanged among terrorists can be brought under the radar of surveillance, bringing forth their source of existence.
Keywords: documents, identity, computational system, suspect
Procedia PDF Downloads 177
2869 A Comparative Assessment of Information Value, Fuzzy Expert System Models for Landslide Susceptibility Mapping of Dharamshala and Surrounding, Himachal Pradesh, India
Authors: Kumari Sweta, Ajanta Goswami, Abhilasha Dixit
Abstract:
Landslide is a geomorphic process that plays an essential role in the evolution of hill-slopes and long-term landscape evolution. But its abrupt nature and the associated catastrophic forces can have undesirable socio-economic impacts, such as substantial economic losses, fatalities, and ecosystem, geomorphologic and infrastructure disturbances. The estimated fatality rate is approximately 1 person/100 sq. km and the average economic loss is more than 550 crores/year in the Himalayan belt due to landslides. This study presents a comparative performance of a statistical bivariate method and a machine learning technique for landslide susceptibility mapping in and around Dharamshala, Himachal Pradesh. The final landslide susceptibility maps (LSMs), produced with better accuracy, could be used for land-use planning to prevent future losses. Dharamshala, a part of the North-western Himalaya, is one of the fastest-growing tourism hubs, with a total population of 30,764 according to the 2011 census, and is among the hundred Indian cities to be developed as smart cities under the PM's Smart Cities Mission. A total of 209 landslide locations were identified using high-resolution linear imaging self-scanning (LISS IV) data. Thematic maps of the parameters influencing landslide occurrence were generated using remote sensing and other ancillary data in a GIS environment. The landslide causative parameters used in the study are slope angle, slope aspect, elevation, curvature, topographic wetness index, relative relief, distance from lineaments, land use land cover, and geology. LSMs were prepared using the information value (Info Val) and Fuzzy Expert System (FES) models. Info Val is a statistical bivariate method in which information values are calculated as the ratio of the landslide pixels per factor class (Si/Ni) to the total landslide pixels per parameter (S/N). Using these information values, all parameters were reclassified and then summed in GIS to obtain the landslide susceptibility index (LSI) map. The FES method is a machine learning technique based on a 'mean and neighbour' strategy for the construction of the fuzzifier (input) and defuzzifier (output) membership function (MF) structure, with the FR method used for formulating if-then rules. Two types of membership structures were utilized: Bell-Gaussian (BG) and Trapezoidal-Triangular (TT). LSI values for BG and TT were obtained by applying the membership functions and if-then rules in MATLAB. The final LSMs were spatially and statistically validated. The validation results showed that, in terms of accuracy, Info Val (83.4%) is better than BG (83.0%) and TT (82.6%), whereas, in terms of spatial distribution, BG is best. Hence, considering both statistical and spatial accuracy, BG is the most accurate one.
Keywords: bivariate statistical techniques, BG and TT membership structure, fuzzy expert system, information value method, machine learning technique
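A minimal sketch of the Info Val calculation described above, with hypothetical pixel counts for a single causative parameter; the optional log form is noted because some formulations of the information value take the natural log of the ratio.

```python
import numpy as np

def information_value(landslide_pixels, class_pixels, use_log=False):
    """Info Val per factor class: (Si/Ni) / (S/N), where Si and Ni are the landslide
    and total pixels in class i, and S and N are the totals over the parameter.
    Some formulations take the natural log of this ratio (use_log=True)."""
    Si = np.asarray(landslide_pixels, dtype=float)
    Ni = np.asarray(class_pixels, dtype=float)
    ratio = (Si / Ni) / (Si.sum() / Ni.sum())
    return np.log(ratio) if use_log else ratio

# Hypothetical slope-angle classes: landslide pixels and total pixels per class
slope_iv = information_value([12, 48, 95, 54], [40_000, 35_000, 20_000, 5_000])
print(slope_iv)   # classes with values > 1 are more susceptible than average

# The LSI map is then the per-pixel sum of the reclassified parameter rasters, e.g.:
# LSI = slope_iv[slope_class_raster] + aspect_iv[aspect_class_raster] + ...
```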
Procedia PDF Downloads 129
2868 Electrical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting areas of operation of each residential appliance based on the power demand, then detecting the time at which each selected appliance changes its state. In order to fit with the capabilities of practical existing smart meters, we work on low-sampling data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator software (LPG), which has not previously been considered for NILM purposes in the literature. LPG is a numerical software tool that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both simulated data from LPG and real measured data from the Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: electrical disaggregation, DTW, general appliance modeling, event detection
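For illustration, the following Python sketch shows the kind of DTW matching referred to above, comparing an extracted low-frequency power segment against two generic appliance templates; the signatures and appliance names are invented for the example and do not come from LPG or REDD.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between two 1-D
    power sequences, using an absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical 1/60 Hz power segments (W): an extracted event vs. two generic models
event  = np.array([0, 0, 120, 125, 130, 128, 0, 0], dtype=float)
kettle = np.array([0, 1800, 1800, 0], dtype=float)
fridge = np.array([0, 120, 130, 125, 120, 0], dtype=float)
scores = {"kettle": dtw_distance(event, kettle), "fridge": dtw_distance(event, fridge)}
print(min(scores, key=scores.get), scores)   # lowest DTW distance -> assigned appliance
```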
Procedia PDF Downloads 78
2867 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications
Authors: Chia-Ju Peng, Shih-Jui Chen
Abstract:
This paper proposes a method to discriminate electroencephalogram (EEG) signals between different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the usual pathways such as the peripheral nervous system or skeletal muscles. Attention level is a common index used as a control signal in BCI systems. The EEG signals acquired from people paying attention or in relaxation, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectra of the IMFs, the proposed method achieves better identification of EEG attention level between different concentration states than using the original EEG signals. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and the results of this experiment pave a new way for BCI robotic systems using the attention-level control strategy. The integrated signal processing method reveals appropriate information for the discrimination of attention and relaxation, contributing to enhanced BCI performance.
Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation
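As an illustration of the spectral feature described above, the sketch below computes the relative β-band (13–30 Hz) power of a signal with a plain FFT; in the actual method this would be applied to each IMF obtained from EMD (for example with the PyEMD package, as noted in a comment), and the sampling rate and toy signal here are assumptions.

```python
import numpy as np

def band_power(x, fs, band):
    """Relative power of `x` in the frequency band (lo, hi) Hz, from the FFT spectrum."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum() / psd.sum()

fs = 256.0                                   # assumed EEG sampling rate
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 20 * t) + 0.5 * np.sin(2 * np.pi * 6 * t)   # toy signal

# IMFs would come from EMD, e.g. with the PyEMD package:
#   from PyEMD import EMD; imfs = EMD()(eeg)
# Here we treat the toy signal itself as one "IMF" just to show the feature:
beta = band_power(eeg, fs, (13.0, 30.0))
print(f"relative beta-band power: {beta:.2f}")   # compared between attention/relaxation
```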
Procedia PDF Downloads 392
2866 Identification of the Interior Noise Sources of Rail Vehicles
Authors: Hyo-In Koh, Anders Nordborg, Alex Sievi, Chun-Kwon Park
Abstract:
The noise in the interior of a high-speed train is generated by the rolling contact between wheel and rail, aerodynamic noise, and structure-borne sound produced through the vibrations of the bogie and its connection points to the carbody. Air-borne sound is radiated through the panels and structures into the interior of the train. High-speed lines are constructed with slab track systems and many tunnels, and the interior noise level and its frequency characteristics vary according to the type of track structure and infrastructure. In this paper, the main sound sources and transfer paths are studied to determine the contribution of each source to the interior noise of a high-speed rail vehicle. For the identification of the acoustic power of each part of the rolling noise sources, a calculation model of wheel/rail noise is developed and used. For the analysis of the transmission of the sources to the interior noise, noise and vibration are measured during the operation of the vehicle. The main contributing sources and paths could be analyzed as a function of operating speed. Results of the calculations of the source generation and the results of the measurements with a high-speed train are shown and discussed.
Keywords: rail vehicle, high-speed, interior noise, noise source
Procedia PDF Downloads 400
2865 Identification and Classification of Medicinal Plants of Indian Himalayan Region Using Hyperspectral Remote Sensing and Machine Learning Techniques
Authors: Kishor Chandra Kandpal, Amit Kumar
Abstract:
The Indian Himalaya region harbours approximately 1748 plants of medicinal importance, and as per the International Union for Conservation of Nature (IUCN), 112 plant species among these are threatened or endangered. To ease the pressure on these plants, the government of India is encouraging their in-situ cultivation. Saussurea costus, Valeriana jatamansi, and Picrorhiza kurroa have been prioritized for large-scale cultivation owing to their market demand, conservation value and medicinal properties. These species are found at elevations from 1000 m to 4000 m in the Indian Himalaya. Identification of these plants in the field requires taxonomic skills, which is one of the major bottlenecks in their conservation and management. In recent years, hyperspectral remote sensing techniques have been used to precisely discriminate plant species with the help of their unique spectral signatures. Against this background, a spectral library of the above three medicinal plants was prepared by collecting spectral data using a handheld spectroradiometer (325 to 1075 nm) from farmers' fields in the Himachal Pradesh and Uttarakhand states of the Indian Himalaya. A Random Forest (RF) model was applied to the spectral data for the classification of the medicinal plants. The standard 80:20 split ratio was followed for training and validation of the RF model, which resulted in a training accuracy of 84.39% (kappa coefficient = 0.72) and a testing accuracy of 85.29% (kappa coefficient = 0.77). The RF classifier identified the green (555 to 598 nm), red (605 nm), and near-infrared (725 to 840 nm) wavelength regions as suitable for the discrimination of these species. The findings of this study provide a technique for rapid, on-site identification of the above medicinal plants in the field. They will also be a key input for the classification of hyperspectral remote sensing images for mapping these species in farmers' fields on a regional scale. This is a pioneer study in the Indian Himalaya region in which the applicability of hyperspectral remote sensing to medicinal plants has been explored.
Keywords: Himalaya, hyperspectral remote sensing, machine learning, medicinal plants, random forests
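A minimal scikit-learn sketch of the classification step described above (80:20 split, accuracy and kappa, band importances); the spectra and labels below are random placeholders standing in for the actual spectral library.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical spectral library: n samples x n bands (325-1075 nm), 3 species labels
rng = np.random.default_rng(42)
X = rng.random((150, 751))                       # placeholder reflectance spectra
y = rng.integers(0, 3, size=150)                 # 0=S. costus, 1=V. jatamansi, 2=P. kurroa

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)     # 80:20 split as in the study

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)
pred = rf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("kappa:   ", cohen_kappa_score(y_test, pred))

# Band importances point to discriminative wavelength regions (e.g. green, red, NIR)
top_bands = np.argsort(rf.feature_importances_)[::-1][:10]
print("most important band indices:", top_bands)
```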
Procedia PDF Downloads 204
2864 Tropical Squall Lines in Brazil: A Methodology for Identification and Analysis Based on ISCCP Tracking Database
Authors: W. A. Gonçalves, E. P. Souza, C. R. Alcântara
Abstract:
The ISCCP-Tracking database offers an opportunity to study the physical and morphological characteristics of convective systems based on geostationary meteorological satellites. This database contains 26 years of tracking of convective systems for the entire globe, so the Tropical Squall Lines that occur in Brazil are certainly within it. In this study, we propose a methodology for the identification of these systems based on the ISCCP-Tracking database, together with a physical and morphological characterization. The proposed methodology is first based on the year 2007. The Squall Lines were subjectively identified by visually analyzing infrared images from GOES-12, and the same systems were then located within the ISCCP-Tracking database. It is known, and it was also observed, that the Squall Lines which occur on the north coast of Brazil develop parallel to the coast, influenced by the sea breeze. In addition, the eccentricity of the identified systems was greater than 0.7. A methodology based on the inclination (relative to the coast) and eccentricity (greater than 0.7) of the convective systems was therefore applied in order to identify and characterize Tropical Squall Lines in Brazil. These thresholds were applied back to the ISCCP-Tracking database for the year 2007. It was observed that other systems, which were not Squall Lines, were also identified. We therefore decided to call all systems identified by the inclination and eccentricity thresholds Linear Convective Systems, instead of Squall Lines. After this step, the Linear Convective Systems were identified and characterized for the entire database, from 1983 to 2008. The physical and morphological characteristics of these systems were compared to those of systems which did not meet the required inclination and eccentricity to be called Linear Convective Systems. The results showed that the convection associated with the Linear Convective Systems seems to be more intense and organized than in the other systems; this affirmation is based on all ISCCP-Tracking variables analyzed. This type of methodology, which explores 26 years of satellite data by an objective analysis, has not previously been explored in the literature. The physical and morphological characterization of the Linear Convective Systems based on 26 years of data is of great importance and should be used in many branches of the atmospheric sciences.
Keywords: squall lines, convective systems, linear convective systems, ISCCP-Tracking
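The threshold test described above can be sketched as follows; only the eccentricity > 0.7 criterion comes from the study, while the coastline orientation and the angular tolerance used to judge "parallel to the coast" are illustrative assumptions.

```python
def is_linear_convective_system(eccentricity, orientation_deg,
                                coast_orientation_deg=45.0, tol_deg=20.0):
    """Flag a tracked convective system as a Linear Convective System when it is
    elongated (eccentricity > 0.7) and roughly parallel to the coastline.
    The coastline orientation and angular tolerance here are illustrative."""
    diff = abs(orientation_deg - coast_orientation_deg) % 180.0
    diff = min(diff, 180.0 - diff)            # orientations are axial (0-180 deg)
    return eccentricity > 0.7 and diff <= tol_deg

# Toy tracked systems: (eccentricity, major-axis orientation in degrees)
systems = [(0.85, 50.0), (0.92, 130.0), (0.60, 48.0)]
flags = [is_linear_convective_system(e, o) for e, o in systems]
print(flags)    # [True, False, False]
```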
Procedia PDF Downloads 301
2863 Vibrational Spectroscopic Identification of Beta-Carotene in Usnic Acid and PAHs as a Potential Martian Analogue
Authors: A. I. Alajtal, H. G. M. Edwards, M. A. Elbagermi
Abstract:
Raman spectroscopy is currently part of the instrumentation suite of the ESA ExoMars mission for the remote detection of life signatures on the Martian surface and subsurface. Terrestrial analogues of Martian sites have been identified, and the biogeological modifications incurred as a result of extremophilic activity have been studied. Analytical instrumentation protocols for the unequivocal detection of biomarkers in suitable geological matrices are critical for future unmanned explorations, including the forthcoming ESA ExoMars mission to search for life on Mars scheduled for 2018, of whose Pasteur instrumentation suite Raman spectroscopy is a part. Here, mixtures of beta-carotene at various concentrations with polyaromatic hydrocarbons and usnic acid were investigated by Raman microspectrometry using 785 nm excitation to determine the lowest levels detectable, in simulation of their potential remote identification under geobiological conditions in Martian scenarios. Information from this study will be important for the development of a miniaturized Raman instrument for targeting Martian sites where the biosignatures of relict or extant life could remain in the geological record.
Keywords: Raman spectroscopy, Mars-analog, beta-carotene, PAHs
Procedia PDF Downloads 339
2862 Mitigating Supply Chain Risk for Sustainability Using Big Data Knowledge: Evidence from the Manufacturing Supply Chain
Authors: Mani Venkatesh, Catarina Delgado, Purvishkumar Patel
Abstract:
The sustainable supply chain is gaining popularity among practitioners because of increased environmental degradation and stakeholder awareness. On the other hand, supply chain risk management is crucial for practitioners, as risks can potentially disrupt supply chain operations. Predicting and addressing the risk caused by social issues in the supply chain is of paramount importance to the sustainable enterprise. More recently, the use of big data analytics for forecasting business trends has been gaining momentum among professionals. The aim of this research is to explore the application of big data and predictive analytics in successfully mitigating supply chain social risk, and to demonstrate how such mitigation can help in achieving sustainability (environmental, economic and social). The method involves the identification and validation of social issues in the supply chain by an expert panel and a survey. We then use a case study to illustrate the application of big data in the successful identification and mitigation of social issues in the supply chain. Our results show that a company can predict various social issues through big data and predictive analytics and thereby mitigate the social risk. We also discuss the implications of this research for the body of knowledge and for practice.
Keywords: big data, sustainability, supply chain social sustainability, social risk, case study
Procedia PDF Downloads 410
2861 Value Engineering Change Proposal Application in Construction of Road-Building Projects
Authors: Mohammad Mahdi Hajiali
Abstract:
Many construction projects in Iran have been affected by limited financial resources. In a developing country such as Iran, where a large number of projects are run each year, reducing project costs through a systematic method would greatly help minimize the cost of major construction projects, so that projects finish faster and more efficiently. Roads are one component of transportation infrastructure and take a considerable share of the country's budget; in addition, a major part of the related ministry's budget is spent on repairing, improving and maintaining roads. Value Engineering is a simple and powerful methodology that, over the past six decades, has been successful in reducing the cost of many projects. The specific approach for using value engineering at the project implementation stage is called the value engineering change proposal (VECP). In this research, VECP was applied to one of the road-building projects in Iran in order to enhance the value of this kind of project and reduce its cost. In this case study, after applying VECP, an idea was raised: the use of concrete pavement instead of hot mix asphalt (HMA), with fibers added to improve the concrete pavement performance. The VE team decided that, for choosing the best alternative, experts' opinions on pavement systems should be gathered and Fuzzy TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) used for ranking them. Finally, Jointed Plain Concrete Pavement (JPCP) was selected. The group also tested concrete samples with fibers available in Iran, and the results of the experiments showed a significant increase in concrete properties such as flexural strength. In the end, it was shown that by using fiber-reinforced concrete pavement instead of asphalt pavement, a significant saving in cost and time, and an improvement in quality, durability, and longevity, can be achieved.
Keywords: road-building projects, value engineering change proposal (VECP), Jointed Plain Concrete Pavement (JPCP), Fuzzy TOPSIS, fiber-reinforced concrete
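A minimal sketch of a Chen-style fuzzy TOPSIS ranking for benefit criteria, in the spirit of the expert-ranking step described above; the alternatives, criteria, linguistic-to-fuzzy conversions and weights are invented for illustration and are not the study's data.

```python
import numpy as np

def tfn_distance(m, n):
    """Vertex distance between two triangular fuzzy numbers (a, b, c)."""
    return np.sqrt(((np.array(m) - np.array(n)) ** 2).mean())

def fuzzy_topsis(ratings, weights):
    """Fuzzy TOPSIS for benefit criteria.
    ratings: alternatives x criteria matrix of triangular fuzzy numbers (a, b, c);
    weights: one TFN per criterion. Returns the closeness coefficient per alternative."""
    ratings = np.array(ratings, dtype=float)          # shape (n_alt, n_crit, 3)
    weights = np.array(weights, dtype=float)          # shape (n_crit, 3)
    c_star = ratings[:, :, 2].max(axis=0)             # max upper bound per criterion
    norm = ratings / c_star[None, :, None]            # linear scale normalization
    weighted = norm * weights[None, :, :]             # element-wise TFN product
    fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)     # fuzzy positive/negative ideals
    cc = []
    for alt in weighted:
        d_pos = sum(tfn_distance(v, fpis) for v in alt)
        d_neg = sum(tfn_distance(v, fnis) for v in alt)
        cc.append(d_neg / (d_pos + d_neg))
    return cc

# Hypothetical ratings of two pavement alternatives on two criteria
# (cost-effectiveness, durability), already converted from linguistic terms to TFNs
ratings = [[(5, 7, 9), (7, 9, 10)],     # fiber-reinforced JPCP
           [(3, 5, 7), (5, 7, 9)]]      # hot mix asphalt
weights = [(0.5, 0.7, 0.9), (0.7, 0.9, 1.0)]
print(fuzzy_topsis(ratings, weights))   # higher closeness coefficient = better alternative
```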
Procedia PDF Downloads 198
2860 Empirical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting areas of operation of each residential appliance based on the power demand, then detecting the time at which each selected appliance changes its state. In order to fit with the capabilities of practical existing smart meters, we work on low-sampling data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator software (LPG), which has not previously been considered for NILM purposes in the literature. LPG is a numerical software tool that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both simulated data from LPG and real measured data from the Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques
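As a complement to the DTW sketch given for the companion abstract above, the following snippet illustrates the confusion-matrix-based evaluation mentioned here (accuracy, precision, recall, error rate) on toy appliance labels.

```python
import numpy as np

def confusion_metrics(y_true, y_pred, n_classes):
    """Accuracy, per-class precision/recall and error rate from a confusion matrix."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                      # rows = true class, columns = predicted class
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)
    recall = tp / np.maximum(cm.sum(axis=1), 1)
    accuracy = tp.sum() / cm.sum()
    return {"confusion_matrix": cm, "accuracy": accuracy,
            "error_rate": 1.0 - accuracy, "precision": precision, "recall": recall}

# Toy appliance labels: 0=fridge, 1=kettle, 2=washing machine
y_true = [0, 0, 1, 1, 2, 2, 0, 1]
y_pred = [0, 0, 1, 2, 2, 2, 0, 1]
print(confusion_metrics(y_true, y_pred, n_classes=3))
```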
Procedia PDF Downloads 82
2859 Developing a Comprehensive Framework for Sustainable Urban Planning and Design: Insights From Iranian Cities
Authors: Mohammad Javad Seddighi, Avar Almukhtar
Abstract:
Sustainable urban planning and design (SUPD) play a critical role in achieving the United Nations Sustainable Development Goals (UN SDGs). While many rating systems and standards are available to assess the sustainability of the built environment, there is still a lack of a comprehensive framework that can assess the quality of SUPD in a specific context. In this paper, we present a framework for assessing the quality of SUPD in Iranian cities, considering their unique cultural, social, and environmental contexts. The aim of this study is to develop a framework for assessing the quality of SUPD in Iranian cities. To achieve this aim, the following objectives are pursued: review and synthesis of the relevant literature on SUPD; identification of key indicators and criteria for assessing the quality of SUPD in Iranian cities; application of the framework to case studies of Iranian cities; and evaluation and refinement of the framework based on the results of the case studies. The framework is developed based on a review and synthesis of the relevant literature on SUPD and the identification of key indicators and criteria for assessing the quality of SUPD in Iranian cities. The framework is then applied to case studies of Iranian cities, and the results are evaluated and refined. The data for this study are collected through a review of the relevant literature on SUPD, including academic journals, conference proceedings, and books. The case studies of Iranian cities are selected based on their relevance and the availability of data, with data collected through interviews, site visits, and document analysis. The resulting framework provides a comprehensive and context-specific approach to assessing the quality of SUPD in Iranian cities. It can be used by urban planners, designers, and policymakers to improve the sustainability and liveability of Iranian cities, and it can be adapted for use in other contexts.
Keywords: sustainable urban planning and design, framework, quality assessment, Iranian cities, case studies
Procedia PDF Downloads 119
2858 Analysis of the AZF Region in Slovak Men with Azoospermia
Authors: J. Bernasovská, R. Lohajová Behulová, E. Petrejčiková, I. Boroňová, I. Bernasovský
Abstract:
Y chromosome microdeletions are the most common genetic cause of male infertility, and screening for these microdeletions in azoospermic or severely oligospermic men is now standard practice. Analysis of the Y chromosome in men with azoospermia or severe oligozoospermia has resulted in the identification of three regions in the euchromatic part of the long arm of the human Y chromosome (Yq11) that are frequently deleted in men with otherwise unexplained spermatogenic failure. PCR analysis of microdeletions in the AZFa, AZFb and AZFc regions of the human Y chromosome is an important screening tool. The aim of this study was to analyse the types of microdeletion in men with fertility disorders in Slovakia. We evaluated 227 patients with azoospermia and a normal karyotype. All patient samples were analyzed cytogenetically. For PCR amplification of sequence-tagged sites (STS) of the AZFa, AZFb and AZFc regions of the Y chromosome, the Devyser AZF set was used. Fluorescently labeled primers for all markers were used in one multiplex PCR reaction, and the ABI 3500xl genetic analyzer (Life Technologies) was used for automated visualization and identification of the STS markers. We report 13 cases of deletions in the AZF region (5.73%). Particular types of deletions were recorded in each of the AZFa, AZFb and AZFc regions, with microdeletions in the AZFc region being the most frequent. The study confirmed that the percentage of microdeletions in the AZF region is low in Slovak azoospermic patients, but important from a prognostic point of view.
Keywords: AZF, male infertility, microdeletions, Y chromosome
Procedia PDF Downloads 374
2857 Sprinting Beyond Sexism and Gender Stereotypes: Indian Women Fans' Experiences in the Sports Fandom
Authors: Siddhi Deshpande, Jo Jo Chacko Eapen
Abstract:
Although almost half of India's female population engages in watching sports, their experiences in the sports fandom are concealed by 'traditional masculinity,' leading to potential exclusion and harassment. To explore these experiences in depth, this qualitative study aims to understand what coping strategies Indian women fans employ to sustain their team identification. Employing criterion sampling, participants were screened using The Sports Spectators Identification Scale (SSIS) to assess team identification and a Brief Sexism Questionnaire to confirm participants' experience with sexism, in line with the purpose of the study. The participants were Indian women who had been following any sport for more than eight years, were fluent in English, and were not sports professionals. Ten highly identified fans with gendered experiences were recruited for one-on-one semi-structured, in-depth interviews. The data were analyzed using Interpretive Phenomenological Analysis (IPA) to understand the lived experiences of women fans facing sexism and gender stereotypes, revealing the superordinate themes of (1) Ontogenesis and Emotional Investment; (2) Gendered Expectations and Sexism; (3) Coping Strategies and Resilience; (4) Identity, Femininity, Empowerment; and (5) Advocacy for Equality and Inclusivity. The findings show that Indian women fans experience social exclusion, harassment, sexualization, and commodification in both online and offline fandoms, where they are disproportionately targeted with threats, misogynistic comments, and attraction-based assumptions that question their 'authenticity' as fans because of their gender. Women fans alternate between proactive strategies of assertiveness, humor, and knowledge demonstration and defensive strategies of selective engagement, self-regulatory censorship, and desensitization to deal with sexism. In this interplay, the integration of women's 'fan identity' with their self-concept showcases how being a sports fan adds meaning to their lives, despite the constant scrutiny in a male-dominated space, reflecting that femininity and sports should coexist. As a result, they find refuge in female fan communities with similar experiences of the fandom and advocate for an equal and inclusive environment where sports are above gender, and not the other way around. A key practical implication of this research is enabling sports organizations to develop inclusive fan engagement policies that actively encourage female fan participation. This includes sensitizing stadium staff and security personnel, promoting gender-neutral language, and, most importantly, establishing safety protocols to protect female fans from adverse experiences in the fandom.
Keywords: coping strategies, female sports fans, femininity, gendered experiences, team identification
Procedia PDF Downloads 60
2856 Recovery and Identification of Phenolic Acids in Honey Samples from Different Floral Sources of Pakistan Having Antimicrobial Activity
Authors: Samiyah Tasleem, Muhammad Abdul Haq, Syed Baqir Shyum Naqvi, Muhammad Abid Husnain, Sajjad Haider Naqvi
Abstract:
The objectives of the present study were: a) to investigate the antimicrobial activity of honey samples from different floral sources of Pakistan, and b) to recover the phenolic acids in them as a possible contributing factor to the antimicrobial activity. Six honey samples from different floral sources, namely Trachysperm copticum, Acacia species, Helianthus annuus, Carissa opaca, Zizyphus and Magnifera indica, were used. The antimicrobial activity was investigated by the disc diffusion method against eight freshly isolated clinical isolates (Staphylococcus aureus, Staphylococcus epidermidis, Streptococcus faecalis, Pseudomonas aeruginosa, Klebsiella pneumoniae, Escherichia coli, Proteus vulgaris and Candida albicans). The antimicrobial activity of honey was compared with five commercial antibiotics, namely doxycycline (DO-30 ug/mL), oxytetracycline (OT-30 ug/mL), clarithromycin (CLR-15 ug/mL), moxifloxacin (MXF-5 ug/mL) and nystatin (NT-100 UT). The fractions responsible for antimicrobial activity were extracted using ethyl acetate, and solid phase extraction (SPE) was used to recover the phenolic acids of the honey samples. Identification was carried out via High-Performance Liquid Chromatography (HPLC). The results indicated that antimicrobial activity was present in all honey samples and was comparable to the antibiotics used in the study. In the microbiological assay, the ethyl acetate honey extract was found to exhibit very promising antimicrobial activity against all the microorganisms tested, indicating the existence of phenolic compounds. Six phenolic acids, namely gallic, caffeic, ferulic, vanillic, benzoic and cinnamic acids, were identified by HPLC, besides some unknown substances. In conclusion, Pakistani honey samples showed broad-spectrum antibacterial and promising antifungal activity. The identification of six different phenolic acids showed that Pakistani honey samples are rich sources of phenolic compounds, which could be the contributing factor to the antimicrobial activity.
Keywords: Pakistani honey, antimicrobial activity, phenolic acids (gallic, caffeic, ferulic, vanillic, benzoic and cinnamic acids)
Procedia PDF Downloads 549
2855 Dysbiosis of the Intestinal Microbiome in Colorectal Cancer Patients at Hospital of Amizour, Bejaia, Algeria
Authors: Adjebli Ahmed, Messis Abdelaziz, Ayeche Riad, Tighilet Karim, Talbi Melissa, Smaili Yanis, Lehri Mokrane, Louardiane Mustapha
Abstract:
Colorectal cancer is one of the most common types of cancer worldwide, and its incidence has been increasing in recent years. Data and fecal samples from colorectal cancer patients were collected at the oncology department of the Amizour Public Hospital (Bejaia, Algeria). The microbiological and cohort studies were conducted at the Biological Engineering of Cancers laboratory at the Faculty of Medicine of the University of Bejaia. The data showed that patients aged between 50 and 70 years were the most affected by colorectal cancer, while the age categories of [30-40] and [40-50] were the least affected. Males were at a higher risk of contracting colorectal cancer than females. The most common types of colorectal cancer among the studied population were sigmoid cancer, rectal cancer, transverse colon cancer, and ascending colon cancer. The hereditary factor was found to be more dominant than the other risk factors. Bacterial identification revealed the presence of certain pathogenic and opportunistic bacterial genera, such as E. coli, K. pneumoniae, Shigella sp., and group D Streptococcus. These results led us to conclude that dysbiosis of the intestinal microbiome is strongly present in colorectal cancer patients at the EPH of Amizour.
Keywords: microbiome, colorectal cancer, risk factors, bacterial identification
Procedia PDF Downloads 88
2854 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control
Authors: Ming-Yen Chang, Sheng-Hung Ke
Abstract:
This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving, with the images instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In conjunction with the electronically adjustable shock absorbers equipped in the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride
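A minimal sketch of the monocular-ranging and damper-triggering logic described above; the focal length, speed bump width, bounding-box width and control distance are illustrative assumptions, and the detector output is simply stubbed in rather than taken from a trained YOLOv5 model.

```python
def monocular_distance(bbox_width_px, real_width_m, focal_length_px):
    """Pinhole-camera estimate of the distance to an object of known real width,
    given its apparent width in pixels: Z = f * W / w."""
    return focal_length_px * real_width_m / bbox_width_px

# Illustrative values: the detector (e.g. YOLOv5) would return a bounding box for the bump
FOCAL_LENGTH_PX = 1400.0       # assumed calibrated focal length
SPEED_BUMP_WIDTH_M = 3.5       # assumed physical width of the speed bump
bbox_width_px = 420.0          # width of the detected bounding box in the image

distance_m = monocular_distance(bbox_width_px, SPEED_BUMP_WIDTH_M, FOCAL_LENGTH_PX)
if distance_m < 15.0:          # assumed control distance threshold
    print(f"bump at ~{distance_m:.1f} m: adjust damping force now")
```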
Procedia PDF Downloads 672853 Non-Invasive Characterization of the Mechanical Properties of Arterial Walls
Authors: Bruno Ramaël, Gwenaël Page, Catherine Knopf-Lenoir, Olivier Baledent, Anne-Virginie Salsac
Abstract:
No routine technique currently exists for clinicians to measure the mechanical properties of vascular walls non-invasively. Most of the data available in the literature come from traction or dilatation tests conducted ex vivo on native blood vessels. The objective of the study is to develop a non-invasive characterization technique based on Magnetic Resonance Imaging (MRI) measurements of the deformation of vascular walls under pulsating blood flow conditions. The goal is to determine the mechanical properties of the vessels by inverse analysis, coupling the imaging measurements with numerical simulations of the fluid-structure interactions. The hyperelastic properties are identified with Solidworks and Ansys Workbench (ANSYS Inc.) by solving an optimization problem. The vessel of interest targeted in the study is the common carotid artery. In vivo MRI measurements of the vessel anatomy and inlet velocity profiles were acquired along the facial vascular network in a cohort of 30 healthy volunteers: the time evolution of the blood vessel contours, and thus of the cross-sectional area, was measured by 3D angiography sequences of phase-contrast MRI, while the blood flow velocity was measured using a 2D CINE phase-contrast MRI (PC-MRI) method. Reference arterial pressure waveforms were simultaneously measured in the brachial artery using a sphygmomanometer. The three-dimensional (3D) geometry of the arterial network was reconstructed by first creating an STL file from the raw MRI data using the open-source imaging software ITK-SNAP. The resulting geometry was then transformed with Solidworks into volumes compatible with the Ansys software. Tetrahedral meshes of the wall and fluid domains were built using the Ansys Meshing software, with near-wall mesh refinement for the fluid domain to improve the accuracy of the flow calculations. Ansys Structural was used for the numerical simulation of the vessel deformation and Ansys CFX for the simulation of the blood flow. The fluid-structure interaction simulations showed that the systolic and diastolic blood pressures of the common carotid artery could be taken as reference pressures to identify the mechanical properties of the different arteries of the network. The coefficients of the hyperelastic law were identified using the Ansys Design model for the common carotid artery. Under large deformations, a stiffness of 800 kPa is measured, which is of the same order of magnitude as the Young's modulus of collagen fibers. Areas of maximum deformation were highlighted near bifurcations. This study is a first step towards patient-specific characterization of the mechanical properties of the facial vessels. The method is currently being applied to patients suffering from facial vascular malformations and to patients scheduled for facial reconstruction. Information on the blood flow velocity as well as on the vessel anatomy and deformability will be key to improving surgical planning for such vascular pathologies.Keywords: identification, mechanical properties, arterial walls, MRI measurements, numerical simulations
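To make the inverse-analysis idea concrete, the sketch below fits an effective wall stiffness by minimizing the mismatch between a measured lumen-area waveform and the area predicted by a forward model. In the study the forward model is the Ansys fluid-structure-interaction simulation; here a closed-form thin-walled-tube approximation stands in for it, and all dimensions, pressures, and the noise level are hypothetical.

```python
# Minimal sketch (not the authors' pipeline) of the inverse identification step:
# an effective wall stiffness is fitted by minimizing the mismatch between the
# MRI-measured lumen-area waveform and the area predicted by a forward model.
# A linearized thin-walled-tube relation stands in for the full FSI solver;
# all numbers below are hypothetical.

import numpy as np
from scipy.optimize import minimize_scalar


def forward_area(pressure_pa: np.ndarray, stiffness_pa: float,
                 r0_m: float = 4e-3, h_m: float = 0.6e-3) -> np.ndarray:
    """Predict lumen area from pressure for a linearized thin-walled tube:
    dr/r0 ~ p * r0 / (E * h). A stand-in for the full FSI simulation."""
    radius = r0_m * (1.0 + pressure_pa * r0_m / (stiffness_pa * h_m))
    return np.pi * radius ** 2


def identify_stiffness(pressure_pa: np.ndarray, measured_area_m2: np.ndarray) -> float:
    """Least-squares fit of the effective wall stiffness to the measured areas."""
    def cost(e_pa: float) -> float:
        return float(np.sum((forward_area(pressure_pa, e_pa) - measured_area_m2) ** 2))
    result = minimize_scalar(cost, bounds=(1e4, 1e7), method="bounded")
    return result.x


if __name__ == "__main__":
    # Synthetic "measurement": a pulsatile pressure waveform and matching noisy areas
    t = np.linspace(0.0, 1.0, 50)
    pressure = 10_000 + 5_000 * np.sin(2 * np.pi * t) ** 2      # Pa, hypothetical
    true_e = 8e5                                                 # 800 kPa
    rng = np.random.default_rng(0)
    measured = forward_area(pressure, true_e) * (1 + 0.01 * rng.standard_normal(t.size))
    print(f"identified stiffness: {identify_stiffness(pressure, measured) / 1e3:.0f} kPa")
```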
Procedia PDF Downloads 3192852 Precise Identification of Clustered Regularly Interspaced Short Palindromic Repeats-Induced Mutations via Hidden Markov Model-Based Sequence Alignment
Authors: Jingyuan Hu, Zhandong Liu
Abstract:
CRISPR genome editing technology has transformed molecular biology by accurately targeting and altering an organism's DNA. Despite the state-of-the-art precision of CRISPR genome editing, imprecise mutation outcomes and off-target effects present considerable risks, potentially leading to unintended genetic changes. Targeted deep sequencing, combined with bioinformatics sequence alignment, can detect such unwanted mutations. Nevertheless, the classical Needleman-Wunsch (NW) algorithm may produce false alignment outcomes, resulting in inaccurate mutation identification. The key to precisely identifying CRISPR-induced mutations lies in determining optimal parameters for the sequence alignment algorithm. Hidden Markov models (HMMs) are ideally suited for this task, offering flexibility across CRISPR systems by leveraging forward-backward algorithms for parameter estimation. In this study, we introduce CRISPR-HMM, a statistical software package for precisely calling CRISPR-induced mutations. We demonstrate that the software significantly improves the precision of identifying CRISPR-induced mutations compared to NW-based alignment, thereby enhancing the overall understanding of the CRISPR gene-editing process.Keywords: CRISPR, HMM, sequence alignment, gene editing
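For context, the sketch below implements the classical Needleman-Wunsch global alignment that CRISPR-HMM is compared against. It uses fixed match/mismatch/gap scores; the abstract's argument is precisely that such fixed parameters can misplace indels around cut sites, whereas a pair-HMM estimates system-specific parameters via the forward-backward algorithm. The scores and sequences here are illustrative and not taken from the study.

```python
# Illustrative sketch (not the CRISPR-HMM code): classical Needleman-Wunsch global
# alignment with fixed match/mismatch/gap scores, i.e. the NW baseline the abstract
# compares against. Scores and sequences are arbitrary examples.

def needleman_wunsch(ref: str, read: str,
                     match: int = 2, mismatch: int = -1, gap: int = -2):
    """Return (score, aligned_ref, aligned_read) for a global alignment."""
    n, m = len(ref), len(read)
    # Dynamic-programming matrix with gap penalties along the borders
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    # Fill the matrix
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if ref[i - 1] == read[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    # Trace back one optimal path
    a_ref, a_read, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + \
                (match if ref[i - 1] == read[j - 1] else mismatch):
            a_ref.append(ref[i - 1]); a_read.append(read[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            a_ref.append(ref[i - 1]); a_read.append("-"); i -= 1
        else:
            a_ref.append("-"); a_read.append(read[j - 1]); j -= 1
    return score[n][m], "".join(reversed(a_ref)), "".join(reversed(a_read))


if __name__ == "__main__":
    # A read carrying a small deletion near a hypothetical cut site
    s, aligned_ref, aligned_read = needleman_wunsch("ACGTTGCAGT", "ACGTGCAGT")
    print(s)
    print(aligned_ref)
    print(aligned_read)
```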
Procedia PDF Downloads 542851 Giftedness Cloud Model: A Psychological and Ecological Vision of Giftedness Concept
Authors: Rimeyah H. S. Almutairi, Alaa Eldin A. Ayoub
Abstract:
The aim of this study was to identify empirical and theoretical studies exploring giftedness theories and identification, in order to assess and synthesize the mechanisms, outcomes, and impacts of gifted identification models. We thus sought to provide an evidence-informed answer to how current giftedness theories work and how effective they are, with the goal of developing a model that incorporates the advantages of existing models and avoids their disadvantages as much as possible. We conducted a systematic literature review (SLR). The disciplined analysis resulted in a final sample of 30 appropriate studies. The results indicated that: (a) there is no uniform and consistent definition of giftedness; (b) researchers use several inconsistent criteria to identify gifted individuals; and (c) the detection of talent is largely limited to early ages, with obvious neglect of adults. This study contributes to the development of the Giftedness Cloud Model (GCM), defined as a model that attempts to interpret giftedness within an interactive psychological and ecological framework. GCM aims to help talented individuals reach the core of giftedness and manifest their talent in creative productivity or invention. In addition, GCM suggests classifying giftedness into four levels: mastery, excellence, creative productivity, and manifestation. GCM also presents an approach for distinguishing between talent and giftedness.Keywords: giftedness cloud model, talent, systematic literature review, giftedness concept
Procedia PDF Downloads 1672850 Identification System for Grading Banana in Food Processing Industry
Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan
Abstract:
In the food industry, high-quality production is required within a limited time to meet societal demand. In this research work, we have developed a model that can replace the human operator, whose production output is low and whose decisions are slow owing to individual differences in judging whether a banana is defective or healthy. This model can perform the visual inspection task of human operators in deciding whether a banana is defective or healthy for food production. The research work is divided into two phases. The first phase is image processing, where several image processing techniques, such as colour conversion, edge detection, thresholding, and morphological operations, were employed to extract features for training and testing the network. These features were then used in the second phase, the classification phase, where a multilayer perceptron trained with the backpropagation algorithm was employed. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained, and the processing time was short, which makes the system suitable for use in the food industry.Keywords: banana, food processing, identification system, neural network
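A minimal sketch of this two-phase pipeline is given below: coarse colour and defect-area features are extracted from an image (phase one), and a backpropagation-trained multilayer perceptron classifies the fruit as defective or healthy (phase two). This is not the authors' code; the feature definitions, thresholds, network size, and the synthetic image generator are hypothetical stand-ins for the real dataset and feature set.

```python
# Minimal sketch (not the authors' implementation) of the two-phase grading pipeline:
# phase 1 extracts simple colour/defect features, phase 2 trains an MLP with
# backpropagation to separate defective from healthy bananas.
# Feature definitions, thresholds, and the toy image generator are hypothetical.

import numpy as np
from sklearn.neural_network import MLPClassifier


def extract_features(rgb: np.ndarray) -> np.ndarray:
    """Phase 1: colour conversion + thresholding to get coarse defect features."""
    rgb = rgb.astype(float) / 255.0
    gray = rgb.mean(axis=2)                              # crude colour conversion
    dark_ratio = float((gray < 0.3).mean())              # dark-spot (defect) area ratio
    yellowness = float((rgb[..., 0] + rgb[..., 1] - 2 * rgb[..., 2]).mean() / 2)
    # crude edge density via finite differences (stand-in for a real edge detector)
    edges = np.abs(np.diff(gray, axis=0)).mean() + np.abs(np.diff(gray, axis=1)).mean()
    return np.array([dark_ratio, yellowness, float(edges)])


def synthetic_banana(defective: bool, rng: np.random.Generator) -> np.ndarray:
    """Toy 64x64 'banana' image: yellowish background, dark blotches if defective."""
    img = np.full((64, 64, 3), [210, 190, 60], dtype=np.uint8)
    img = img + rng.integers(-10, 10, img.shape).astype(np.int16)
    if defective:
        for _ in range(rng.integers(3, 8)):
            r, c = rng.integers(5, 59, 2)
            img[r - 4:r + 4, c - 4:c + 4] = 40            # dark defect blotch
    return np.clip(img, 0, 255).astype(np.uint8)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = [], []
    for label in (0, 1) * 100:                            # 0 = healthy, 1 = defective
        X.append(extract_features(synthetic_banana(bool(label), rng)))
        y.append(label)
    X, y = np.array(X), np.array(y)
    # Phase 2: backpropagation-trained MLP classifier
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X[:150], y[:150])
    print("held-out accuracy:", clf.score(X[150:], y[150:]))
```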
Procedia PDF Downloads 4712849 Studying the Simultaneous Effect of Petroleum and DDT Pollution on the Geotechnical Characteristics of Sands
Authors: Sara Seyfi
Abstract:
DDT and petroleum contamination in coastal sand alters the physical and mechanical properties of the contaminated soils. This article aims to understand the effects of DDT and petroleum pollution on the geotechnical characteristics of three sand groups: sand, silty sand, and clayey sand. First, previous studies on the topic are reviewed. In the initial stage of the tests, the sands used (sand, silty sand, clayey sand) are characterized by FTIR, µ-XRF, and SEM methods. The geotechnical characteristics of these sand groups, including density, permeability, shear strength, compaction, and plasticity, are then investigated using a sand cone test, head permeability test, vane shear test, strain gauge penetrometer, and plastic limit test. The sand groups are artificially contaminated with petroleum substances at 1, 2, 4, 8, 10, and 12% by weight. In a separate experiment, 2, 4, 8, 12, 16, and 20 mg/liter of DDT are added to the sand groups. Geotechnical characterization and identification analyses are performed on the contaminated samples. In the final tests, the above amounts of petroleum and DDT pollution are added simultaneously to the sand groups, and the identification and measurement procedures are repeated. The results showed that petroleum contamination reduced the optimum moisture content, permeability, and plasticity of all samples, except the plasticity of silty sand, which increased at 1-4% petroleum content and decreased at 8-12%. The dry density of sand and clayey sand increased, but that of silty sand decreased. The shear strength of sand and silty sand increased, but that of clayey sand decreased. DDT contamination increased the maximum dry density and decreased the permeability of all samples; it also reduced the optimum moisture content of the sand. Under DDT contamination, the shear strength of silty sand and clayey sand decreased, the plasticity of clayey sand increased, and that of silty sand decreased. The simultaneous effect of petroleum and DDT pollution on the maximum dry density of sand and clayey sand was synergistic, while it was antagonistic for the plasticity of clayey sand and silty sand. Antagonism was also observed for the optimum moisture content of sand and for the shear strength of silty sand and clayey sand. In the other cases, neither synergy nor antagonism was observed.Keywords: DDT contamination, geotechnical characteristics, petroleum contamination, sand
Procedia PDF Downloads 51