Search results for: incidental information processing

13198 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using an SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping, saving an X-ray spectrum for each pixel or each segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process, and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
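
As an illustration of the scripted tabular export described above, a minimal sketch is shown below. It assumes a hypothetical HDF5 layout with a compound "particles" table; the field names and file structure are invented stand-ins, not the actual TIMA open format.

```python
# Minimal sketch of post-processing an automated-mineralogy dataset.
# ASSUMPTIONS: the data sit in HDF5 with a compound "particles" table;
# the group and field names below are hypothetical, not TESCAN TIMA's format.
import csv
import h5py

with h5py.File("sample_block.tima.h5", "r") as f:       # hypothetical file
    particles = f["particles"]                          # hypothetical table
    with open("particles.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["id", "mineral", "area_um2", "fe_wt_pct"])
        for pid, mineral, area, fe in zip(particles["id"],
                                          particles["mineral"],
                                          particles["area_um2"],
                                          particles["fe_wt_pct"]):
            writer.writerow([pid, mineral.decode(), area, fe])
```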

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 109
13197 Strategies of Risk Management for Smallholder Farmers in South Africa: A Case Study on Pigeonpea (Cajanus cajan) Production

Authors: Sanari Chalin Moriri, Kwabena Kingsley Ayisi, Alina Mofokeng

Abstract:

Dryland smallholder farmers in South Africa are vulnerable to many kinds of risks, which negatively affect crop productivity and profit. Pigeonpea is a leguminous, multipurpose crop that provides food, fodder, and wood for smallholder farmers. The majority of these farmers still grow pigeonpea from traditional unimproved seeds, which comprise a mixture of genotypes. The objectives of the study were to identify the key risk factors that affect pigeonpea productivity and to develop management strategies to alleviate those risk factors in pigeonpea production. The study was conducted in six municipalities across two provinces (Limpopo and Mpumalanga) of South Africa during the 2020/2021 growing season. A non-probability sampling method using purposive and snowball sampling techniques was used to collect data from the farmers through a structured questionnaire. A total of 114 pigeonpea producers were interviewed individually. Key stakeholders in each municipality were also identified, invited, and interviewed to verify the information given by the farmers. The collected data were analysed using SPSS statistical software, version 25. The study found that the majority of farmers affected by risk factors were women, subsistence farmers, and elderly farmers, resulting in low food production. Drought, unavailability of improved pigeonpea seeds for planting, limited access to information, and lack of processing equipment were found to be the main risk factors contributing to low crop productivity in farmers' fields. More than 80% of farmers lack knowledge of crop improvement and of the processing techniques needed to secure high prices during the crop off-season. Market availability, pricing, and the incidence of pests and diseases were found to be minor risk factors, triggered by the major risk factors. The minor risk factors can be corrected only if the major risk factors are first given the necessary attention. About 10% of the farmers were found to use the crop as a mulch to reduce soil temperatures and to improve soil fertility. The study revealed that most of the farmers were unaware of its uses as fodder, mulch, and medicine, for nitrogen fixation, and more. The risk of frequent drought in the dry areas of South Africa, where farmers depend solely on rainfall, poses a serious threat to crop productivity. The majority of these risk factors are driven by climate change, as erratic, low rainfall combined with extreme temperatures threatens food security, water, and the environment. The use of drought-tolerant, multipurpose legume crops such as pigeonpea, access to new information, provision of processing equipment, and support from all stakeholders will help in addressing food security for smallholder farmers. Policies should be revisited to address the prevailing risk factors faced by farmers and should involve farmers in addressing them. Awareness should be prioritized in promoting the crop to improve its production and commercialization in the dryland farming system of South Africa.

Keywords: management strategies, pigeonpea, risk factors, smallholder farmers

Procedia PDF Downloads 213
13196 Relations of Progression in Cognitive Decline with Initial EEG Resting-State Functional Network in Mild Cognitive Impairment

Authors: Chia-Feng Lu, Yuh-Jen Wang, Yu-Te Wu, Sui-Hing Yan

Abstract:

This study aimed at investigating whether the functional brain networks constructed using the initial EEG (obtained when patients first visited the hospital) correlate with the progression of cognitive decline, calculated as the change in mini-mental state examination (MMSE) scores between the latest and initial examinations. We integrated the time–frequency cross mutual information (TFCMI) method, to estimate the EEG functional connectivity between cortical regions, with network analysis based on graph theory, to investigate the organization of functional networks in amnestic mild cognitive impairment (aMCI). Our findings suggest that a highly integrated functional network with sufficient connection strengths, dense connections between local regions, and high efficiency in processing information at the initial stage may result in a better prognosis of subsequent cognitive function in aMCI. In conclusion, functional connectivity can be a useful biomarker to assist in predicting cognitive decline in aMCI.
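
As a rough sketch of this pipeline, assuming plain mutual information between channel signals as a simplified stand-in for TFCMI (which additionally operates on time-frequency representations), a connectivity matrix and the graph metrics named above could be computed as follows:

```python
# Simplified sketch of EEG functional-network construction and graph metrics.
# NOTE: plain mutual information replaces TFCMI here; channel data are
# synthetic placeholders with shape (n_channels, n_samples).
import numpy as np
import networkx as nx
from sklearn.feature_selection import mutual_info_regression

def connectivity_matrix(eeg):
    n = eeg.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi = mutual_info_regression(eeg[i].reshape(-1, 1), eeg[j])[0]
            W[i, j] = W[j, i] = mi
    return W

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 1000))       # placeholder for real recordings
eeg[1:] += 0.5 * eeg[0]                    # inject shared signal so MI is nonzero

W = connectivity_matrix(eeg)
W[W < np.percentile(W, 75)] = 0            # keep only the strongest connections
G = nx.from_numpy_array(W)
print("global efficiency:", nx.global_efficiency(G))
print("mean clustering:  ", nx.average_clustering(G, weight="weight"))
```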

Keywords: cognitive decline, functional connectivity, MCI, MMSE

Procedia PDF Downloads 383
13195 Design and Implementation of an Effective Machine Learning Approach to Crime Prediction and Prevention

Authors: Ashish Kumar, Kaptan Singh, Amit Saxena

Abstract:

Today, it is believed that crime has the greatest impact on a person's ability to progress financially and personally. Identifying places where individuals should not go is crucial for preventing crime and is one of the key considerations. As society and technology have advanced significantly, so have crimes and the harm they wreak; when people are concentrated in one place and changes happen quickly, crime is even harder to prevent. Because of this, many crime prevention strategies have been embraced as a component of smart-city development in numerous cities. However, crimes can occur anywhere; all that is required is to identify the pattern of their occurrences, which will help to lower the crime rate. In this paper, a crime-related analysis has been performed on information collected from all over India that can be accessed from anywhere. The purpose of this paper is to investigate the relationship between several factors and India's crime rate. The review covers every state of India and its associated regions for the period 2001-2014. However, various classes of offences show marginally different trends over the years.
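
A hedged sketch of the kind of classifier the paper's keywords suggest (random forest, alongside decision tree and k-NN) is shown below; the file name, feature columns, and label are hypothetical placeholders, not the authors' actual dataset schema.

```python
# Sketch: train a random-forest crime classifier on hypothetical features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("india_crime_2001_2014.csv")            # hypothetical file
X = df[["state_id", "year", "population_density", "unemployment_rate"]]
y = df["crime_category"]                                  # hypothetical label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```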

Keywords: K-nearest neighbor, random forest, decision tree, pre-processing

Procedia PDF Downloads 91
13194 Timescape-Based Panoramic View for Historic Landmarks

Authors: H. Ali, A. Whitehead

Abstract:

Providing a panoramic view of famous landmarks around the world offers artistic and historic value for historians, tourists, and researchers. Exploring the history of famous landmarks by presenting a comprehensive view of a temporal panorama merged with geographical and historical information presents a unique challenge: dealing with images that span a long period, from the 1800s up to the present. This work presents the concept of a temporal panorama through a timeline display of aligned historic and modern images for many famous landmarks. Building this panorama requires a collection of hundreds of thousands of landmark images from the Internet, comprising both historic images and modern images of the digital age. These images have to be classified for subset selection to keep the most suitable images, those that chronologically document a landmark's history. Processing historic images captured using older analog technology under a variety of capture conditions presents a significant challenge when they have to be used alongside modern digital images. Successful processing of historic images to prepare them for the subsequent steps of temporal panorama creation represents an active contribution to cultural heritage preservation, fulfilling one of UNESCO's goals in preserving and displaying famous worldwide landmarks.
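
A central step here is registering a historic photograph against a modern reference of the same landmark. The sketch below shows a generic feature-based registration pipeline in Python/OpenCV; the file names are placeholders, and the authors' exact method may differ.

```python
# Sketch: align a historic photo to a modern reference image using ORB
# features, brute-force matching, and a RANSAC-estimated homography.
import cv2
import numpy as np

hist = cv2.imread("landmark_1890.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical files
ref = cv2.imread("landmark_2020.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(4000)
k1, d1 = orb.detectAndCompute(hist, None)
k2, d2 = orb.detectAndCompute(ref, None)

matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
matches = sorted(matches, key=lambda m: m.distance)[:200]     # best matches

src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

aligned = cv2.warpPerspective(hist, H, (ref.shape[1], ref.shape[0]))
cv2.imwrite("landmark_1890_aligned.jpg", aligned)
```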

Keywords: cultural heritage, image registration, image subset selection, registered image similarity, temporal panorama, timescapes

Procedia PDF Downloads 165
13193 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications

Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley

Abstract:

Through the use of novel rapid processing techniques such as screen printing and near-infrared (NIR) radiative curing, the processing time for sintering nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes with NIR curing. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.

Keywords: batteries, energy, iron, nickel, storage

Procedia PDF Downloads 439
13192 Numerical Implementation and Testing of Fractioning Estimator Method for the Box-Counting Dimension of Fractal Objects

Authors: Abraham Terán Salcedo, Didier Samayoa Ochoa

Abstract:

This work presents a numerical implementation of a method for estimating the box-counting dimension of self-avoiding curves on a planar space, fractal objects captured on digital images; this method is named fractioning estimator. Classical methods of digital image processing, such as noise filtering, contrast manipulation, and thresholding, among others, are used in order to obtain binary images that are suitable for performing the necessary computations of the fractioning estimator. A user interface is developed for performing the image processing operations and testing the fractioning estimator on different captured images of real-life fractal objects. To analyze the results, the estimations obtained through the fractioning estimator are compared to the results obtained through other methods that are already implemented on different available software for computing and estimating the box-counting dimension.
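
For comparison with the methods mentioned above, here is a minimal Python sketch of the classical box-counting estimate, the quantity the fractioning estimator also targets: count occupied boxes at several scales on a binary image and fit the log-log slope.

```python
# Sketch: classical box-counting dimension of a binary image.
import numpy as np

def box_counting_dimension(binary_img, sizes=(2, 4, 8, 16, 32, 64)):
    counts = []
    for s in sizes:
        h, w = binary_img.shape
        trimmed = binary_img[: h - h % s, : w - w % s]   # tile evenly
        grid = trimmed.reshape(trimmed.shape[0] // s, s,
                               trimmed.shape[1] // s, s)
        counts.append(grid.any(axis=(1, 3)).sum())       # occupied boxes at scale s
    # dimension = slope of log N(s) against log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

img = np.zeros((256, 256), dtype=bool)
rr = np.arange(256)
img[rr, rr] = True                                       # a straight line, D ~ 1
print("estimated dimension:", round(box_counting_dimension(img), 2))
```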

Keywords: box-counting, digital image processing, fractal dimension, numerical method

Procedia PDF Downloads 83
13191 A Pattern Practice for Awareness Education on Information Security: Information Security Project

Authors: Fatih Apaydin

Abstract:

Education technology is an area that constantly changes and creates innovations. As an inevitable part of these changing circumstances, societies with a tendency toward improvement keep up with innovations by using the methods and strategies designed for education technology. At this point, education technology has taken on the responsibility of helping individuals improve themselves and of teaching effective methods by filling the gaps between theoretical information, information security, and practice. Technology has come to the core of our lives, its importance growing day by day, and it has consolidated its position in computer-based environments. As a result, individuals have to be made ready for technological innovations and equipped with computer-based skills, knowledge, abilities, and attitudes. However, it is today quite hard to secure and reinforce this information. Information obtained illegally harms society in every respect, especially education. This study examines how, and to what extent, innovative appliances such as computers should be used in computer-based education, and considers the information security of these appliances. As the use of computers becomes ever more prevalent in our country, neither education nor computers will become outdated, so how computer-based education affects our lives, and the study of information security for this type of education, are important topics.

Keywords: computer, information security, education, technology, development

Procedia PDF Downloads 594
13190 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

The application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has brought heightened attention to the security risks facing healthcare data and to the corresponding protection measures, leading to intensified analysis and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods like federated learning, cryptographic techniques, differential privacy methods, and hybrid methods are discussed, together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
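
To make one of the named methods concrete, here is a minimal sketch of differential privacy via the Laplace mechanism, releasing a noisy count over patient records; the record fields are invented for illustration.

```python
# Sketch: differentially private count query (Laplace mechanism).
# epsilon is the privacy budget: smaller epsilon = more noise = more privacy.
import numpy as np

def dp_count(records, predicate, epsilon=0.5, rng=None):
    rng = rng or np.random.default_rng()
    true_count = sum(1 for r in records if predicate(r))
    sensitivity = 1.0                 # one patient changes a count by at most 1
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Hypothetical records; field names are illustrative only.
patients = [{"age": 70, "diabetic": True}, {"age": 54, "diabetic": False},
            {"age": 61, "diabetic": True}]
print(dp_count(patients, lambda r: r["diabetic"], epsilon=0.5))
```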

Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)

Procedia PDF Downloads 93
13189 Mechanical Analysis and Characterization of Friction Stir Processed Aluminium Alloy

Authors: Jaswinder Kumar, Kulbir Singh Sandhu

Abstract:

Friction stir processing (FSP) is a solid-state surface processing technique. A single-pass FSP was performed on an aluminium alloy at different tool rotational speeds using a tool with a cylindrical threaded pin profile. The effect of these parameters on tribological properties was studied. Wear resistance was found to increase from the base metal to the single-pass FSP sample. The results revealed that with an increase in tool rotational speed, the wear rate increases: high heat generation causes matrix softening, which results in an increased wear rate; on the other hand, high heat generation also leads to coarse grains, which likewise affects tribological properties. Furthermore, microstructure results showed that the FSPed alloy has a more refined grain structure than the base material, which may result in enhanced hardness and wear resistance after FSP.

Keywords: friction stir processing, aluminium alloy, microhardness, microstructure

Procedia PDF Downloads 109
13188 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement

Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova

Abstract:

One of the essential steps of avian song processing is signal filtering. Currently, the standard filtering methods are the Mel Bank Filter or a linear filter distribution. In this article, a new type of bank filter called the Bird-Adapted Filter is introduced, whereby the signal filtering is modifiable based upon a new mathematical description of audiograms for a particular bird species or order, named the Avian Audiogram Unified Equation. With this method, filters may be deliberately distributed by frequency: the filters are more concentrated in bands of higher sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters is demonstrated for automatic individual recognition of the chiffchaff (Phylloscopus collybita). The average Equal Error Rate (EER) was 16.23% for the linear bank filter and 18.71% for the Mel Bank Filter, while the Bird-Adapted Filter gave 14.29% and the Bird-Adapted Filter with 1/3 modification gave 12.95%. This approach would be useful in practical automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
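
The core idea, placing filter centers where the species hears best instead of on the mel scale, can be sketched as follows. The Gaussian sensitivity curve below is a stand-in; the paper's Avian Audiogram Unified Equation is not reproduced here.

```python
# Sketch: distribute filter-bank center frequencies by cumulative hearing
# sensitivity, so each filter covers equal "sensitivity mass".
import numpy as np

def audiogram_filter_centers(f_lo, f_hi, n_filters, sensitivity):
    f = np.linspace(f_lo, f_hi, 4096)
    s = sensitivity(f)
    cdf = np.cumsum(s) / np.sum(s)                 # warp by cumulative sensitivity
    targets = np.linspace(0, 1, n_filters + 2)[1:-1]
    return np.interp(targets, cdf, f)              # denser where sensitivity is high

# Hypothetical audiogram: peak sensitivity near 4 kHz for a small passerine.
sens = lambda f: np.exp(-0.5 * ((f - 4000.0) / 1500.0) ** 2)
print(audiogram_filter_centers(500, 10000, 12, sens).round(0))
```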

Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank

Procedia PDF Downloads 387
13187 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition algorithm (EEMD), the Discrete Wavelet Transform (DWT), and Mutual Information (MI) were employed as a hybrid pre-processing approach coupled with a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI for use in the multi-station model. In the proposed feature selection method, some uninformative sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
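
A compressed sketch of this chain is shown below: decompose an upstream-gauge series with DWT, score the sub-series against the target discharge with mutual information, and feed the retained sub-series to a kernel SVM regressor. scikit-learn's SVR stands in for LSSVM here, and the data are synthetic placeholders.

```python
# Sketch: DWT decomposition + MI-based sub-series selection + SVM regression.
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(1)
t = np.arange(512)
upstream = np.sin(2 * np.pi * t / 64) + 0.3 * rng.standard_normal(512)
target = np.roll(upstream, 5) + 0.1 * rng.standard_normal(512)  # lagged response

# Decompose, then reconstruct one sub-series per wavelet level.
coeffs = pywt.wavedec(upstream, "db4", level=3)
subseries = []
for k in range(len(coeffs)):
    kept = [c if i == k else np.zeros_like(c) for i, c in enumerate(coeffs)]
    subseries.append(pywt.waverec(kept, "db4")[: len(t)])
X = np.column_stack(subseries)

# Keep only informative sub-series (drop the uninformative ones, as in the paper).
mi = mutual_info_regression(X, target)
X_sel = X[:, mi > 0.1 * mi.max()]

model = SVR(kernel="rbf").fit(X_sel[:400], target[:400])
print("test R^2:", model.score(X_sel[400:], target[400:]))
```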

Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, ensemble empirical mode decomposition, multi-station modeling

Procedia PDF Downloads 175
13186 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof for digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods and is hence adopted for the embedding process. An optional choice of encrypting the text watermark before embedding is also suggested (in case required by some applications), where the text can be encrypted using any enciphering technique, adding more difficulty for attackers. Experiments showed an embedding speed of more than double that of other considered systems (such as the least significant bit method and separate color code methods), and a fairly acceptable peak signal-to-noise ratio (PSNR) with low mean-square-error values for watermarking purposes.
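
An illustrative sketch of YIQ-domain embedding (not the authors' exact scheme) is shown below: convert RGB to YIQ, perturb the Y channel at key-seeded random pixels according to the message bits, and convert back. Real schemes trade embedding strength against PSNR.

```python
# Sketch: embed message bits as noise in the Y channel of a YIQ image.
import numpy as np

RGB2YIQ = np.array([[0.299, 0.587, 0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523, 0.312]])

def embed(rgb, message, key=1234, strength=2.0):
    yiq = rgb.astype(np.float64) @ RGB2YIQ.T
    bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))
    flat_y = yiq[..., 0].reshape(-1)
    idx = np.random.default_rng(key).choice(flat_y.size, bits.size, replace=False)
    flat_y[idx] += strength * (2.0 * bits - 1.0)     # +/- strength per bit
    yiq[..., 0] = flat_y.reshape(yiq.shape[:2])
    return np.clip(yiq @ np.linalg.inv(RGB2YIQ).T, 0, 255).astype(np.uint8)

img = np.random.default_rng(0).integers(0, 256, (64, 64, 3), dtype=np.uint8)
watermarked = embed(img, "owner: Alice")             # placeholder message
```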

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 143
13185 Information Overload, Information Literacy and Use of Technology by Students

Authors: Elena Krelja Kurelović, Jasminka Tomljanović, Vlatka Davidović

Abstract:

The development of web technologies and mobile devices makes creating, accessing, using, and sharing information, and communicating with each other, simpler every day. However, as the amount of information constantly increases, it is becoming harder to effectively organize and find quality information, despite the availability of web search engines and filtering and indexing tools. Although digital technologies have an overall positive impact on students' lives, frequent use of these technologies and of digital media enriched with dynamic hypertext and hypermedia content, as well as multitasking and distractions caused by notifications, calls, or messages, can decrease the attention span and make thinking, memorizing, and learning more difficult, which can lead to stress and mental exhaustion. This is referred to as “information overload”, “information glut”, or “information anxiety”. The objective of this study is to determine whether students show signs of information overload and to identify the possible predictors. The research was conducted using a questionnaire developed for the purpose of this study. The results show that students frequently use technology (computers, gadgets, and digital media), while showing a moderate level of information literacy, and that they have sometimes experienced symptoms of information overload. According to the statistical analysis, a higher frequency of technology use and a lower level of information literacy are correlated with greater information overload. A multiple regression analysis confirmed that the combination of these two independent variables has statistically significant predictive capacity for information overload. Therefore, information science teachers should pay attention to improving the level of students' information literacy and educate students about the risks of excessive technology use.

Keywords: information overload, computers, mobile devices, digital media, information literacy, students

Procedia PDF Downloads 278
13184 A Review of Information Systems Development in Developing Countries

Authors: B. N. Asare, O. A. Ajigini

Abstract:

Information systems (IS) are highly important in the operation of private and public organisations in developing and developed countries. Developing countries are saddled with many project failures during the implementation of information systems, yet successful information systems are greatly needed in developing countries in order to enhance their economies. This paper is highly important in view of the high failure rate of information systems in developing countries, which needs to be reduced to minimally acceptable levels by means of recommended interventions. This paper centres on a review of IS development in developing countries. The paper presents evidence of IS successes and failures in developing countries and posits a model to address the IS failures. The proposed model can then be utilised by developing countries to reduce their IS project implementation failure rate. A comparison is drawn between IS development in developing countries and developed countries. The paper provides valuable information to assist in reducing IS failure and in developing IS models and theories on IS development for developing countries.

Keywords: developing countries, information systems, IS development, information systems failure, information systems success, information systems success model

Procedia PDF Downloads 380
13183 Bacteriophage Lysis of Physiologically Stressed Listeria monocytogenes in a Simulated Seafood Processing Environment

Authors: Geevika J. Ganegama Arachchi, Steve H. Flint, Lynn McIntyre, Cristina D. Cruz, Beatrice M. Dias-Wanigasekera, Craig Billington, J. Andrew Hudson, Anthony N. Mutukumira

Abstract:

In seafood processing plants, Listeria monocytogenes (L. monocytogenes) likely exists in a metabolically stressed state due to the nutrient-deficient environment; processing treatments such as heating, curing, drying, and freezing; and exposure to detergents and disinfectants. Stressed L. monocytogenes cells have been shown to be as pathogenic as unstressed cells. This study investigated the lytic efficacy of three phages (LiMN4L, LiMN4p, and LiMN17), previously characterized as virulent, against physiologically stressed cells of three seafood-borne L. monocytogenes strains (19CO9, 19DO3, and 19EO3). Physiologically compromised cells of the L. monocytogenes strains were prepared by aging cultures in Trypticase Soy Broth at 15±1°C for 72 h; heat-injuring cultures at 54±1 to 55±1°C for 40-60 min; starving cultures in Milli-Q water incubated at 25±1°C in darkness for three weeks; and salt-stressing cultures in 9% (w/v) NaCl at 15±1°C for 72 h. Low concentrations of physiologically compromised cells of the three L. monocytogenes strains were challenged in vitro with high titres of the three phages in separate experiments using Fish Broth medium (aqueous fish extract) at 15°C in order to mimic the environment of a seafood processing plant. Each phage, when present at ≈9 log10 PFU/ml, reduced late-exponential-phase cells of L. monocytogenes suspended in fish protein broth at ≈2-3 log10 CFU/ml to a non-detectable level (<10 CFU/ml). Each phage, when present at ≈8.5 log10 PFU/ml, reduced both heat-injured cells present at 2.5-3.6 log10 CFU/ml and starved cells, which showed a coccoid shape, present at ≈2-3 log10 CFU/ml to <10 CFU/ml after 30 min. The phages also reduced salt-stressed cells present at ≈3 log10 CFU/ml by >2 log10. L. monocytogenes (≈8 log10 CFU/ml) was reduced to below the detection limit (1 CFU/ml) by three successive phage infections over 16 h, indicating that the emergence of spontaneous phage resistance was infrequent. The three virulent phages showed high decontamination potential for physiologically stressed L. monocytogenes strains in seafood processing environments.

Keywords: physiologically stressed L. monocytogenes, heat injured, seafood processing environment, virulent phage

Procedia PDF Downloads 135
13182 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Earlier seen as a support function within an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data that would have been unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has given the oil and gas industry the leverage to store, manage, and process data in the most efficient way possible, thus deriving economic value in its day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, systems approach towards asset optimization and thus have functional information available at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset in the application tool SAS. The reason for using SAS for our analysis is that SAS provides an analytics-based framework to improve the uptime, performance, and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, maintenance problems can be predicted before they happen, and root causes can be determined in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 360
13181 Limbic Involvement in Visual Processing

Authors: Deborah Zelinsky

Abstract:

The retina filters millions of incoming signals into a smaller number of exiting optic nerve fibers that travel to different portions of the brain. Most of the signals are for eyesight (called "image-forming" signals). However, there are other, faster signals that travel elsewhere and are not directly involved with eyesight (called "non-image-forming" signals). This article centers on the neurons of the optic nerve connecting to parts of the limbic system. Eye care providers currently look at parvocellular and magnocellular processing pathways without realizing that those are part of an enormous "galaxy" of all the body systems. Lenses modify both non-image-forming and image-forming pathways, taking A.M. Skeffington's seminal work one step further. Almost 100 years ago, he described the Where am I (orientation), Where is It (localization), and What is It (identification) pathways. Now, among others, there is a How am I (animation) and a Who am I (inclination, motivation, imagination) pathway. Classic eye testing considers pupils and often assesses posture and motion awareness, but classical prescriptions often overlook limbic involvement in visual processing. The limbic system is composed of the hippocampus, amygdala, hypothalamus, and anterior nuclei of the thalamus. The optic nerve's limbic connections arise from the intrinsically photosensitive retinal ganglion cells (ipRGC) through the retinohypothalamic tract (RHT). There are two main hypothalamic nuclei with direct photic inputs: the suprachiasmatic nucleus and the paraventricular nucleus. Other hypothalamic nuclei connected with retinal function, including mood regulation, appetite, and glucose regulation, are the supraoptic nucleus and the arcuate nucleus. The retinohypothalamic tract is often overlooked when we prescribe eyeglasses. Each person is different, but the lenses we choose influence this fast processing, which affects each patient's aiming and focusing abilities. These signals arise from the ipRGC cells that were only discovered 20+ years ago and do not address the campana retinal interneurons that were only discovered 2 years ago. As eyecare providers, we are unknowingly altering such factors as lymph flow, glucose metabolism, appetite, and sleep cycles in our patients. It is important to know what we are prescribing as visual processing evaluations expand beyond 20/20 central eyesight.

Keywords: neuromodulation, retinal processing, retinohypothalamic tract, limbic system, visual processing

Procedia PDF Downloads 85
13180 Continuous Processing Approaches for Tunable Asymmetric Photochemical Synthesis

Authors: Amanda C. Evans

Abstract:

Enabling technologies such as continuous processing (CP) approaches can provide the tools needed to control and manipulate reactivities and transform chemical reactions into micro-controlled in-flow processes. Traditional synthetic approaches can be radically transformed by the application of CP, facilitating the pairing of chemical methodologies with technologies from other disciplines. CP supports sustainable processes that controllably generate reaction specificity utilizing supramolecular interactions. Continuous photochemical processing is an emerging field of investigation. The use of light to drive chemical reactivity is not novel, but the controlled use of specific and tunable wavelengths of light to selectively generate molecular structure under continuous processing conditions is an innovative approach towards chemical synthesis. This investigation focuses on the use of circularly polarized (cp) light as a sustainable catalyst for the CP generation of asymmetric molecules. Chiral photolysis has already been achieved under batch, solid-phase conditions: using synchrotron-sourced cp light, asymmetric photolytic selectivities of up to 4.2% enantiomeric excess (e.e.) have been reported. In order to determine the optimal wavelengths to use for irradiation with cp light for any given molecular building block, CD and anisotropy spectra for each building block of interest have been generated in two different solvents (water, hexafluoroisopropanol) across a range of wavelengths (130-400 nm). These spectra are being used to support a series of CP experiments using cp light to generate enantioselectivity.
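
The two figures of merit implicit in this abstract can be stated explicitly. These are standard definitions rather than equations taken from the paper: the enantiomeric excess (e.e.) quoted as 4.2%, and the anisotropy (Kuhn) factor g, which governs how strongly cp light discriminates between enantiomers; the attainable e.e. in asymmetric photolysis grows with g and with the extent of conversion.

```latex
% Standard definitions (not reproduced from the paper):
\mathrm{e.e.} = \frac{n_R - n_S}{n_R + n_S} \times 100\%,
\qquad
g = \frac{\Delta\varepsilon}{\varepsilon}
  = \frac{2\,(\varepsilon_\mathrm{L} - \varepsilon_\mathrm{R})}
         {\varepsilon_\mathrm{L} + \varepsilon_\mathrm{R}}
```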

Keywords: anisotropy, asymmetry, flow chemistry, active pharmaceutical ingredients

Procedia PDF Downloads 157
13179 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method

Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek

Abstract:

Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., Kolmogorov's scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases on the order of billions of solution points. Running big simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially with a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It stores the minimum amount of information required for the DSEM code to start in parallel, extracted from the mesh file, into text files (pre-files). It packs integer-type information in a stream binary format in pre-files that are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in such a way that each MPI rank acquires its information from the file in parallel. In the case of GPFS, within each computational node a single MPI rank reads data from the file, which is generated specifically for that computational node, and sends them to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory's Mira (GPFS), the National Center for Supercomputing Applications' Blue Waters (Lustre), the San Diego Supercomputer Center's Comet (Lustre), and UIC's Extreme (Lustre). The tests showed that one file per node is suited to GPFS, and parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for the calculation of the solution at every time step. For this, the code can make use of its own matrix math library, BLAS, Intel MKL, or ATLAS. This fact and the discontinuous nature of the method make the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed scalable and efficient performance of the code in parallel computing.
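
The Lustre-style startup read described above can be sketched in a few lines with mpi4py. The pre-file layout assumed here (a flat array of int32 values split evenly across ranks) is a simplification of the actual pre-file format.

```python
# Sketch: every MPI rank pulls its own slice of a packed integer pre-file
# with collective parallel MPI I/O (the Lustre strategy described above).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, nranks = comm.Get_rank(), comm.Get_size()

fh = MPI.File.Open(comm, "startup.pre", MPI.MODE_RDONLY)
total_bytes = fh.Get_size()
ints_per_rank = total_bytes // 4 // nranks        # int32 payload, even split

buf = np.empty(ints_per_rank, dtype=np.int32)
fh.Read_at_all(rank * ints_per_rank * 4, buf)     # collective, disjoint offsets
fh.Close()

print(f"rank {rank}/{nranks} read {buf.size} integers")
```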

Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow

Procedia PDF Downloads 133
13178 Major Constraints to Adoption of Improved Post-harvest Technologies among Smallholder Farmers in Developing Countries: A Systematic Review

Authors: Muganyizi Jonas Bisheko, G. Rejikumar

Abstract:

Reducing post-harvest losses could be a sustainable way to enhance the food and income security of smallholder farmers in developing countries. While various research institutions have come up with a number of innovative post-harvest technologies for reducing post-harvest losses, most of them have not been extensively adopted by smallholder farmers, and synthesized information about the major constraints on adoption is scarce. This study has been conducted to fill this gap and show the implications of the findings for future post-harvest research. The developed search strategy retrieved 2,201 studies; after excluding duplicates and screening titles, abstracts, and full articles, a total of 41 documents were identified. The major findings are: (i) there is an outstanding deficiency of systematic evidence on the effect of climate change, off-farm income, and sources of post-harvest information on the adoption of improved post-harvest technologies; (ii) there is very limited information on adoption constraints pertaining to policy, rules, and regulations; (iii) the literature on behavioral constraints associated with limited adoption of improved post-harvest technologies is very thin; (iv) most of the studies focused on post-harvest storage technologies (47%), followed by overall post-harvest management practices (25%), processing technologies (19%), and packaging technologies (3%), and much of the information concerned cereals (58%), especially maize (44%); and (v) geographically, Sub-Saharan Africa accounted for 79% of the reviewed interventions, while South Asia accounted for only 21%. The findings of this review are intended to guide post-harvest technologists and decision-makers in addressing the challenge of huge post-harvest losses.

Keywords: constraints, post-harvest loss, post-harvest technology, smallholder farmer

Procedia PDF Downloads 235
13177 Enterprise Security Architecture: Approaches and a Framework

Authors: Amir Mohtarami, Hadi Kandjani

Abstract:

The amount of business-critical information in enterprises is growing at an extraordinary rate, and the ability to catalog that information and properly protect it using traditional security mechanisms is not keeping pace. Alongside information technology (IT), information security needs a holistic view within the enterprise. In other words, a comprehensive architectural approach is required, focusing on the information itself: understanding what the data are, who owns them, and which business and regulatory policies should be applied to the information. Enterprise architecture frameworks provide useful tools to grasp the different dimensions of IT in organizations. Usually this is done through layered views of the IT architecture, but security has not received the requisite attention in these frameworks. In this paper, after a brief look at Enterprise Architecture (EA), we discuss the issue of security in the overall enterprise IT architecture. Due to the increasing importance of security, a rigorous EA program in an enterprise should be able to consider security architecture as an integral part of its processes and give a visible roadmap and blueprint for this aim.

Keywords: enterprise architecture, architecture framework, security architecture, information systems

Procedia PDF Downloads 704
13176 Standardized Description and Modeling Methods of Semiconductor IP Interfaces

Authors: Seongsoo Lee

Abstract:

IP reuse is an effective design methodology for modern SoC design, reducing effort and time. However, the description and modeling methods for IP interfaces differ from one IP designer to another. In this paper, standardized description and modeling methods for IP interfaces are proposed. The proposal consists of 11 items: IP information, model provision, data type, description level, interface information, port information, signal information, protocol information, modeling level, modeling information, and source file. The proposed description and modeling methods enable easy understanding, simulation, verification, and modification during IP reuse.
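
As a purely illustrative sketch of how the 11 items might be encoded in a machine-readable record (the schema and all values below are invented, not the paper's actual format):

```python
# Hypothetical encoding of the 11 proposed description items for one IP block.
ip_interface = {
    "ip_information":        {"name": "uart_core", "vendor": "ExampleCo", "version": "1.2"},
    "model_provision":       ["RTL", "behavioral"],
    "data_type":             "std_logic_vector",
    "description_level":     "signal",
    "interface_information": {"bus": "APB", "clock_domain": "pclk"},
    "port_information":      [{"name": "pclk", "dir": "in", "width": 1},
                              {"name": "prdata", "dir": "out", "width": 32}],
    "signal_information":    {"reset": "active_low"},
    "protocol_information":  {"handshake": "ready/valid"},
    "modeling_level":        "cycle-accurate",
    "modeling_information":  {"language": "SystemVerilog"},
    "source_file":           "uart_core_if.sv",
}
```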

Keywords: interface, standardization, description, modeling, semiconductor IP

Procedia PDF Downloads 502
13175 Algorithm for Path Recognition in-between Tree Rows for Agricultural Wheeled-Mobile Robots

Authors: Anderson Rocha, Pedro Miguel de Figueiredo Dinis Oliveira Gaspar

Abstract:

Machine vision has been widely used in recent years in agriculture, as a tool to promote the automation of processes and increase the levels of productivity. The aim of this work is the development of a path recognition algorithm based on image processing to guide a terrestrial robot in-between tree rows. The proposed algorithm was developed using the software MATLAB, and it uses several image processing operations, such as threshold detection, morphological erosion, histogram equalization and the Hough transform, to find edge lines along tree rows on an image and to create a path to be followed by a mobile robot. To develop the algorithm, a set of images of different types of orchards was used, which made possible the construction of a method capable of identifying paths between trees of different heights and aspects. The algorithm was evaluated using several images with different characteristics of quality and the results showed that the proposed method can successfully detect a path in different types of environments.
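
For readers outside MATLAB, a rough Python/OpenCV translation of the described pipeline might look as follows; the parameter values and the left/right split rule are illustrative simplifications of the authors' method.

```python
# Sketch: equalize, threshold, erode, then fit tree-row lines with the
# probabilistic Hough transform and derive a centre path between the rows.
import cv2
import numpy as np

img = cv2.imread("orchard_row.jpg")                   # hypothetical image
gray = cv2.equalizeHist(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
mask = cv2.erode(mask, np.ones((5, 5), np.uint8), iterations=2)

lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)
if lines is None:
    raise SystemExit("no line segments found")

# Split segments into left/right rows by image half; take the midline between
# the mean row positions as the robot's path (simplified rule).
mid_x = img.shape[1] / 2
left = [l[0] for l in lines if (l[0][0] + l[0][2]) / 2 < mid_x]
right = [l[0] for l in lines if (l[0][0] + l[0][2]) / 2 >= mid_x]
path_x = (np.mean([s[0] for s in left]) + np.mean([s[0] for s in right])) / 2
print("suggested path x-coordinate at image base:", path_x)
```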

Keywords: agricultural mobile robot, image processing, path recognition, Hough transform

Procedia PDF Downloads 146
13174 Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory

Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock

Abstract:

Detecting subjectively biased statements is a vital task. This kind of bias, when present in text or other information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflict amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks, such as sentiment analysis, opinion identification, and bias neutralization. Having a system that can adequately detect subjectivity in text will significantly boost research in the above-mentioned areas. It can also come in handy for platforms like Wikipedia, where the use of neutral language is important. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, we can solve complex AI problems, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as the upstream model. BERT by itself can be used as a classifier; however, in this study, we use BERT as a data preprocessor as well as an embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network incorporating an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model using the Wiki Neutrality Corpus (WNC), compiled from Wikipedia edits that removed various biased instances from sentences, as a benchmark dataset, on which we also compare our model to existing approaches. Experimental analysis indicates improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
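
A compact PyTorch sketch of this architecture follows, assuming illustrative hyperparameters (hidden size, pooling details) rather than the authors' exact configuration: frozen BERT supplies token embeddings, a BiLSTM with additive attention pools them, and a linear head predicts biased vs. neutral.

```python
# Sketch: BERT embeddings -> BiLSTM -> attention pooling -> classifier head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertBiLSTMAttention(nn.Module):
    def __init__(self, hidden=256, n_classes=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)           # additive attention scores
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, input_ids, attention_mask):
        with torch.no_grad():                          # BERT as embedding generator
            emb = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(emb)                          # (B, T, 2*hidden)
        scores = self.att(torch.tanh(h)).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        alpha = torch.softmax(scores, dim=1).unsqueeze(-1)
        return self.head((alpha * h).sum(dim=1))       # attention-weighted pooling

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tok(["The senator bravely defended the bill."], return_tensors="pt")
logits = BertBiLSTMAttention()(batch["input_ids"], batch["attention_mask"])
```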

Keywords: subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing

Procedia PDF Downloads 130
13173 Selection of Pichia kudriavzevii Strain for the Production of Single-Cell Protein from Cassava Processing Waste

Authors: Phakamas Rachamontree, Theerawut Phusantisampan, Natthakorn Woravutthikul, Peerapong Pornwongthong, Malinee Sriariyanun

Abstract:

A total of 115 yeast strains isolated from local cassava processing wastes were measured for crude protein content. Among these strains, strain MSY-2 possessed the highest protein concentration (>3.5 mg protein/mL). Using molecular identification tools, it was identified as a strain of Pichia kudriavzevii based on the similarity of the D1/D2 domain of the 26S rDNA region. To optimize protein production by the MSY-2 strain, Response Surface Methodology (RSM) was applied. The tested parameters were the carbon content, nitrogen content, and incubation time. The model explained the response with a coefficient of determination (R²) of 0.7194, which is high enough to support the significance of the model. Under the optimal conditions, protein was produced at up to 3.77 g per L of culture, and the MSY-2 strain contained 66.8 g protein per 100 g of cell dry weight. These results revealed the plausibility of applying this novel yeast strain in single-cell protein production.
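
The RSM step can be sketched as fitting a second-order response surface over the three tested factors. The data below are synthetic placeholders, not the study's measurements.

```python
# Sketch: quadratic response-surface fit for protein yield over
# carbon content, nitrogen content, and incubation time.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(7)
X = rng.uniform([10, 1, 24], [40, 5, 96], size=(30, 3))  # carbon, nitrogen, hours
y = 1.5 + 0.05 * X[:, 0] + 0.4 * X[:, 1] - 0.002 * X[:, 0] ** 2 \
      + 0.01 * X[:, 2] + rng.normal(0, 0.2, 30)           # toy response (g/L)

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)
print("R^2 of quadratic surface:", round(model.score(quad.transform(X), y), 4))
```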

Keywords: single cell protein, response surface methodology, yeast, cassava processing waste

Procedia PDF Downloads 403
13172 Application of Advanced Remote Sensing Data in Mineral Exploration in the Vicinity of Heavy Dense Forest Cover Area of Jharkhand and Odisha State Mining Area

Authors: Hemant Kumar, R. N. K. Sharma, A. P. Krishna

Abstract:

The study was carried out in the Saranda region of Jharkhand and a part of Odisha state. Geospatial data from Hyperion, a spaceborne hyperspectral remote sensing sensor, were used. The study applied a wide variety of image processing techniques to enhance and extract the mining classes of Fe and Mn ores. Landsat-8 OLI sensor data were also used to explore the related minerals. Various processing steps were applied to enhance the mineral classes, and a comparative evaluation was carried out. The Hyperion dataset for hyperspectral remote sensing has been verified as an effective tool for extracting mineral and rock information within the shortwave infrared band range. The abundant spatial and spectral information contained in hyperspectral images enables the differentiation of objects for targeted exploration applications such as mineral detection and mining.
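
One standard enhancement step for iron-bearing minerals is a band-ratio image computed from the hyperspectral cube. The sketch below is hedged: the cube layout and the band indices are illustrative assumptions, not the study's exact bands.

```python
# Sketch: band-ratio enhancement of a hyperspectral cube for Fe targets.
import numpy as np

cube = np.load("hyperion_subset.npy")      # hypothetical (bands, rows, cols) array

def band_ratio(cube, num_band, den_band, eps=1e-6):
    num = cube[num_band].astype(np.float64)
    den = cube[den_band].astype(np.float64)
    return num / (den + eps)               # high values flag the target spectra

# Example: a near-infrared / visible ratio often used to highlight iron oxides.
fe_index = band_ratio(cube, num_band=40, den_band=20)   # illustrative bands
mask = fe_index > np.percentile(fe_index, 95)           # crude candidate mask
print("candidate Fe pixels:", int(mask.sum()))
```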

Keywords: Hyperion, hyperspectral, sensor, Landsat-8

Procedia PDF Downloads 123
13171 Online Learning Versus Face to Face Learning: A Sentiment Analysis on General Education Mathematics in the Modern World of University of San Carlos School of Arts and Sciences Students Using Natural Language Processing

Authors: Derek Brandon G. Yu, Clyde Vincent O. Pilapil, Christine F. Peña

Abstract:

College students in Cebu province have been indoors since March 2020, and one challenge they encountered is the sudden shift from face-to-face to online learning, compounded by the lack of empirical data on online learning at Higher Education Institutions (HEIs) in the Philippines. Sentiments on face-to-face and online learning were collected from University of San Carlos (USC) School of Arts and Sciences (SAS) students regarding Mathematics in the Modern World (MMW), a General Education (GE) course. Natural Language Processing with machine learning algorithms was used to classify the students' sentiments. The results of the research study are the themes identified through topic modelling and the overall sentiments of the students in USC SAS.
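
A minimal sketch of the topic-modelling step, using LDA from scikit-learn on invented placeholder responses:

```python
# Sketch: extract themes from student responses with latent Dirichlet allocation.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

responses = [                                    # invented placeholder responses
    "online classes make it hard to ask questions in MMW",
    "face to face discussion helps me understand the math examples",
    "internet connection problems interrupt our online lectures",
]

vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(responses)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"theme {k}: {', '.join(top)}")
```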

Keywords: natural language processing, online learning, sentiment analysis, topic modelling

Procedia PDF Downloads 246
13170 Disparity of Learning Styles and Cognitive Abilities in Vocational Education

Authors: Mimi Mohaffyza Mohamad, Yee Mei Heong, Nurfirdawati Muhammad Hanafi, Tee Tze Kiong

Abstract:

This study was conducted to investigate the disparity between learning styles and cognitive abilities, specifically in vocational education. The Felder and Silverman Learning Styles Model (FSLSM) was applied to measure the students' learning styles, while the content of the Building Construction subject (knowledge, skills, and problem solving) was taken into account in constructing the elements of cognitive abilities. Four dimensions of learning styles proposed by Felder and Silverman capture student learning preferences: processing (active or reflective), perception (sensing or intuitive), input of information (visual or verbal), and understanding of information (sequential or global). The study found that students tend to be visual learners and that the learner types differ significantly in cognitive abilities. The findings may help teachers facilitate students more effectively and boost students' cognitive abilities.

Keywords: learning styles, cognitive abilities, dimension of learning styles, learning preferences

Procedia PDF Downloads 402
13169 Agents and Causers in the Experiencer-Verb Lexicon

Authors: Margaret Ryan, Linda Cupples, Lyndsey Nickels, Paul Sowman

Abstract:

The current investigation explored the thematic roles of the nouns specified in the lexical entries of experiencer verbs. While prior experimental research assumes experiencer and theme roles for both subject-experiencer (SE) and object-experiencer (OE) verbs, syntactic theorists have posited additional agent and causer roles. Experiment 1 provided evidence for an agent, as participants assigned a high degree of intentionality to the logical subject of a subset of SE and OE actives and passives. Experiment 2 provided evidence for a causer, as participants assigned high levels of causality to the logical subjects of experiencer sentences generally. However, the presence of an agent, but not a causer, coincided with processing ease. Causality may be an aspect rather than a thematic role. The varying thematic roles amongst experiencer-verb sentences have important implications for stimulus selection, because we cannot presume that processing is similar across differing sentence subtypes.

Keywords: sentence comprehension, lexicon, canonicity, processing, thematic roles, syntax

Procedia PDF Downloads 124