Search results for: hierarchical text classification models
5283 Sorghum Grains Grading for Food, Feed, and Fuel Using NIR Spectroscopy
Authors: Irsa Ejaz, Siyang He, Wei Li, Naiyue Hu, Chaochen Tang, Songbo Li, Meng Li, Boubacar Diallo, Guanghui Xie, Kang Yu
Abstract:
Background: Near-infrared spectroscopy (NIR) is a non-destructive, fast, and low-cost method to measure the grain quality of different cereals. Previously reported NIR model calibrations using whole grain spectra had moderate accuracy, and improved predictions are achievable by using spectra collected from flour samples rather than from whole grains. However, the feasibility of determining the critical biochemicals related to classification for food, feed, and fuel products has not been adequately investigated. Objectives: To evaluate the feasibility of using NIRS and the influence of four sample types (whole grains, flours, hulled grain flours, and hull-less grain flours) on the prediction of chemical components, in order to improve grain sorting efficiency for human food, animal feed, and biofuel. Methods: NIR was applied in this study to determine eight biochemicals in four types of sorghum samples: hulled grain flours, hull-less grain flours, whole grains, and grain flours. A total of 20 sorghum grain hybrids were selected from two locations in China. Using the NIR spectra together with wet-chemically measured biochemical data, partial least squares regression (PLSR) was used to construct the prediction models. Results: The results showed that sorghum grain morphology and sample format affected the prediction of biochemicals. Using NIR data of grain flours generally improved the prediction compared with the use of NIR data of whole grains. Nevertheless, the spectra of whole grains still enabled comparable predictions and are recommended when a non-destructive and rapid analysis is required. Compared with the hulled grain flours, hull-less grain flours allowed for improved predictions of tannin, cellulose, and hemicellulose using NIR data.
Conclusion: The established PLSR models could enable food, feed, and fuel producers to efficiently evaluate a large number of samples by predicting the required biochemical components in sorghum grains without destruction. Keywords: FT-NIR, sorghum grains, biochemical composition, food, feed, fuel, PLSR
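The PLSR calibration step named in the abstract can be illustrated with a minimal sketch. The data below are synthetic stand-ins for NIR spectra and a wet-chemistry reference value, and the NIPALS-style PLS1 routine is a generic illustration, not the authors' actual calibration pipeline.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit a single-response PLS regression (PLS1) via the NIPALS algorithm.
    Returns the coefficient vector b and the means used for centering."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # scores
        tt = t @ t
        p = Xc.T @ t / tt                 # X loadings
        qk = yc @ t / tt                  # y loading
        Xc = Xc - np.outer(t, p)          # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)   # coefficients in original X space
    return b, x_mean, y_mean

def pls1_predict(X, b, x_mean, y_mean):
    return (X - x_mean) @ b + y_mean

# Synthetic "spectra": 40 samples, 50 wavelengths, y driven by 2 latent factors
rng = np.random.default_rng(0)
T = rng.normal(size=(40, 2))
loadings = rng.normal(size=(2, 50))
X = T @ loadings + 0.01 * rng.normal(size=(40, 50))
y = T @ np.array([1.5, -2.0]) + 0.01 * rng.normal(size=40)

b, xm, ym = pls1_fit(X, y, n_components=2)
print(np.max(np.abs(pls1_predict(X, b, xm, ym) - y)))
```

With two latent factors in the data and two PLS components, the fitted model recovers the response up to the noise level.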
Procedia PDF Downloads 69
5282 The Use of Information and Communication Technologies in Electoral Procedures: Comments on Electronic Voting Security
Authors: Magdalena Musiał-Karg
Abstract:
The expansion of telecommunications and the progress of electronic media are important elements of our times. The recent worldwide convergence of information and communication technologies (ICT) and the dynamic development of the mass media are leading to noticeable changes in the functioning of contemporary states and societies. Modern technologies currently play ever more important roles and filter down to almost every field of contemporary human life, resulting in a growth of online interactions that can be observed in the remarkable increase in the number of people with home PCs and Internet access. Proof of this is undoubtedly the emergence and use of concepts such as e-society, e-banking, e-services, e-government, e-participation and e-democracy. The newly coined word e-democracy shows that modern technologies have also been widely used in politics. Without any doubt, in most countries all actors in the political market (politicians, political parties, servants in the political/public sector, the media) use modern forms of communication with society. Most of these technologies streamline the processes of obtaining and sending information to citizens and communication with the electorate, and also, which seems to be the biggest advantage, electoral procedures themselves. Thanks to the implementation of ICT, interaction between politicians and the electorate is improved. The main goal of this text is to analyze electronic voting (e-voting), one of the important forms of electronic democracy, in terms of its security aspects. The author aims to answer questions about the security of electronic voting as an additional form of participation in elections and referenda. Keywords: electronic democracy, electronic voting, security of e-voting, information and communication technology (ICT)
Procedia PDF Downloads 241
5281 An Integrated Intuitionistic Fuzzy ELECTRE Model for Multi-Criteria Decision-Making
Authors: Babek Erdebilli
Abstract:
The aim of this study is to develop and describe a new methodology for Multi-Criteria Decision-Making (MCDM) problems using the intuitionistic fuzzy ELECTRE (ELimination Et Choix Traduisant la REalite) model. The proposed model enables Decision-Makers (DMs) to carry out assessments using Intuitionistic Fuzzy Numbers (IFNs). A numerical example is provided to demonstrate and clarify the proposed analysis procedure, and an empirical experiment is conducted to validate its effectiveness. Keywords: multi-criteria decision-making, IFE, DMs, fuzzy ELECTRE model
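The intuitionistic fuzzy extension is not specified in the abstract; as a minimal illustration of the ELECTRE family it builds on, the sketch below computes classical (crisp) concordance and discordance matrices for a hypothetical decision matrix and weight vector.

```python
import numpy as np

# Hypothetical decision matrix: 3 alternatives x 4 criteria (higher is better)
A = np.array([[7., 5., 8., 4.],
              [6., 8., 5., 7.],
              [8., 6., 6., 6.]])
w = np.array([0.3, 0.2, 0.3, 0.2])    # criterion weights, sum to 1

n = A.shape[0]
C = np.zeros((n, n))                   # concordance matrix
D = np.zeros((n, n))                   # discordance matrix
span = A.max(axis=0) - A.min(axis=0)   # per-criterion range for normalisation
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        # concordance: total weight of criteria where i is at least as good as j
        C[i, j] = w[A[i] >= A[j]].sum()
        # discordance: worst normalised amount by which i falls short of j
        D[i, j] = max(0.0, ((A[j] - A[i]) / span).max())
print(C)
```

Thresholding C and D then yields the outranking relation; the intuitionistic fuzzy variant replaces the crisp scores with IFN-valued judgments.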
Procedia PDF Downloads 651
5280 Studies on Non-Isothermal Crystallization Kinetics of PP/SEBS-g-MA Blends
Authors: Rishi Sharma, S. N. Maiti
Abstract:
The non-isothermal crystallization kinetics of PP/SEBS-g-MA blends with 0-50% copolymer concentration was studied by differential scanning calorimetry at four different cooling rates. Crystallization parameters were analyzed using the Avrami and Jeziorny models. Primary and secondary crystallization processes were described by the Avrami equation. The Avrami model showed that all types of shapes grow from small dimensions during primary crystallization, whereas three-dimensional crystal growth was observed during the secondary crystallization process. The crystallization peak and onset temperatures decrease, however, with increasing cooling rate. Keywords: crystallization kinetics, non-isothermal, polypropylene, SEBS-g-MA
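The Avrami analysis mentioned above can be sketched numerically. The relative crystallinity data below are synthetic, with illustrative values for the rate constant k and exponent n; the fit uses the standard double-logarithmic linearisation of the Avrami equation, ln(-ln(1-X)) = ln k + n ln t.

```python
import numpy as np

# Synthetic relative crystallinity following the Avrami equation
# X(t) = 1 - exp(-k * t**n); k and n are illustrative values only.
k_true, n_true = 0.05, 3.0
t = np.linspace(0.5, 10, 40)
X = 1 - np.exp(-k_true * t ** n_true)

# Double-log linearisation: slope gives n, intercept gives ln k
mask = (X > 1e-6) & (X < 1 - 1e-6)     # keep points where both logs are defined
lhs = np.log(-np.log(1 - X[mask]))
n_fit, lnk_fit = np.polyfit(np.log(t[mask]), lhs, 1)
print(n_fit, np.exp(lnk_fit))
```

On real DSC data the same fit is applied to the primary-crystallization region only, since secondary crystallization makes the double-log plot deviate from a straight line.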
Procedia PDF Downloads 622
5279 Micro-Droplet Formation in a Microchannel under the Effect of an Electric Field: Experiment
Authors: Sercan Altundemir, Pinar Eribol, A. Kerem Uguz
Abstract:
Microfluidic systems allow many large-scale laboratory applications to be miniaturized on a single device in order to reduce cost and improve fluid control. Moreover, such systems make it possible to generate and control droplets, which play a significant role in improved analysis for many chemical and biological applications; for example, they can be employed as models for cells in microfluidic systems. In this work, the interfacial instability of two immiscible Newtonian liquids flowing in a microchannel is investigated. When two immiscible liquids are in the laminar regime, a flat interface is formed between them. If a direct current electric field is applied, the interface may deform, i.e. become unstable, rupture and form micro-droplets. First, the effect of the thickness ratio, total flow rate, and viscosity ratio of the silicone oil and ethylene glycol liquid couple on the critical voltage at which the interface starts to destabilize is investigated. Then the droplet sizes are measured under the effect of these parameters at various voltages. Moreover, the effect of the total flow rate on the time elapsed for the interface to rupture into droplets by hitting the wall of the channel is analyzed. It is observed that an increase in the viscosity or thickness ratio of the silicone oil to the ethylene glycol has a stabilizing effect, i.e. a higher voltage is needed, while the total flow rate has no effect on it. However, an increase in the total flow rate shortens the time elapsed for the interface to hit the wall. Moreover, the droplet size decreases down to 0.1 μL with an increase in the applied voltage, the viscosity ratio or the total flow rate, or a decrease in the thickness ratio.
In addition to these observations, two empirical models are established: one for the critical electric number, i.e., the dimensionless voltage, and one for the droplet size; a third model, combining both, determines the droplet size at the critical voltage. Keywords: droplet formation, electrohydrodynamics, microfluidics, two-phase flow
Procedia PDF Downloads 176
5278 Machine Learning in Agriculture: A Brief Review
Authors: Aishi Kundu, Elhan Raza
Abstract:
"Necessity is the mother of invention": the rapid increase in the global human population has directed the agricultural domain toward machine learning. The basic need of human beings, food, is satisfied through farming, and farming is one of the major revenue generators for the Indian economy. Agriculture is thus not only a source of employment but also a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing machine learning in the agricultural sector. Accurate predictions are necessary at the right time to boost production and to aid the timely and systematic distribution of agricultural commodities, making their availability in the market faster and more effective. This paper includes a thorough analysis of various machine learning algorithms applied to different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.). Crop production is affected by climate change; machine learning can analyse the changing patterns and come up with a suitable approach to minimize loss and maximize yield. Machine learning algorithms and models (regression, support vector machines, Bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze and predict specific outcomes, which can be vital in increasing the productivity of the agricultural food industry. The review also illustrates how machine learning is applied to sensor data in agricultural work. Machine learning is an ongoing technology benefitting farmers, improving gains in agriculture and minimizing losses. This paper discusses how irrigation and farming management systems evolve efficiently in real time.
Artificial intelligence (AI) enabled programs have emerged that support farmers through in-depth examination of data. Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting
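Of the model families the review lists, the simplest to sketch is regression. The example below fits a hypothetical yield model to synthetic rainfall and temperature data by ordinary least squares; the variables, units and coefficients are illustrative inventions, not results from the review.

```python
import numpy as np

# Hypothetical weather features -> crop yield, fitted by ordinary least squares
rng = np.random.default_rng(42)
rain = rng.uniform(300, 900, 200)      # seasonal rainfall, mm (synthetic)
temp = rng.uniform(18, 32, 200)        # mean temperature, deg C (synthetic)
# Ground-truth relation used to generate the synthetic yields (t/ha)
yield_t = 2.0 + 0.004 * rain - 0.05 * temp + rng.normal(0, 0.1, 200)

# Design matrix with an intercept column; solve the normal equations
features = np.column_stack([np.ones(200), rain, temp])
coef, *_ = np.linalg.lstsq(features, yield_t, rcond=None)
print(coef)   # approximately [2.0, 0.004, -0.05]
```

In practice the same pipeline is extended with more predictors (soil moisture, NDVI, fertilizer) or swapped for the nonlinear models the review covers.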
Procedia PDF Downloads 105
5277 The Reception of Disclosure of Sexual Teens in Media
Authors: Rizky Kertanegara
Abstract:
Reception studies is one of the strands of cultural studies that has lately evolved in the realm of communication science. This qualitative approach was pioneered by Stuart Hall, who proposed the dominant, negotiated, and oppositional readings that audiences make of media texts. In its later development, reception study was extended by Kim Christian Schroder into a multidimensional model. In this update, Schroder recognized that there had been a bias between the readings made by informants and the readings conducted by researchers about those informants. He therefore classified reception into two dimensions, namely the reading dimension, performed by informants, and the implication dimension, performed by the researcher. Using Schroder's approach, this study seeks to describe the reception by adolescent girls, as research subjects, of the elements of sexual openness contained in Cinta Laura's music video, the object of research. The researcher wanted to see how they interpret the values of Western culture based on their own values as teenagers. A descriptive qualitative method was used, with in-depth interviews conducted with informants from religious schools. Informants were selected through purposeful sampling; in collaboration with the schools, the researcher was able to select informants who could provide rich data on the topic. The analysis showed that the informants were permissive in their reading of sexual openness in the music video. In addition, informants from Catholic schools were more open than informants from Islamic schools in accepting the values of sexual openness. This permissiveness is regarded as a form of self-actualization and gender equality. Keywords: cultural studies, multidimensional reception model, sexual openness, youth audience
Procedia PDF Downloads 415
5276 Intergenerational Trauma: Patterns of Child Abuse and Neglect Across Two Generations in a Barbados Cohort
Authors: Rebecca S. Hock, Cyralene P. Bryce, Kevin Williams, Arielle G. Rabinowitz, Janina R. Galler
Abstract:
Background: Findings have been mixed regarding whether offspring of parents who were abused or neglected as children have a greater risk of experiencing abuse or neglect themselves. In addition, many studies on this topic are restricted to physical abuse and take place in a limited number of countries, representing a small segment of the world's population. Methods: We examined relationships between childhood maltreatment history assessed in a subset (N=68) of the original longitudinal birth cohort (G1) of the Barbados Nutrition Study and their now-adult offspring (G2) (N=111) using the Childhood Trauma Questionnaire-Short Form (CTQ-SF). We used Pearson correlations to assess relationships between parent and offspring CTQ-SF total and subscale scores (physical, emotional, and sexual abuse; physical and emotional neglect). Next, we ran multiple regression analyses, using the parental CTQ-SF total score and the parental Sexual Abuse score as primary predictors separately in our models of G2 CTQ-SF (total and subscale scores). Results: G1 total CTQ-SF scores were correlated with G2 offspring Emotional Neglect and total scores. G1 Sexual Abuse history was significantly correlated with G2 Emotional Abuse, Sexual Abuse, Emotional Neglect, and Total Score. In fully-adjusted regression models, parental (G1) total CTQ-SF scores remained significantly associated with G2 offspring reports of Emotional Neglect, and parental (G1) Sexual Abuse was associated with offspring (G2) reports of Emotional Abuse, Physical Abuse, Emotional Neglect, and overall CTQ-SF scores. Conclusions: Our findings support a link between parental exposure to childhood maltreatment and their offspring's self-reported exposure to childhood maltreatment. Of note, there was not an exact correspondence between the subcategory of maltreatment experienced from one generation to the next. Compared with other subcategories, G1 Sexual Abuse history was the most likely to predict G2 offspring maltreatment. 
Further studies are needed to delineate underlying mechanisms and to develop intervention strategies aimed at preventing intergenerational transmission. Keywords: trauma, family, adolescents, intergenerational trauma, child abuse, child neglect, global mental health, North America
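The study's first analysis step, Pearson correlation between parent (G1) and offspring (G2) CTQ-SF scores, can be sketched as follows. The scores below are synthetic, generated with an arbitrary built-in association; they are not the study's data, and the sample sizes are only borrowed from the abstract for flavour.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two 1-D arrays."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Synthetic parent (G1) and offspring (G2) questionnaire totals with an
# illustrative intergenerational association built in
rng = np.random.default_rng(1)
g1 = rng.normal(50, 10, 68)
g2 = 0.4 * g1 + rng.normal(30, 8, 68)
r = pearson_r(g1, g2)
print(round(r, 3))
```

The fully adjusted models in the abstract would then enter the G1 score as a predictor in a multiple regression alongside covariates.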
Procedia PDF Downloads 84
5275 The Applicability of General Catholic Canon Law during the Ongoing Migration Crisis in Hungary
Authors: Lorand Ujhazi
Abstract:
The vast majority of existing canonical studies about migration are focused on examining the general pastoral and legal regulations of the Catholic Church. The weakness of this approach is that it ignores a number of important factors; like the financial, legal and personal circumstances of a particular church or the canonical position of certain organizations which actually look after the immigrants. This paper is a case study, which analyses the current and historical migration related policies and activities of the Catholic Church in Hungary. To achieve this goal the study uses canon law, historical publications, various instructions and communications issued by church superiors, Hungarian and foreign media reports and the relevant Hungarian legislation. The paper first examines how the Hungarian Catholic Church assisted migrants like Armenians fleeing from the Ottoman Empire, Poles escaping during the Second World War, East German and Romanian citizens in the 1980s and refugees from the former Yugoslavia in the 1990s. These events underline the importance of past historical experience in the development of contemporary pastoral and humanitarian policy of the Catholic Church in Hungary. Then the paper turns to the events of the ongoing crisis by describing the unique challenges faced by churches in transit countries like Hungary. Then the research contrasts these findings with the typical responsibilities of churches in countries which are popular destinations for immigrants. The next part of the case study focuses on the changes to the pre-crisis legal and canonical framework which influenced the actions of hierarchical and charity organizations in Hungary. Afterwards, the paper illustrates the dangers of operating in an unclear legal environment, where some charitable activities of the church like a fundraising campaign may be interpreted as a national security risk by state authorities. 
The paper then presents the reactions of Hungarian academics to the current migration crisis and finally offers some proposals on how to improve the parts of canon law which govern immigration. The conclusion of the paper is that, when formulating the central refugee policy of the Catholic Church, decision makers must take into consideration the peculiar circumstances of its particular churches. This approach may prevent disharmony between the existing central regulations, the policy of the Vatican and the operations of local church organizations. Keywords: canon law, Catholic Church, civil law, Hungary, immigration, national security
Procedia PDF Downloads 308
5274 Diagnosis of Alzheimer Diseases in Early Step Using Support Vector Machine (SVM)
Authors: Amira Ben Rabeh, Faouzi Benzarti, Hamid Amiri, Mouna Bouaziz
Abstract:
Alzheimer's is a disease that affects the brain, causing degeneration of nerve cells (neurons), in particular cells involved in memory and intellectual functions. Early diagnosis of Alzheimer's disease (AD) raises ethical questions, since there is at present no cure to offer patients, and medicines from therapeutic trials appear to slow the progression of the disease only moderately, with side effects that are sometimes severe. In this context, the analysis of medical images has become an essential tool for clinical applications, because it provides effective assistance both at diagnosis and during therapeutic follow-up. Computer-Assisted Diagnostic (CAD) systems are one possible solution for efficiently managing these images. In our work, we propose an application to detect Alzheimer's disease. To detect the disease at an early stage we used three sections: frontal, to extract the hippocampus (H); sagittal, to analyze the corpus callosum (CC); and axial, to work with the variation features of the cortex (C). Our classification method is based on the Support Vector Machine (SVM). The proposed system yields 90.66% accuracy in the early diagnosis of AD. Keywords: Alzheimer's disease (AD), Computer-Assisted Diagnostic (CAD), hippocampus, corpus callosum (CC), cortex, Support Vector Machine (SVM)
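The SVM classification stage can be sketched in a few lines. Since the abstract does not specify the implementation, the example below trains a minimal linear SVM by full-batch sub-gradient descent on the hinge loss, using synthetic well-separated clusters as stand-ins for the hippocampus, corpus callosum and cortex features.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Minimal linear SVM: sub-gradient descent on regularised hinge loss.
    Labels y must be in {-1, +1}. Returns weight vector w and bias b."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                                   # hinge violators
        gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        gb = -y[viol].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Synthetic 3-D "brain feature" clusters (illustrative only, not MRI data)
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 0.5, (50, 3)), rng.normal(2, 0.5, (50, 3))])
y = np.array([-1] * 50 + [1] * 50)
w, b = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
print(acc)
```

A kernelised SVM, as typically used for imaging features, follows the same decision rule but replaces the dot products with a kernel function.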
Procedia PDF Downloads 385
5273 Analysing Time Series for a Forecasting Model to the Dynamics of Aedes Aegypti Population Size
Authors: Flavia Cordeiro, Fabio Silva, Alvaro Eiras, Jose Luiz Acebal
Abstract:
Aedes aegypti is present in the tropical and subtropical regions of the world and is a vector of several diseases such as dengue fever, yellow fever, chikungunya and zika. The growth in the number of arbovirus cases in the last decades has become a matter of great concern worldwide. Meteorological factors like mean temperature and precipitation are known to influence infestation by the species through effects on physiology and ecology, altering the fecundity, mortality, lifespan, dispersion behaviour and abundance of the vector. Models able to describe the dynamics of the vector population size should therefore take the meteorological variables into account. The relationship between meteorological factors and the population dynamics of Ae. aegypti adult females is studied to provide a good set of predictors for modelling the dynamics of the mosquito population size. Time-series data on captures of adult females from a public health surveillance program in the city of Lavras, MG, Brazil had their association with precipitation, humidity and temperature analysed through a set of statistical methods for time series analysis commonly adopted in signal processing, information theory and neuroscience. Cross-correlation, a multicollinearity test and whitened cross-correlation were applied to determine at which time lags the meteorological variables influence the dynamics of mosquito abundance. Among the findings, the studied case indicated strong collinearity between humidity and precipitation, and precipitation was selected to form a pair of descriptors together with temperature. In the techniques used, significant associations were observed between infestation indicators and both temperature and precipitation in the short, mid and long terms, evincing that those variables should be considered in entomological models and as public health indicators.
A descriptive model used to test the results exhibits a strong correlation to the data. Keywords: Aedes aegypti, cross-correlation, multicollinearity, meteorological variables
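The lagged cross-correlation analysis at the heart of the study can be sketched as follows. The "rainfall" and "capture" series below are synthetic, with a known 4-week delay built in; the function simply scans candidate lags for the strongest Pearson correlation, which is the basic idea behind identifying the influential time lags.

```python
import numpy as np

def best_lag(x, y, max_lag):
    """Return the lag k (in samples) maximising corr(x[t-k], y[t])."""
    corrs = []
    for k in range(max_lag + 1):
        a = x[:len(x) - k] if k else x
        b = y[k:]
        a, b = a - a.mean(), b - b.mean()
        corrs.append((a @ b) / np.sqrt((a @ a) * (b @ b)))
    return int(np.argmax(corrs))

# Synthetic weekly series: "captures" respond to "rainfall" 4 weeks later
rng = np.random.default_rng(7)
rain = rng.gamma(2.0, 10.0, 300)
captures = np.empty(300)
captures[:4] = rain[:4]                       # arbitrary warm-up values
captures[4:] = 0.8 * rain[:-4] + rng.normal(0, 2, 296)
print(best_lag(rain, captures, max_lag=12))   # recovers the 4-week lag
```

Whitened cross-correlation, as used in the paper, additionally pre-filters both series so that autocorrelation within each series does not inflate the lagged correlations.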
Procedia PDF Downloads 180
5272 International Humanitarian Law and the Challenges of New Technologies of Warfare
Authors: Uche A. Nnawulezi
Abstract:
Undoubtedly, despite all efforts made to achieve overall peace through the application of the principles of international humanitarian law, crimes against mankind of unprecedented concern to the whole world have remained unabated. The fallback on war as a technique for settling disputes between nations, individuals, countries and ethnic groups, with its accompanying toll of deaths and destruction of property, has remained a conspicuous component of human history. Indeed, to control the conduct of warfare and the dehumanization of individuals, a body of law aimed at regulating the impacts of conflicts and hostilities in the theater of war has become necessary. Thus, this study is undertaken to examine the conditions in which international humanitarian law applies and to determine the extent of the challenges posed by new technologies of warfare. Throughout this examination, we adopted a doctrinal approach, drawing on textbooks, journals, international materials and the opinions of legal specialists in the field of international humanitarian law. This paper examines the distinctive factors responsible for non-compliance with the rules of international humanitarian law and, furthermore, proffers possible courses of action to address the challenges of new technologies of warfare all over the world. Essentially, the basic proposals made in this paper, if fully implemented, may go far in ensuring an adequate standard in the application of the rules of international humanitarian law as they relate to the increasingly frequent phenomenon of contemporary developments in technologies of warfare, which have in the recent past made the ideal application of those rules more difficult.
This paper deduces that for sustainable global peace to be achieved, the rules of international humanitarian law as they relate to the utilization of new technologies of warfare should be strictly adhered to, and breaches should be made a strict liability offense. Likewise, this paper further recommends the introduction of domestic criminal law punishment for serious contraventions of the rules of international humanitarian law. Keywords: international, humanitarian law, new technologies, warfare
Procedia PDF Downloads 304
5271 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks
Authors: Radhika Ranjan Roy
Abstract:
Deployment of machine learning (ML)/deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance parameters (e.g., recall, precision, and f₁-score). When datasets become imbalanced, which is the usual case for communications networks, performance tends to become worse. Reducing the dimensionality of the feature sets to increase performance is also a difficult problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we have investigated the Mahalanobis binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We have also found that high-dimensional information in intermediate features, which is not much utilized for classification tasks in ML/DL algorithms, is the main contributor to the improved performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, the Mahalanobis distance (MD) classifier offers uniform results for precision, recall, and f₁-score on the unbalanced and sparse NSL-KDD dataset. Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve
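A minimal Mahalanobis binary classifier along these lines can be sketched as follows. The imbalanced "benign"/"attack" records are synthetic Gaussian data, not NSL-KDD; the classifier assigns each sample to the class with the smaller Mahalanobis distance under a pooled covariance estimate.

```python
import numpy as np

def fit_mahalanobis(X, y):
    """Estimate per-class means and the pooled inverse covariance."""
    means = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
    pooled = sum((X[y == c] - means[c]).T @ (X[y == c] - means[c])
                 for c in means) / (len(X) - len(means))
    return means, np.linalg.inv(pooled)

def predict_mahalanobis(X, means, inv_cov):
    """Assign each row to the class with the smallest Mahalanobis distance."""
    classes = list(means)
    d = np.array([[(x - means[c]) @ inv_cov @ (x - means[c]) for c in classes]
                  for x in X])
    return np.array(classes)[d.argmin(axis=1)]

# Imbalanced synthetic "traffic" records: 180 benign vs 20 attack samples
rng = np.random.default_rng(5)
benign = rng.normal(0.0, 1.0, (180, 4))
attack = rng.normal(1.5, 1.0, (20, 4))
X = np.vstack([benign, attack])
y = np.array([0] * 180 + [1] * 20)
means, inv_cov = fit_mahalanobis(X, y)
pred = predict_mahalanobis(X, means, inv_cov)
print((pred == y).mean())
```

Because the distance is normalised by the covariance, the minority class is not swamped the way it can be with raw Euclidean distance, which is one intuition behind the method's robustness to imbalance.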
Procedia PDF Downloads 78
5270 Evaluation of Features Extraction Algorithms for a Real-Time Isolated Word Recognition System
Authors: Tomyslav Sledevič, Artūras Serackis, Gintautas Tamulevičius, Dalius Navakauskas
Abstract:
This paper presents a comparative evaluation of feature extraction algorithms for a real-time isolated word recognition system based on an FPGA. The Mel-frequency cepstral, linear frequency cepstral, linear predictive and linear predictive cepstral coefficients were implemented in a hardware/software design. The proposed system was investigated in speaker-dependent mode for 100 different Lithuanian words. The robustness of the feature extraction algorithms was tested by recognizing speech records at different signal-to-noise ratios. Experiments on clean records show the highest accuracy for Mel-frequency cepstral and linear frequency cepstral coefficients. For records with a 15 dB signal-to-noise ratio, the linear predictive cepstral coefficients give the best result. The hardware and software parts of the system are clocked at 50 MHz and 100 MHz, respectively. For classification, a pipelined dynamic time warping core was implemented. The proposed word recognition system satisfies the real-time requirements and is suitable for applications in embedded systems. Keywords: isolated word recognition, features extraction, MFCC, LFCC, LPCC, LPC, FPGA, DTW
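The dynamic time warping classifier used for matching feature sequences is easy to sketch in software. The example below implements the classic DTW recurrence on toy two-dimensional feature sequences standing in for cepstral frames; a time-stretched copy of a sequence scores closer than an unrelated one, which is exactly why DTW suits words spoken at different speeds.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two feature sequences
    (rows are frames), using Euclidean frame-to-frame cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy "feature" sequences: b is a time-stretched copy of a, c is unrelated
t = np.linspace(0, 1, 20)
a = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
t2 = np.linspace(0, 1, 30)                       # same pattern, spoken slower
b = np.column_stack([np.sin(2 * np.pi * t2), np.cos(2 * np.pi * t2)])
c = np.column_stack([np.sin(6 * np.pi * t), t])  # different pattern
print(dtw_distance(a, b) < dtw_distance(a, c))
```

The paper's FPGA core pipelines this same recurrence so that one matrix anti-diagonal is computed per clock region, which is what makes real-time matching against 100 templates feasible.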
Procedia PDF Downloads 496
5269 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts
Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig
Abstract:
This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In the last decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing certain quantities of the avalanche flow (e.g. pressure, velocities, flow heights, runout lengths) to be predicted. Because of the highly variable regimes of the flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies are drawn to the fluid-dynamical laws of other materials. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there exist high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations in an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility. Hence, model development is compelled to introduce further simplifications and the related uncertainties. In the light of these issues, many questions arise about avalanche simulations, their assets and drawbacks, potentials for improvement and their application in practice.
To address these questions, a survey was conducted among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries. In the questionnaire, special attention is paid to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty and the reliability of the results. Furthermore, it was tested to which degree a simulation result influences decision making for a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the comparatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations. The credibility of the simulations is the result of a rather thorough simulation study in which different assumptions are tested, the results of different flow models are compared, and supplemental data such as chronicles, field observations and silent witnesses, among others, are used; these are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations. Keywords: expert interview, hazard management, modeling, simulation, snow avalanche
Procedia PDF Downloads 326
5268 Blind Channel Estimation for Frequency Hopping System Using Subspace Based Method
Authors: M. M. Qasaymeh, M. A. Khodeir
Abstract:
Subspace channel estimation methods have been widely studied. They depend on a subspace decomposition of the covariance matrix to separate the signal subspace from the noise subspace. The decomposition is normally done by either an Eigenvalue Decomposition (EVD) or a Singular Value Decomposition (SVD) of the Auto-Correlation Matrix (ACM). However, the subspace decomposition process is computationally expensive. In this paper, the multipath channel estimation problem for a Slow Frequency Hopping (SFH) system using a noise-subspace-based method is considered. An efficient method to estimate the multipath time delays is proposed, applying the MUltiple SIgnal Classification (MUSIC) algorithm to the null space extracted by the Rank-Revealing LU (RRLU) factorization. The RRLU provides accurate information about the rank and the numerical null space, which makes it a valuable tool in numerical linear algebra. The proposed novel method decreases the computational complexity approximately by half compared with RRQR-based methods while keeping the same performance. Computer simulations are included to demonstrate the effectiveness of the proposed scheme. Keywords: frequency hopping, channel model, time delay estimation, RRLU, RRQR, MUSIC, LS-ESPRIT
Procedia PDF Downloads 410
5267 The Construction of the Bridge between Mrs Dalloway and to the Lighthouse: The Combination of Codes and Metaphors in the Structuring of the Plot in the Work of Virginia Woolf
Authors: María Rosa Mucci
Abstract:
Tzvetan Todorov (1971) designs a model of narrative transformation in which the plot is constituted by difference and resemblance. This binary opposition is a synthesis of a central figure within narrative discourse: metaphor. Narrative operates as a metaphor, since it combines different actions through similarities within a common plot. However, it sounds paradoxical that metonymy, and not metaphor, should be the key figure within narrative. It is metonymy that keeps the actions of the story moving through syntagmatic relations. By the same token, this articulation of verbs makes it possible for the reader to engage in a dynamic interaction with the text, responding to the plot and mediating meanings with the contradictory external world. As Roland Barthes (1957) points out, there are two codes that are irreversible within this process: the code of actions and the code of enigmas. Virginia Woolf constructs her plots through a process of symbolism; a scene is always enduring, not only because it stands for something else but also because it connotes it. The reader is forced to elaborate the meaning at a mythological level, beyond the lines. In this research, we follow a qualitative content analysis to code language through the proairetic (actions) and hermeneutic (enigmas) codes, in Barthes's terms. Two novels in particular engage the reader in this process of construction: Mrs Dalloway (1925) and To the Lighthouse (1927). The bridge from the first to the second brings memories of childhood, allowing for the discovery of the enigmas hidden between the lines. What survives? Who survives? It is the reader's task to unravel these codes and rethink this dialogue between plot and reader, contributing to the predominance of texts and the textuality of narratives.
Keywords: metonymy, code, metaphor, myth, textuality
5266 A Different Approach to Smart Phone-Based Wheat Disease Detection System Using Deep Learning for Ethiopia
Authors: Nathenal Thomas Lambamo
Abstract:
More than 85% of the labor force and 90% of export earnings in Ethiopia are accounted for by agriculture, so it can be said to be the backbone of the country's overall socio-economic activity. Among the cereal crops the agricultural sector provides, wheat ranks third after teff and maize. Today, wheat is in high demand owing to the expansion of industries that use it as the main ingredient of their products. The local wheat supply covers only 35 to 40% of these companies' needs, and the remaining 60 to 65% is imported, which exhausts the country's foreign currency reserves. These facts show that demand for this crop is very high while, conversely, its productivity is very low. Wheat disease is the most devastating factor contributing to this imbalance between demand and supply: it reduces both the yield and the quality of the crop by 27% on average, and by up to 37% in severe cases. This study aims to detect the most frequent and damaging wheat diseases, Septoria and leaf rust, using deep learning, the most widely used subset of machine learning. A Convolutional Neural Network (CNN) classification technique was used to detect the diseases, achieving an accuracy of 99.01%.
Keywords: septoria, leaf rust, deep learning, CNN
5265 Mathematics Bridging Theory and Applications for a Data-Driven World
Authors: Zahid Ullah, Atlas Khan
Abstract:
In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making processes. This abstract delves into the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications. It explores the utilization of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. Furthermore, it investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment methodologies for improved decision-making in diverse fields such as finance, healthcare, engineering, and social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape. By fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, this abstract underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis. Rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. 
By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.
Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models
5264 Epileptic Seizure Onset Detection via Energy and Neural Synchronization Decision Fusion
Authors: Marwa Qaraqe, Muhammad Ismail, Erchin Serpedin
Abstract:
This paper presents a novel architecture for a patient-specific epileptic seizure onset detector using scalp electroencephalography (EEG). The proposed architecture is based on the fusion of decisions calculated from energy-related and neural-synchronization-related features. Specifically, one level of the detector calculates the condition number (CN) of an EEG matrix to evaluate the amount of neural synchronization present within the EEG channels. At a parallel level, the detector evaluates the energy contained in four EEG frequency subbands. The information is then fed into two independent (parallel) classification units based on support vector machines to determine the onset of a seizure event. The decisions from the two classifiers are then combined according to one of two fusion techniques to reach a global decision. Experimental results demonstrate that the detector based on the AND fusion technique outperforms existing detectors, with a sensitivity of 100% and a detection latency of 3 seconds, while achieving a false alarm rate of 2.76 per hour. The OR fusion technique achieves a sensitivity of 100% and significantly improves the detection latency (0.17 seconds), yet it yields 12 false alarms per hour.
Keywords: epilepsy, EEG, seizure onset, electroencephalography, neuron, detection
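A toy sketch of the two-level detector may help fix ideas: one branch thresholds the condition number of the EEG channel-by-sample matrix, the other thresholds subband energies, and the two binary decisions are fused by AND or OR. The window size, thresholds, and subband split below are placeholders, and simple comparisons stand in for the paper's trained SVMs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented EEG window: 18 channels x 256 samples of white noise
window = rng.standard_normal((18, 256))

# Branch 1: condition number of the channel-by-sample matrix; stronger
# synchronization across channels pushes the matrix toward rank
# deficiency and drives the condition number up
cn = np.linalg.cond(window)

# Branch 2: energy in four frequency subbands via a crude FFT split
# (the paper's exact subband edges are not stated in the abstract)
spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
bands = np.array_split(spectrum, 4, axis=1)
energies = [float(b.sum()) for b in bands]

# Simple threshold detectors stand in for the paper's two SVM classifiers
cn_decision = bool(cn > 50.0)                          # placeholder threshold
energy_decision = bool(energies[0] > 0.5 * sum(energies))

# Decision fusion: AND favors a low false-alarm rate,
# OR favors a short detection latency
and_fusion = cn_decision and energy_decision
or_fusion = cn_decision or energy_decision
print("AND:", and_fusion, "OR:", or_fusion)
```

The trade-off the results report falls out of the logic directly: AND fires only when both branches agree (fewer false alarms, later onset calls), while OR fires on the first branch to trip (earlier calls, more false alarms).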
5263 A Study to Explore the Views of Students regarding E-Learning as an Instructional Tool at University Level
Authors: Zafar Iqbal
Abstract:
This study involved 6th-semester students enrolled in a Bachelor of Computer Science program at the university level. In this era of science and technology, e-learning can help give grassroots populations in less developed areas access to education. It is a potential substitute for face-to-face teaching and is used in different countries. The purpose of the study was to explore students' views about e-learning (Facebook) as an instructional tool. Using a purposive sampling technique, an intact class of 30 students, both male and female, in which e-learning was used as an instructional tool was selected. The students' views were explored through a qualitative approach using focus group interviews, which helped develop a comprehensive understanding of their views of e-learning. In addition, probing questions were asked and the responses recorded. The data were transcribed, nodes were generated, and the text was coded against these nodes. For this purpose, and for further analysis, the NVivo 10 software was used. Themes were generated and tangibly presented through cluster analysis. The findings were interesting and provide sufficient evidence that Facebook is a suitable e-learning resource for students in higher education. Students acknowledged it as a good source of learning, and it was aligned with their academic and social behavior. It is not time-specific and is therefore feasible for students who work during the day and can access the material online in their free time. Some distracters (time wasters) were reported by the students, but these can be minimized with little effort. In short, e-learning is a need of the day and a potential learning source for every individual with internet access, in any part of the globe.
Keywords: e-learning, Facebook, instructional tool, higher education
5262 Pre-Operative Psychological Factors Significantly Add to the Predictability of Chronic Narcotic Use: A Two-Year Prospective Study
Authors: Dana El-Mughayyar, Neil Manson, Erin Bigney, Eden Richardson, Dean Tripp, Edward Abraham
Abstract:
The use of narcotics to treat pain has increased over the past two decades and is a contributing factor to the current public health crisis. Understanding the pre-operative risks of chronic narcotic use may be aided by investigating psychological measures. The objective of the reported study is to determine predictors of narcotic use two years post-surgery in a thoracolumbar spine surgery population, including an array of psychological factors. A prospective observational study of 191 consecutively enrolled adult patients who underwent thoracolumbar spine surgery is presented. Baseline measures of interest included the Pain Catastrophizing Scale (PCS), the Tampa Scale for Kinesiophobia, the Multidimensional Scale of Perceived Social Support (MSPSS), the Chronic Pain Acceptance Questionnaire (CPAQ-8), the Oswestry Disability Index (ODI), Numeric Rating Scales for back and leg pain (NRS-B/L), the SF-12 Mental Component Summary (MCS), narcotic use, and demographic variables. The post-operative measure of interest was narcotic use at the 2-year follow-up, collapsed into the binary categories of use and no use. Descriptive statistics were run; chi-square analysis was used for categorical variables and ANOVA for continuous variables. Significant variables were entered into a hierarchical logistic regression to determine predictors of post-operative narcotic use. Significance was set at α < 0.05. Results: A total of 27.23% of the sample were using narcotics two years after surgery. The regression model included ODI, NRS-Leg, time with condition, chief complaint, pre-operative drug use, gender, MCS, the PCS helplessness subscale, and the CPAQ pain willingness subscale, and was significant, χ²(13, N = 191) = 54.99, p < .001. The model accounted for 39.6% of the variance in narcotic use and predicted correctly in 79.7% of cases. The psychological variables accounted for 9.6% of the variance over and above the other predictors. Conclusions: Managing chronic narcotic use is central to the patient's overall health and quality of life. Psychological factors in the pre-operative period are significant predictors of narcotic use 2 years post-operatively. These psychological variables are malleable, potentially allowing surgeons to direct their patients to preventative resources prior to surgery.
Keywords: narcotics, psychological factors, quality of life, spine surgery
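The hierarchical (blockwise) logistic regression the authors report can be illustrated on simulated data: fit the clinical block first, add the psychological block, and measure the added explained variance as the gain in McFadden's pseudo-R². The cohort below is synthetic and the predictor names are stand-ins; only the procedure mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated stand-in cohort (the study's real predictors -- ODI, NRS-Leg,
# PCS helplessness, CPAQ pain willingness, etc. -- are not public)
n = 191
clinical = rng.standard_normal((n, 4))
psych = rng.standard_normal((n, 2))
logit = clinical[:, 0] - 1.0 + 1.5 * psych[:, 0]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

def log_likelihood(X, y, iters=50):
    """Fit a logistic regression by Newton-Raphson; return its log-likelihood."""
    X1 = np.hstack([np.ones((len(y), 1)), X])      # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X1 @ w))
        H = X1.T @ (X1 * (p * (1 - p))[:, None]) + 1e-8 * np.eye(X1.shape[1])
        w += np.linalg.solve(H, X1.T @ (y - p))
    p = np.clip(1 / (1 + np.exp(-X1 @ w)), 1e-12, 1 - 1e-12)
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Hierarchical entry: clinical block first, psychological block added second
ll_null = log_likelihood(np.empty((n, 0)), y)      # intercept-only model
ll_base = log_likelihood(clinical, y)
ll_full = log_likelihood(np.hstack([clinical, psych]), y)

r2_base = 1 - ll_base / ll_null                    # McFadden pseudo-R^2
r2_full = 1 - ll_full / ll_null
print(f"pseudo-R^2 added by the psychological block: {r2_full - r2_base:.3f}")
```

The difference r2_full − r2_base plays the role of the 9.6% "over and above" figure in the abstract, though the study's exact variance measure (likely Nagelkerke's R²) is not stated.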
5261 A Hybrid of BioWin and Computational Fluid Dynamics Based Modeling of Biological Wastewater Treatment Plants for Model-Based Control
Authors: Komal Rathore, Kiesha Pierre, Kyle Cogswell, Aaron Driscoll, Andres Tejada Martinez, Gita Iranipour, Luke Mulford, Aydin Sunol
Abstract:
Modeling of biological wastewater treatment plants requires several parameters for kinetic rate expressions, thermo-physical properties, and hydrodynamic behavior. The kinetics and associated mechanisms become complex because several biological processes take place in wastewater treatment plants at varying time and spatial scales. A dynamic process model incorporating a complex model of activated sludge kinetics was developed using the BioWin software platform for an advanced wastewater treatment plant in Valrico, Florida. Due to the extensive number of tunable parameters, an experimental design was employed for judicious selection of the most influential parameter sets and their bounds. The model was tuned using both influent and effluent plant data to reconcile and rectify the forecasts of the BioWin model; the amount of mixed liquor suspended solids in the oxidation ditch, the aeration rates, and the recycle rates were adjusted accordingly. The experimental analysis and plant SCADA data were used to predict influent wastewater rates and composition profiles as a function of time for extended periods. The lumped dynamic model development was coupled with Computational Fluid Dynamics (CFD) modeling of key units such as the oxidation ditches. Several CFD models incorporating nitrification-denitrification kinetics as well as hydrodynamics were developed and are being tested using the ANSYS Fluent software platform. These realistic and verified BioWin and ANSYS models were used to plan operating policies and control strategies for the biological wastewater plant in advance, which in turn allows regulatory compliance at minimum operational cost. With a little tuning, these models can be used for other biological wastewater treatment plants as well.
The BioWin model mimics the existing performance of the Valrico plant, which allowed the operators and engineers to predict effluent behavior and take control actions to meet the plant's discharge limits. With the help of this model, we were also able to identify the kinetic and stoichiometric parameters that matter most for modeling biological wastewater treatment plants. Another important finding was the effect of mixed liquor suspended solids and recycle ratios on the effluent concentrations of parameters such as total nitrogen, ammonia, nitrate, and nitrite. The ANSYS model showed, for example, that dead zones form increasingly along the length of the oxidation ditches compared with the regions near the aerators. These profiles were also very useful in studying mixing patterns, the effect of aerator speed, and the use of baffles, which in turn helps in optimizing plant performance.
Keywords: computational fluid dynamics, flow-sheet simulation, kinetic modeling, process dynamics
5260 Theorising Chinese as a Foreign Language Curriculum Justice in the Australian School Context
Authors: Wen Xu
Abstract:
The expansion of Confucius Institutes and Chinese as a Foreign Language (CFL) education is often regarded in Western public opinion as cultural invasion and part of a much bigger, if not ambitious, agenda of the Chinese central government. The CFL knowledge and teaching practices inherent in textbooks are also harshly critiqued for failing to align with Western educational principles. This paper takes up these concerns and attempts to articulate that Confucius's idea of 'education without discrimination' appears to have become synonymous with the social justice touted in contemporary Australian education and policy discourses. To do so, it capitalises on Bernstein's conceptualization of classification and pedagogic rights to articulate the CFL curriculum's potential to draw in and draw out curriculum boundaries in order to achieve educational justice. In this way, the potentially useful knowledge of CFL constitutes a worthwhile tool for engaging with the educational issues of a peripheral Western country, as well as for including disenfranchised students in multicultural Australian society. The paper opens spaces for critically theorising CFL curricular justice in Australian educational contexts and makes an original contribution to the scholarly argument that the CFL curriculum has the potential to include socially and economically disenfranchised students in schooling.
Keywords: curriculum justice, Chinese as a Foreign Language curriculum, Bernstein, equity
5259 Global Supply Chain Tuning: Role of National Culture
Authors: Aleksandr S. Demin, Anastasiia V. Ivanova
Abstract:
Purpose: The current economy tends to increase the influence of digital technologies and diminish the human role in management. However, it is impossible to deny that a person still leads a business with his or her own set of values and priorities. The article presented aims to incorporate the peculiarities of national culture and the characteristics of the supply chain, using the quantitative measures of national culture obtained by scholars of comparative management (Hofstede, House, and others). Design/Methodology/Approach: The research is based on secondary data in the field of cross-country comparison produced by Prof. Hofstede and obtained in the GLOBE project. These data are used to design different aspects of the supply chain at both the cross-functional and inter-organizational levels. The connection between a range of principles in general management (role assignment, customer service prioritization, coordination of supply chain partners) and in comparative management (acknowledgment of the national peculiarities of the country in which the company operates) is shown through economic and mathematical models, mainly linear programming models. Findings: The combination of the team management wheel concept, the business processes of the global supply chain, and the national culture characteristics lets a transnational corporation form a supply chain crew balanced in costs, functions, and personality. To elaborate an effective customer service policy and logistics strategy for the distribution of goods and services in the country under review, two approaches are offered. The first approach relies exclusively on the customers' interests in the place of operation, while the second also takes into account the position of the transnational corporation and its previous experience, in order to reconcile organizational and national cultures.
The effect of integration practice on the achievement of a specific supply chain goal in a specific location is assessed via the type of correlation (positive, negative, none) and the values of the national culture indices. Research Limitations: The models developed are intended for use by transnational companies and business firms located in several nationally distinct areas. Some of the inputs used to illustrate the methods offered are simulated, so the numerical results should be used with caution. Practical Implications: The research can be of great interest to supply chain managers responsible for engineering global supply chains in a transnational corporation and for subsequent international business activities. The methods, tools, and approaches suggested can also be used by top managers searching for new sources of competitiveness and are suitable for any staff members interested in national culture traits. Originality/Value: The elaborated methods of decision-making with regard to the national environment provide a mathematical and economic basis for finding a comprehensive solution.
Keywords: logistics integration, logistics services, multinational corporation, national culture, team management, service policy, supply chain management
5258 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes
Authors: Ritwik Dutta, Marylin Wolf
Abstract:
This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system allows trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver
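The daily-snapshot query the back-end performs — filter one user's readings for a given day and compare each metric against its goal — can be sketched over plain JSON-shaped dictionaries. The document fields below are guesses at the schema, not the project's actual one; against the real Mongo collection the same predicate would go into a PyMongo find() filter.

```python
import json

# Guessed document shape for the Mongo collection: one reading per
# metric per day, plus a per-user goal profile
readings = [
    {"user": "alice", "date": "2024-05-01", "metric": "steps", "value": 7400},
    {"user": "alice", "date": "2024-05-01", "metric": "sleep_hours", "value": 6.2},
    {"user": "alice", "date": "2024-04-30", "metric": "steps", "value": 9100},
]
goals = {"alice": {"steps": 8000, "sleep_hours": 7.5}}

def daily_snapshot(user, day):
    """Format one user's readings for one day against their target goals."""
    rows = [r for r in readings if r["user"] == user and r["date"] == day]
    return {
        r["metric"]: {
            "value": r["value"],
            "goal": goals[user][r["metric"]],
            "met": r["value"] >= goals[user][r["metric"]],
        }
        for r in rows
    }

snap = daily_snapshot("alice", "2024-05-01")
print(json.dumps(snap, indent=2))
```

New metrics slot in by adding documents with a new "metric" name and a matching goal entry, which is the extensibility property the paper emphasizes.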
5257 The Initiation of Privatization, Market Structure, and Free Entry with Vertically Related Markets
Authors: Hung-Yi Chen, Shih-Jye Wu
Abstract:
The existing literature provides little discussion of why a public monopolist would give up its dominant market position and allow private firms to enter the market. We argue that the privatization of a public monopolist in a vertically related market may induce the entry of private firms. We develop a model of a mixed oligopoly with vertically related markets to explain the change from a public monopoly to a mixed oligopoly and examine issues in privatizing the downstream public enterprise in both the short run and the long run. We first show that the welfare-maximizing public monopoly is suboptimal in vertically related markets because privatization reduces the input price charged by the upstream foreign monopolist. Second, privatization induces the entry of private firms, since the input price decreases after privatization. Third, we demonstrate that completely privatizing the public firm becomes a possible solution if the entry cost of private firms is low. Finally, we indicate that the public firm should be partially privatized if free entry of private firms is allowed.
JEL classification: F12, F14, L32, L33
Keywords: free entry, mixed oligopoly, public monopoly, the initiation of privatization, vertically related markets
5256 Evaluation of the Effect of Milk Recording Intervals on the Accuracy of an Empirical Model Fitted to Dairy Sheep Lactations
Authors: L. Guevara, Glória L. S., Corea E. E, A. Ramírez-Zamora M., Salinas-Martinez J. A., Angeles-Hernandez J. C.
Abstract:
Mathematical models are useful for identifying the characteristics of sheep lactation curves in order to develop and implement improved strategies. However, the accuracy of these models is influenced by factors such as the recording regime, mainly the interval between test-day records (TDR). The current study aimed to evaluate the effect of different TDR intervals on the goodness of fit of the Wood model (WM) applied to dairy sheep lactations. A total of 4,494 weekly TDRs from 156 lactations of dairy crossbred sheep were analyzed. Three new databases were generated from the original weekly TDR data (7D), comprising intervals of 14 (14D), 21 (21D), and 28 (28D) days. The parameters of the WM were estimated using the "minpack.lm" package in the R software. The shape of the lactation curve (typical or atypical) was defined based on the WM parameters. The goodness of fit was evaluated using the mean square of prediction error (MSPE), the root of the MSPE (RMSPE), Akaike's information criterion (AIC), the Bayesian information criterion (BIC), and the coefficient of correlation (r) between the actual and estimated total milk yield (TMY). The WM gave an adequate estimate of TMY regardless of the TDR interval (P=0.21) and the shape of the lactation curve (P=0.42). However, we found higher values of r for typical curves than for atypical curves (0.9 vs. 0.74), with the highest values for the 28D interval (r=0.95). Likewise, we observed an overestimated peak yield (0.92 vs. 6.6 l) and an underestimated time of peak yield (21.5 vs. 1.46) in atypical curves. The best RMSPE values were observed for the 28D interval for both lactation curve shapes. The significantly lowest values of AIC (P=0.001) and BIC (P=0.001) were shown by the 7D interval for typical and atypical curves. These results represent a first approach to defining an adequate recording interval for dairy sheep in Latin America and show a better fit of the Wood model for the 7D interval. However, it is possible to obtain good estimates of TMY using a 28D interval, which reduces the sampling frequency and would save costs for dairy sheep producers.
Keywords: incomplete gamma, ewes, curve shapes, modeling
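The Wood model the study fits, y(t) = a·t^b·e^(−ct), log-linearizes to ln y = ln a + b ln t − c t, so a quick sketch can recover its parameters (and the derived time of peak yield, b/c) by ordinary least squares on simulated weekly (7D) records. The true parameter values and noise level below are invented for the illustration; the study itself fits the nonlinear form with minpack.lm in R.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated weekly (7D) test-day records from invented true parameters
t = np.arange(7, 147, 7).astype(float)             # days 7, 14, ..., 140
a_true, b_true, c_true = 1.8, 0.25, 0.015
y = a_true * t**b_true * np.exp(-c_true * t)
y *= np.exp(0.01 * rng.standard_normal(t.size))    # multiplicative noise

# ln y = ln a + b ln t - c t  ->  ordinary least squares
X = np.column_stack([np.ones_like(t), np.log(t), -t])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]

peak_time = b / c                                  # day of peak yield
peak_yield = a * peak_time**b * np.exp(-c * peak_time)
print(f"a={a:.3f} b={b:.3f} c={c:.4f} peak at day {peak_time:.1f}")
```

Thinning t to every fourth record mimics the 28D interval the study evaluates: the fit degrades gracefully, which is consistent with the reported finding that 28D data still give usable TMY estimates.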
5255 The Achievement Model of University Social Responsibility
Authors: Le Kang
Abstract:
On the research question of 'how to achieve USR', this contribution reflects on the concept of university social responsibility and identifies three achievement models of USR: the society-diversified model, the university-cooperation model, and the government-compound model. It also presents a case study exploring the characteristics of the Chinese achievement model of USR. The contribution concludes with a discussion of how universities, government, and society balance demands and roles and make the necessary strategic adjustments and innovative approaches to repair the shortcomings of each achievement model.
Keywords: modern university, USR, achievement model, compound model
5254 Modelling Retirement Outcomes: An Australian Case Study
Authors: Colin O’Hare, Zili Zho, Thomas Sneddon
Abstract:
The Australian superannuation system has received high praise for its participation rates and level of funding in retirement, yet it is only 25 years old. In recent years, with increasing longevity and persistently lower rates of investment return, how adequate will the funds accumulated through a superannuation system be? In this paper, we take Australia as a case study, build a stochastic model of the accumulation and decumulation of funds, and determine the expected number of years a fund may last an individual in retirement.
Keywords: component, mortality, stochastic models, superannuation
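A stochastic accumulation/decumulation model of the kind described above can be sketched as a small Monte Carlo: grow a balance with contributions and random annual returns, then draw it down and count the years until exhaustion. Every parameter below (contribution, drawdown, return distribution, caps) is a placeholder for illustration, not a figure from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Placeholder assumptions: 30 contribution years, a fixed real drawdown,
# and lognormal annual investment returns
sims, work_years, cap = 2000, 30, 60
contribution, drawdown = 12_000.0, 45_000.0        # real AUD per year
mu, sigma = 0.05, 0.10                             # log-return mean / st. dev.

years_lasted = np.zeros(sims)
for s in range(sims):
    balance = 0.0
    for _ in range(work_years):                    # accumulation phase
        balance = balance * np.exp(rng.normal(mu, sigma)) + contribution
    years = 0
    while balance > 0 and years < cap:             # decumulation phase
        balance = balance * np.exp(rng.normal(mu, sigma)) - drawdown
        years += 1
    years_lasted[s] = years

print(f"mean years of retirement funded (capped at {cap}): "
      f"{years_lasted.mean():.1f}")
```

The paper's model additionally incorporates mortality, so its quantity of interest is the expected shortfall relative to remaining lifetime rather than a fixed drawdown horizon.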