Search results for: technical trading signal
3167 In-Game Business and the Problem of Gambling: Legal Analysis of Loot Boxes from the Perspective of Iranian Law
Authors: Vesali Naseh Morteza, Najafi Mohammad Hosein
Abstract:
The possibility of trading in-game items for real money gives online games considerable economic capacity and turns them into a business model. The market for in-game item purchases and microtransactions (micropayments) has been growing rapidly. Since this market must operate legally, lawyers and lawmakers around the world have expressed concerns over the legality of online gaming and in-game transactions. The issue is highlighted by the recent emergence of an in-game business model known as loot boxes. Similarities between loot-box mechanics and gambling activities have started a legal debate as to whether loot boxes constitute a form of gambling or whether a game's use of loot boxes should itself be considered gambling. Hence, based on the relationship between loot-box purchasing and problem gambling, the paper investigates the legal effect of the newly emergent phenomenon of loot boxes on online games from the perspective of Iranian law.
Keywords: serious games, loot boxes, online gambling, in-game purchase, virtual items
Procedia PDF Downloads 107
3166 Drugs, Silk Road, Bitcoins
Authors: Lali Khurtsia, Vano Tsertsvadze
Abstract:
Georgian drug policy is directed at reducing the supply of drugs. Retrospective analysis has shown that law enforcement activities have been followed by the expulsion of particular injecting drugs from the market. Demand, however, remains unchanged, and the displaced drugs are substituted by even more dangerous homemade drugs entering the market. To identify expected new trends on the Georgian drug market, a qualitative study was conducted with Georgian drug users to determine drug supply routes. It turned out that drug suppliers and consumers, for safety reasons and to protect their anonymity, use Skype to make deals. Worldwide, the use of IT in the illegal drug trade is even more sophisticated. Trading with Bitcoins on the Darknet ensures high confidentiality of money transactions and the safe circulation of drugs. In 2014, the largest Bitcoin mining enterprise in the world was built in Georgia. We argue that the use of Bitcoins and the Darknet by Georgian drug consumers and suppliers will allow the market to respond to the government's policy of restricting supply in order to satisfy market demand for drugs.
Keywords: bitcoin, darknet, drugs, policy
Procedia PDF Downloads 441
3165 Design of Labview Based DAQ System
Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid
Abstract:
The Information Computing System of Monitoring (ICSM) for the research reactor of the Tajoura Nuclear Research Centre (TNRC) has been out of service since early 1991. According to the regulations, the computer is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI 1001 chassis; the chassis houses four SCXI 1100 modules, each of which can handle 32 variables. The chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited for high-level design. It allows integrating different signal processing components or subsystems within a graphical framework. The results showed the system's capabilities in monitoring variables and in acquiring and saving data, as well as the capability of LabVIEW to control the DAQ hardware.
Keywords: data acquisition, labview, signal conditioning, national instruments
Procedia PDF Downloads 496
3164 Impact of Geomagnetic Storm on Ionosphere
Authors: Affan Ahmed
Abstract:
This research investigates the impact of the geomagnetic storm occurring from April 22 to April 26, 2023, on the Earth's ionosphere, with a focus on analyzing specific ionospheric parameters to understand the storm's effects on ionospheric stability and GNSS signal propagation. Geomagnetic storms, caused by intensified solar wind-magnetosphere interactions, can significantly disturb ionospheric conditions, impacting electron density, Total Electron Content (TEC), and thermospheric composition. Such disturbances are particularly relevant to satellite-based navigation and communication systems, as fluctuations in ionospheric parameters can degrade signal integrity and reliability. In this study, data were obtained from multiple sources, including OMNIWeb for parameters like Dst, Kp, Bz, Electric Field, and solar wind pressure, GUVI for O/N₂ ratio maps, and TEC data from low-, mid-, and high-latitude stations available on the IONOLAB website. Additional Equatorial Electrojet (EEJ) and geomagnetic data were acquired from INTERMAGNET. The methodology involved comparing storm-affected data from April 22 to April 26 with quiet days in April 2023, using statistical and wavelet analysis to assess variations in parameters like TEC, O/N₂ ratio, and geomagnetic indices. The results show pronounced fluctuations in TEC and other ionospheric parameters during the main phase of the storm, with spatial variations observed across latitudes, highlighting the global response of the ionosphere to geomagnetic disturbances. The findings underline the storm's significant impact on ionospheric composition, particularly in mid- and high-latitude regions, which correlates with increased GNSS signal interference in these areas. This study contributes to understanding the ionosphere's response to geomagnetic activity, emphasizing the need for robust models to predict and mitigate space weather effects on GNSS-dependent technologies.
Keywords: geomagnetic storms, ionospheric disturbances, space weather effects, magnetosphere-ionosphere coupling
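Illustrative sketch (not part of the abstract): a minimal Python example of the storm-versus-quiet-day comparison described above; the hourly TEC values are synthetic stand-ins for the IONOLAB station data.

```python
import numpy as np

def tec_storm_deviation(storm_tec, quiet_tec_days):
    """Compare storm-time TEC against a quiet-day baseline.

    storm_tec      : 1-D array of hourly TEC values for a storm day (TECU)
    quiet_tec_days : 2-D array (n_quiet_days x 24) of hourly TEC on quiet days
    Returns the hour-by-hour absolute and percentage deviations from the
    quiet-day mean.
    """
    quiet_mean = quiet_tec_days.mean(axis=0)   # hourly quiet-day average
    dtec = storm_tec - quiet_mean              # absolute deviation (TECU)
    pct = 100.0 * dtec / quiet_mean            # relative deviation (%)
    return dtec, pct

# Synthetic example: a storm day with an afternoon TEC enhancement
rng = np.random.default_rng(0)
hours = np.arange(24)
quiet = 10 + 8 * np.sin(np.pi * hours / 24) + rng.normal(0, 0.3, (5, 24))
storm = 10 + 8 * np.sin(np.pi * hours / 24) + 6 * np.exp(-((hours - 15) / 3) ** 2)
dtec, pct = tec_storm_deviation(storm, quiet)
print("Peak TEC deviation: %.1f TECU (%.0f%%)" % (dtec.max(), pct[dtec.argmax()]))
```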
Procedia PDF Downloads 13
3163 Impact of Trade Cooperation of BRICS Countries on Economic Growth
Authors: Svetlana Gusarova
Abstract:
The essential role in the recent development of the world economy has passed to the developing countries, notably the BRICS countries (Brazil, Russia, India, China, South Africa). Over the next 50 years, the BRICS countries are expected to be the engines of global trade and economic growth. Trade cooperation among the BRICS countries can enhance their economic development. The BRICS countries were among the top 10 world exporters of office and telecom equipment, textiles, clothing, iron and steel, chemicals, agricultural products, automotive products, and fuel and mining products. China was one of the main trading partners of all BRICS countries, maintaining a close trading relationship with each of them. The author analyzed the trade complementarity of the BRICS countries and revealed a high level of complementarity of their trade flows, owing to their specialization in different types of goods. Correlation and regression analysis of the relationship between intra-BRICS merchandise turnover and their GDP (PPP) revealed a very strong impact of trade on the development of their economies.
Keywords: BRICS countries, trade cooperation, complementarity, regression analysis
Procedia PDF Downloads 284
3162 Stability Analysis and Controller Design of Further Development of Miniaturized Mössbauer Spectrometer II for Space Applications with Focus on the Extended Lyapunov Method – Part I –
Authors: Mohammad Beyki, Justus Pawlak, Robert Patzke, Franz Renz
Abstract:
In the context of planetary exploration, MIMOS II (the miniaturized Mössbauer spectrometer) serves as a proven and reliable measuring instrument. The transmission behaviour of the electronics for Mössbauer spectroscopy is newly developed and optimized. For this purpose, the overall electronics is split into three parts. This elaboration deals exclusively with the first part of the signal chain, used for the evaluation of photons in experiments with gamma radiation. In parallel with the analysis of the electronics, a new method for the stability assessment of linear and non-linear systems is presented: the extended method of Lyapunov's stability criteria. The design helps to weigh advantages and disadvantages against other simulated circuits in order to optimize MIMOS II for terrestrial and extraterrestrial measurement. Finally, after the stability analysis, the controller design according to Ackermann is performed, achieving the best possible optimization of the output variable through a skillful pole assignment.
Keywords: Mössbauer spectroscopy, electronic signal amplifier, light processing technology, photocurrent, trans-impedance amplifier, extended Lyapunov method
Procedia PDF Downloads 100
3161 Heterogeneity, Asymmetry and Extreme Risk Perception; Dynamic Evolution Detection From Implied Risk Neutral Density
Authors: Abderrahmen Aloulou, Younes Boujelbene
Abstract:
The current paper presents a new method of extracting information content from option prices by eliminating biases caused by the daily variation of contract maturity. Based on a kernel regression tool, this non-parametric technique serves to obtain a spectrum of interpolated options with constant maturity horizons from contracts negotiated on the S&P TSX 60 index. This method makes it possible to compare daily risk-neutral densities, from which time-continuous indicators are extracted that allow the detection of the evolution of traders' attitudes, such as belief homogeneity, asymmetry and extreme risk perception. Our findings indicate that the applied method contributes to developing effective trading strategies and to adjusting monetary policies by monitoring traders' reactions to economic and monetary news.
Keywords: risk neutral densities, kernel, constant maturity horizons, homogeneity, asymmetry and extreme risk perception
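Illustrative sketch (not part of the abstract): a minimal Nadaraya-Watson (Gaussian-kernel) regression in Python for interpolating an option-implied quantity to a constant maturity horizon; the maturities, values and bandwidth are hypothetical.

```python
import numpy as np

def nadaraya_watson(x_query, x_obs, y_obs, bandwidth):
    """Gaussian-kernel (Nadaraya-Watson) regression estimate of y at x_query."""
    x_query = np.atleast_1d(x_query)
    w = np.exp(-0.5 * ((x_query[:, None] - x_obs[None, :]) / bandwidth) ** 2)
    return (w * y_obs).sum(axis=1) / w.sum(axis=1)

# Interpolate an option-implied quantity (e.g. implied volatility) to a
# constant 30-day maturity from contracts with scattered maturities.
maturities = np.array([7, 14, 21, 35, 63, 91], dtype=float)    # days to expiry
imp_vol    = np.array([0.21, 0.20, 0.19, 0.185, 0.18, 0.175])  # observed values
vol_30d = nadaraya_watson(30.0, maturities, imp_vol, bandwidth=10.0)
print("Constant 30-day maturity estimate: %.4f" % vol_30d[0])
```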
Procedia PDF Downloads 488
3160 Feasibility Study of Implementing Electronic Commerce in Food Industries with a Case Study
Authors: Maryam Safarirad
Abstract:
Fast and increasing growth of electronic commerce (e-commerce) in developed countries and its resulting competitive advantages mean that those countries should dramatically revise their trade and commercial strategies and policies. Given the importance of the food industry in Iran, the current paper studies the feasibility of implementing an e-commerce system in Shiraz's petrochemical unit. The statistical population of the study includes 29 senior managers and experts of the food industries. In the present research, the opinions of senior managers and experts on feasibility have been examined, and feedback has been drawn from those opinions. The current research concludes that the organization under study is not in a favorable state in either software or hardware. Implementation of the e-commerce system in the food industries would reduce the average value of transaction costs.
Keywords: electronic trading, electronic commerce, electronic exchange of information, feasibility study, information technology, virtual shopping, computer networks, electronic commerce laws, food industry
Procedia PDF Downloads 416
3159 DesignChain: Automated Design of Products Featuring a Large Number of Variants
Authors: Lars Rödel, Jonas Krebs, Gregor Müller
Abstract:
The growing price pressure due to the increasing number of global suppliers, the growing individualization of products and ever-shorter delivery times are upcoming challenges in the industry. In this context, Mass Personalization stands for the individualized production of customer products in batch size 1 at the price of standardized products. The possibilities of digitalization and automation of technical order processing open up the opportunity for companies to significantly reduce their cost of complexity and lead times and thus enhance their competitiveness. Many companies already use a range of CAx tools and configuration solutions today. Often, the expert knowledge of employees is hidden in "knowledge silos" and is rarely networked across processes. DesignChain describes the automated digital process from the recording of individual customer requirements, through design and technical preparation, to production. Configurators offer the possibility of mapping variant-rich products within the DesignChain. This transformation of customer requirements into product features makes it possible to generate even complex CAD models, such as those for large-scale plants, on a rule-based basis. With the aid of an automated CAx chain, production-relevant documents are thus transferred digitally to production. This process, which can be fully automated, allows variants to always be generated on the basis of current version statuses.
Keywords: automation, design, CAD, CAx
Procedia PDF Downloads 76
3158 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison
Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo
Abstract:
One research line of computer science involves the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them for the control of electronic devices. In parallel, affective computing research applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware involves the sensing stage and analog-to-digital conversion. The interface software involves algorithms for pre-processing of the signal, time- and frequency-domain analysis, and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately using a publicly accessible database, and a comparison among classifiers was carried out to determine the best-performing one.
Keywords: affective computing, interface, brain, intelligent interaction
Procedia PDF Downloads 390
3157 Predictive Maintenance of Electrical Induction Motors Using Machine Learning
Authors: Muhammad Bilal, Adil Ahmed
Abstract:
This study proposes an approach for electrical induction motor predictive maintenance utilizing machine learning algorithms. On the basis of temperature data obtained from sensors placed on the motor, the goal is to predict motor failures. The proposed models are trained to identify whether a motor is defective or not by utilizing machine learning algorithms such as Support Vector Machines (SVM) and K-Nearest Neighbors (KNN). According to a thorough study of the literature, earlier research has used motor current signature analysis (MCSA) and vibration data to forecast motor failures. The temperature signal methodology, which has clear advantages over the conventional MCSA and vibration analysis methods in terms of cost-effectiveness, is the main subject of this research. The acquired results emphasize the applicability and effectiveness of the temperature-based predictive maintenance strategy by demonstrating the successful categorization of defective motors using the suggested machine learning models.
Keywords: predictive maintenance, electrical induction motors, machine learning, temperature signal methodology, motor failures
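Illustrative sketch (not part of the abstract): a minimal scikit-learn example of training SVM and KNN classifiers on temperature-derived features; the feature set and data are synthetic stand-ins, not the study's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in data: rows are motors, columns are temperature features
# (e.g. mean, max and variance of the winding temperature over a run).
rng = np.random.default_rng(0)
healthy = rng.normal([60, 75, 4], [3, 4, 1], size=(200, 3))
faulty  = rng.normal([70, 95, 9], [4, 6, 2], size=(200, 3))
X = np.vstack([healthy, faulty])
y = np.array([0] * 200 + [1] * 200)          # 0 = healthy, 1 = defective

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy: %.3f" % clf.score(X_te, y_te))
```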
Procedia PDF Downloads 119
3156 Development of an NIR Sorting Machine, an Experimental Study in Detecting Internal Disorder and Quality of Apple Fruit
Authors: Eid Alharbi, Yaser Miaji
Abstract:
The quality level of fresh fruit is very important for the fruit industries. In the present study, an automatic online sorting system based on internal disorder for fresh apple fruit has been developed using near infrared (NIR) spectroscopic technology. An automatic conveyor belt system along with a sorting mechanism was constructed. To check the internal quality of the apple fruit, each apple was exposed to NIR radiation in the range 650-1300 nm and the data were collected in the form of absorption spectra. The collected data were compared to the reference (data of known samples), analyzed, and an electronic signal was passed to the sorting system. The sorting system separated the apple fruit samples according to the electronic signal passed to it. It was found that the absorption of NIR radiation in the range 930-950 nm was higher in the internally defective samples compared to healthy samples. The online sorting system was constructed on the basis of this high absorption of NIR radiation in the 930-950 nm region.
Keywords: mechatronics, NIR, fruit quality, spectroscopic technology, mechatronic design
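Illustrative sketch (not part of the abstract): a minimal Python example of the 930-950 nm band decision rule described above; the spectrum and threshold are synthetic and hypothetical.

```python
import numpy as np

def classify_apple(wavelengths_nm, absorbance, threshold):
    """Flag an apple as internally defective when its mean NIR absorbance
    in the 930-950 nm band exceeds a threshold learned from reference fruit."""
    band = (wavelengths_nm >= 930) & (wavelengths_nm <= 950)
    band_mean = absorbance[band].mean()
    return ("defective" if band_mean > threshold else "healthy"), band_mean

# Synthetic spectrum over the 650-1300 nm working range of the sorter
wl = np.linspace(650, 1300, 651)
spectrum = 0.3 + 0.1 * np.exp(-((wl - 940) / 15.0) ** 2)   # elevated 930-950 nm band
label, value = classify_apple(wl, spectrum, threshold=0.35)
print(label, "(band mean absorbance = %.3f)" % value)
```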
Procedia PDF Downloads 391
3155 A Simple and Efficient Method for Accurate Measurement and Control of Power Frequency Deviation
Authors: S. J. Arif
Abstract:
In the presented technique, a simple method is given for accurate measurement and control of power frequency deviation. The sinusoidal signal for which the frequency deviation measurement is required is transformed to a low voltage level and passed through a zero crossing detector to convert it into a pulse train. Another stable square wave signal of 10 kHz is obtained using a crystal oscillator and decade dividing assemblies (DDA). These signals are combined digitally and then passed through decade counters to give a unique combination of pulses or levels, which are further encoded to make them equally suitable for both control applications and display units. The developed circuit using discrete components has a resolution of 0.5 Hz and completes a measurement within 20 ms. The realized circuit is simulated and synthesized using Verilog HDL and subsequently implemented on an FPGA. The results of measurement on the FPGA are observed on a very high resolution logic analyzer. These results accurately match the simulation results as well as the results of the same circuit implemented with discrete components. The proposed system is suitable for accurate measurement and control of power frequency deviation.
Keywords: digital encoder for frequency measurement, frequency deviation measurement, measurement and control systems, power systems
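Illustrative sketch (not part of the abstract): a software analogue, in Python, of estimating the frequency deviation from zero crossings inside a 20 ms window sampled against a 10 kHz clock; the hardware encoder itself is not reproduced here.

```python
import numpy as np

def measure_frequency(signal, fs, nominal=50.0):
    """Estimate the power frequency from the zero crossings contained in a
    short acquisition window, then report the deviation from nominal."""
    s = np.asarray(signal, dtype=float)
    # indices just before a sign change (zero crossing of either direction)
    sb = np.signbit(s).astype(np.int8)
    idx = np.where(np.diff(sb) != 0)[0]
    if len(idx) < 2:
        return None, None
    # linear interpolation of each crossing instant for sub-sample accuracy
    frac = -s[idx] / (s[idx + 1] - s[idx])
    t = (idx + frac) / fs
    # consecutive zero crossings are half a period apart
    freq = (len(t) - 1) / (2.0 * (t[-1] - t[0]))
    return freq, freq - nominal

fs = 10_000                                    # 10 kHz, like the crystal-derived reference
t = np.arange(0, 0.020, 1.0 / fs)              # one 20 ms measurement window
test = np.sin(2 * np.pi * 50.4 * t + 0.3)      # mains waveform at 50.4 Hz
f, df = measure_frequency(test, fs)
print("measured %.2f Hz, deviation %+.2f Hz" % (f, df))
```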
Procedia PDF Downloads 377
3154 Influence of Geomagnetic Storms on Ionospheric Parameters
Authors: Affan Ahmed
Abstract:
This research investigates the influence of the geomagnetic storm occurring from April 22 to April 26, 2023, on the Earth's ionosphere, with a focus on analyzing specific ionospheric parameters to understand the storm's effects on ionospheric stability and GNSS signal propagation. Geomagnetic storms, caused by intensified solar wind-magnetosphere interactions, can significantly disturb ionospheric conditions, impacting electron density, Total Electron Content (TEC), and thermospheric composition. Such disturbances are particularly relevant to satellite-based navigation and communication systems, as fluctuations in ionospheric parameters can degrade signal integrity and reliability. In this study, data were obtained from multiple sources, including OMNIWeb for parameters like Dst, Kp, Bz, Electric Field, and solar wind pressure, GUVI for O/N₂ ratio maps, and TEC data from low-, mid-, and high-latitude stations available on the IONOLAB website. Additional Equatorial Electrojet (EEJ) and geomagnetic data were acquired from INTERMAGNET. The methodology involved comparing storm-affected data from April 22 to April 26 with quiet days in April 2023, using statistical and wavelet analysis to assess variations in parameters like TEC, O/N₂ ratio, and geomagnetic indices. The results show pronounced fluctuations in TEC and other ionospheric parameters during the main phase of the storm, with spatial variations observed across latitudes, highlighting the global response of the ionosphere to geomagnetic disturbances. The findings underline the storm's significant impact on ionospheric composition, particularly in mid- and high-latitude regions, which correlates with increased GNSS signal interference in these areas. This study contributes to understanding the ionosphere's response to geomagnetic activity, emphasizing the need for robust models to predict and mitigate space weather effects on GNSS-dependent technologies.
Keywords: geomagnetic storms, ionospheric disturbances, space weather effects, magnetosphere-ionosphere coupling
Procedia PDF Downloads 13
3153 Method Optimisation for [¹⁸F]-FDG Rodent Imaging Studies
Authors: J. Visser, C. Driver, T. Ebenhan
Abstract:
[¹⁸F]-FDG (fluorodeoxyglucose) is a radiopharmaceutical compound that is used for non-invasive cancer tumour imaging through positron emission tomography (PET). This radiopharmaceutical is used to visualise the metabolic processes in tumour tissues, which can be applied for the diagnosis and prognosis of various types of cancer. [¹⁸F]-FDG has widespread use in both clinical and pre-clinical research settings. Imaging using [¹⁸F]-FDG results in representative normal tissue distribution as well as visualisation of hypermetabolic lesions ([¹⁸F]-FDG avid foci). The metabolic tissue concentration of these lesions following [¹⁸F]-FDG administration can be quantified using Standard Uptake Values (SUV). Standard uptake values of [¹⁸F]-FDG-based Positron Emission Tomography can be influenced by various biological and technical handling factors. Biological factors that affect [¹⁸F]-FDG uptake include the blood glucose levels of subjects, normal physiological variants between subjects and administration of certain pharmaceutical agents. Technical factors that can have an effect include the route of radiopharmaceutical or pharmaceutical agents administered and environmental conditions such as ambient temperature and lighting. These factors influencing tracer uptake need to be investigated to improve the robustness of the imaging protocol, which will achieve reproducible image acquisition across various research projects, optimised tumour visualisation and increased data validity and reliability.
Keywords: fluorodeoxyglucose, tumour imaging, rodent, blood glucose, PET/CT imaging
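Illustrative sketch (not part of the abstract): the body-weight-normalised SUV calculation mentioned above, in Python; the numerical values are hypothetical.

```python
def standard_uptake_value(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalised SUV for an [18F]-FDG PET region of interest.

    SUV = tissue activity concentration / (injected dose / body weight),
    with the tissue concentration in kBq/mL (== MBq/L), the decay-corrected
    injected dose in MBq and the body weight in kg (1 kg of soft tissue is
    treated as roughly 1 L).
    """
    return tissue_kbq_per_ml / (injected_dose_mbq / body_weight_kg)

# Hypothetical rodent-study values (not taken from the abstract)
suv = standard_uptake_value(tissue_kbq_per_ml=800.0,
                            injected_dose_mbq=10.0,
                            body_weight_kg=0.025)   # a 25 g mouse
print("SUV = %.2f" % suv)
```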
Procedia PDF Downloads 16
3152 Robust Data Image Watermarking for Data Security
Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan
Abstract:
In this paper, we propose a secure and robust data hiding algorithm based on the DCT, the Arnold transform and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and then the chaotic map is used to spread the watermark signal over the middle band of DCT coefficients of the cover image. The chaotic map can be used as a pseudo-random generator for digital data hiding, to increase security and robustness. Performance evaluation of the robustness and imperceptibility of the proposed algorithm has been made using the bit error rate (BER), normalized correlation (NC) and peak signal-to-noise ratio (PSNR) values for different watermark and cover images, such as the Lena, Girl and Tank images, and for different gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression as well as other attacks, such as addition of noise, low-pass filtering and cropping, compared to other existing algorithms using DCT coefficients. Moreover, to recover the watermarks in the proposed algorithm, there is no need for the original cover image.
Keywords: data hiding, watermarking, DCT, chaotic sequence, arnold transforms
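Illustrative sketch (not part of the abstract): a simplified Python example of Arnold cat map scrambling and additive mid-band DCT embedding; a key-seeded pseudo-random sequence stands in for the paper's chaotic map, and all parameters are hypothetical.

```python
import numpy as np
from scipy.fft import dctn, idctn

def arnold_scramble(img, iterations=1):
    """Arnold cat map permutation of a square N x N watermark image."""
    n = img.shape[0]
    out = img.copy()
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        out = out[(x + y) % n, (x + 2 * y) % n]
    return out

def embed_midband(cover, watermark_bits, key=42, alpha=8.0):
    """Additively spread +/-1 watermark bits over mid-band DCT coefficients
    of the cover image using a key-seeded pseudo-random spreading sequence."""
    coeffs = dctn(cover.astype(float), norm="ortho")
    h, w = coeffs.shape
    r, c = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    mid = (r + c > h // 4) & (r + c < 3 * h // 4)        # crude mid-band mask
    pos = np.flatnonzero(mid)[: len(watermark_bits)]
    chip = np.random.default_rng(key).choice([-1.0, 1.0], size=len(pos))
    flat = coeffs.ravel()
    flat[pos] += alpha * chip * watermark_bits
    return idctn(flat.reshape(h, w), norm="ortho")

cover = np.random.default_rng(1).integers(0, 256, (64, 64)).astype(float)
wm = arnold_scramble(np.random.default_rng(2).integers(0, 2, (8, 8)), 3)
bits = 2.0 * wm.ravel() - 1.0                            # map {0,1} -> {-1,+1}
marked = embed_midband(cover, bits)
print("embedding RMSE: %.2f" % np.sqrt(np.mean((marked - cover) ** 2)))
```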
Procedia PDF Downloads 515
3151 Computational Characterization of Electronic Charge Transfer in Interfacial Phospholipid-Water Layers
Authors: Samira Baghbanbari, A. B. P. Lever, Payam S. Shabestari, Donald Weaver
Abstract:
Existing signal transmission models, although undoubtedly useful, have proven insufficient to explain the full complexity of information transfer within the central nervous system. The development of transformative models will necessitate a more comprehensive understanding of neuronal lipid membrane electrophysiology. Pursuant to this goal, the role of highly organized interfacial phospholipid-water layers emerges as a promising case study. A series of phospholipids in neural-glial gap junction interfaces as well as cholesterol molecules have been computationally modelled using high-performance density functional theory (DFT) calculations. Subsequent 'charge decomposition analysis' calculations have revealed a net transfer of charge from phospholipid orbitals through the organized interfacial water layer before ultimately finding its way to cholesterol acceptor molecules. The specific pathway of charge transfer from phospholipid via water layers towards cholesterol has been mapped in detail. Cholesterol is an essential membrane component that is overrepresented in neuronal membranes as compared to other mammalian cells; given this relative abundance, its apparent role as an electronic acceptor may prove to be a relevant factor in further signal transmission studies of the central nervous system. The timescales over which this electronic charge transfer occurs have also been evaluated by utilizing a system design that systematically increases the number of water molecules separating lipids and cholesterol. Memory loss through hydrogen-bonded networks in water can occur at femtosecond timescales, whereas existing action potential-based models are limited to micro or nanosecond scales. As such, the development of future models that attempt to explain faster timescale signal transmission in the central nervous system may benefit from our work, which provides additional information regarding fast timescale energy transfer mechanisms occurring through interfacial water. The study's dataset includes six distinct phospholipids and a collection of cholesterol molecules. Ten optimized geometric characteristics (features) were employed to conduct binary classification through an artificial neural network (ANN), differentiating cholesterol from the various phospholipids. This stems from our understanding that all lipids within the first group function as electronic charge donors, while cholesterol serves as an electronic charge acceptor.
Keywords: charge transfer, signal transmission, phospholipids, water layers, ANN
Procedia PDF Downloads 75
3150 Topics of Blockchain Technology to Teach at Community College
Authors: Penn P. Wu, Jeannie Jo
Abstract:
Blockchain technology has rapidly gained popularity in industry. This paper attempts to assist academia in answering four questions. First, should community colleges begin offering education to nurture blockchain-literate students for the job market? Second, what are the appropriate topical areas to cover? Third, should it be an individual course? And fourth, should it be a technical or management course? This paper starts with identifying the knowledge domains of blockchain technology and the topical areas each domain has, continues with placing them in appropriate academic territories (Computer Sciences vs. Business) and subjects (programming, management, marketing, and laws), and then develops an evaluation model to determine the appropriate topical area for community colleges to teach. The evaluation is based on seven factors: maturity of technology, impacts on management, real-world applications, subject classification, knowledge prerequisites, textbook readiness, and recommended pedagogies. The evaluation results point in an interesting direction: offering an introductory course is an ideal option to guide students through the learning journey of what blockchain is and how it applies to business. Such an introductory course does not need to engage students in the discussions of mathematics and sciences that make blockchain technologies possible. While it is inevitable to briefly cover technical topics to help students build a solid knowledge foundation of blockchain technologies, community colleges should avoid offering students a course centered on the discussion of developing blockchain applications.
Keywords: blockchain, pedagogies, blockchain technologies, blockchain course, blockchain pedagogies
Procedia PDF Downloads 133
3149 Simscape Library for Large-Signal Physical Network Modeling of Inertial Microelectromechanical Devices
Authors: S. Srinivasan, E. Cretu
Abstract:
The information flow (e.g. block-diagram or signal flow graph) paradigm for the design and simulation of Microelectromechanical (MEMS)-based systems makes it easy to model MEMS devices using causal transfer functions and to interface them with electronic subsystems for fast system-level exploration of design alternatives and optimization. Nevertheless, the physical bi-directional coupling between different energy domains is not easily captured in causal signal flow modeling. Moreover, models of fundamental components acting as building blocks (e.g. gap-varying MEMS capacitor structures) depend not only on the component, but also on the specific excitation mode (e.g. voltage or charge actuation). In contrast, the energy flow modeling paradigm in terms of generalized across-through variables offers an acausal perspective, separating clearly the physical model from the boundary conditions. This promotes reusability and the use of primitive physical models for assembling MEMS devices from primitive structures, based on the interconnection topology in generalized circuits. The physical modeling capabilities of Simscape have been used in the present work in order to develop a MEMS library containing parameterized fundamental building blocks (area and gap-varying MEMS capacitors, nonlinear springs, displacement stoppers, etc.) for the design, simulation and optimization of MEMS inertial sensors. The models capture both the nonlinear electromechanical interactions and geometrical nonlinearities and can be used for both small and large signal analyses, including the numerical computation of pull-in voltages (stability loss). The Simscape behavioral modeling language was used for the implementation of reduced-order macro models, which present the advantage of a seamless interface with Simulink blocks for creating hybrid information/energy flow system models. Test bench simulations of the library models compare favorably with both analytical results and with more in-depth finite element simulations performed in ANSYS. Separate MEMS-electronic integration tests were done on closed-loop MEMS accelerometers, where Simscape was used for modeling the MEMS device and Simulink for the electronic subsystem.
Keywords: across-through variables, electromechanical coupling, energy flow, information flow, Matlab/Simulink, MEMS, nonlinear, pull-in instability, reduced order macro models, Simscape
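Illustrative note (not part of the abstract): for an idealized parallel-plate actuator with a linear spring, the pull-in voltage mentioned above has the classic closed-form expression sketched below in Python; the parameter values are hypothetical and the library's Simscape implementation is not reproduced here.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Classic pull-in voltage of a parallel-plate electrostatic actuator
    suspended by a linear spring: V_pi = sqrt(8*k*g0**3 / (27*eps0*A))."""
    return np.sqrt(8.0 * k * gap**3 / (27.0 * EPS0 * area))

# Hypothetical accelerometer-like numbers (not taken from the abstract)
k = 2.5                   # N/m suspension stiffness
gap = 2e-6                # 2 um initial gap
area = 200e-6 * 200e-6    # 200 um x 200 um electrode
print("pull-in voltage: %.2f V" % pull_in_voltage(k, gap, area))
```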
Procedia PDF Downloads 137
3148 Complex Decision Rules in Quality Assurance Processes for Quick Service Restaurant Industry: Human Factors Determining Acceptability
Authors: Brandon Takahashi, Marielle Hanley, Gerry Hanley
Abstract:
The large-scale quick-service restaurant industry is a complex business to manage optimally. With over 40 suppliers providing different ingredients for food preparation and thousands of restaurants serving over 50 unique food offerings across a wide range of regions, the company must implement a quality assurance process. Businesses want to deliver, efficiently and reliably, quality food that the public wants to buy at a low cost. They also want to make sure that their food offerings are never unsafe to eat or of poor quality. A good reputation (and profitable business) developed over the years can be gone in an instant if customers fall ill eating your food. Poor quality also results in food waste, and the cost of corrective actions is compounded by the reduction in revenue. Product compliance evaluation assesses whether the supplier's ingredients are within compliance with the specifications of several attributes (physical, chemical, organoleptic) that a company will test to ensure that quality, safe-to-eat food is given to the consumer and will deliver the same eating experience in all parts of the country. The technical component of the evaluation includes the chemical and physical tests that produce numerical results relating to shelf-life, food safety, and organoleptic qualities. The psychological component of the evaluation covers organoleptic attributes, i.e., those acting on or involving the use of the sense organs. The rubric for product compliance evaluation has four levels: (1) Ideal: Meeting or exceeding all technical (physical and chemical), organoleptic, and psychological specifications. (2) Deviation from ideal but no impact on quality: Not meeting or exceeding some technical and organoleptic/psychological specifications without impact on consumer quality, and meeting all food safety requirements. (3) Acceptable: Not meeting or exceeding some technical and organoleptic/psychological specifications, resulting in a reduction of consumer quality but not enough to lessen demand, and meeting all food safety requirements. (4) Unacceptable: Not meeting food safety requirements, independent of meeting technical and organoleptic specifications; or meeting all food safety requirements but product quality results in consumer rejection of the food offering. Sampling of products and consumer tastings within the distribution network are a second critical element of the quality assurance process and are the data sources for the statistical analyses. Each finding is not independently assessed with the rubric. For example, the chemical data will be used to back up or support any inferences on the sensory profiles of the ingredients. Certain flavor profiles may not be as apparent when mixed with other ingredients, which leads to weighing specifications differentially in the acceptability decision. Quality assurance processes are essential to achieve that balance of quality and profitability by making sure the food is safe and tastes good while identifying and remediating product quality issues before they hit the stores. Comprehensive quality assurance procedures implement human factors methodologies, and this report provides recommendations for systemic application of quality assurance processes for quick service restaurant services. This case study will review the complex decision rubric and evaluate processes to ensure the right balance of cost, quality, and safety is achieved.
Keywords: decision making, food safety, organoleptics, product compliance, quality assurance
Procedia PDF Downloads 191
3147 Generic Early Warning Signals for Program Student Withdrawals: A Complexity Perspective Based on Critical Transitions and Fractals
Authors: Sami Houry
Abstract:
Complex systems exhibit universal characteristics as they near a tipping point. Among them are common generic early warning signals which precede critical transitions. These signals include: critical slowing down, in which the rate of recovery from perturbations decreases over time; an increase in the variance of the state variable; an increase in the skewness of the state variable; an increase in the autocorrelations of the state variable; flickering between different states; and an increase in spatial correlations over time. The presence of the signals has management implications, as the identification of the signals near the tipping point could allow management to identify intervention points. Despite the applications of the generic early warning signals in various scientific fields, such as fisheries, ecology and finance, a review of the literature did not identify any applications that address the program student withdrawal problem at undergraduate distance universities. This area could benefit from the application of generic early warning signals, as the program withdrawal rate amongst distance students is higher than the program withdrawal rate at face-to-face conventional universities. This research specifically assessed the generic early warning signals through an intensive case study of undergraduate program student withdrawal at a Canadian distance university. The university is non-cohort based due to its system of continuous course enrollment, where students can enroll in a course at the beginning of every month. The assessment of the signals was achieved through the comparison of the incidences of generic early warning signals among students who withdrew or simply became inactive in their undergraduate program of study, the true positives, to the incidences of the generic early warning signals among graduates, the false positives. This was achieved through significance testing. Research findings showed support for the signal pertaining to the rise in flickering, which is represented in the increase in the student's non-pass rates prior to withdrawing from a program; moderate support for the signals of critical slowing down, as reflected in the increase in the time a student spends in a course; and moderate support for the signals of increase in autocorrelation and increase in variance in the grade variable. The findings did not support the signal on the increase in skewness of the grade variable. The research also proposes a new signal based on the fractal-like characteristic of student behavior. The research also sought to extend knowledge by investigating whether the emergence of a program withdrawal status is self-similar or fractal-like at multiple levels of observation, specifically the program level and the course level; in other words, whether the act of withdrawal at the program level is also present at the course level. The findings moderately supported self-similarity as a potential signal. Overall, the assessment suggests that the signals, with the exception of the increase in skewness, could be utilized as a predictive management tool, with the fractal-like characteristic of withdrawal potentially added as one more signal in addressing the student program withdrawal problem.
Keywords: critical transitions, fractals, generic early warning signals, program student withdrawal
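Illustrative sketch (not part of the abstract): rolling variance, skewness and lag-1 autocorrelation, three of the generic early warning indicators discussed above, computed in Python on a synthetic grade-like series.

```python
import numpy as np
import pandas as pd

def early_warning_indicators(series, window=10):
    """Rolling generic early-warning indicators for a state variable
    (e.g. a student's course grades ordered in time): variance,
    skewness and lag-1 autocorrelation over a moving window."""
    s = pd.Series(series, dtype=float)
    return pd.DataFrame({
        "variance": s.rolling(window).var(),
        "skewness": s.rolling(window).skew(),
        "autocorr_lag1": s.rolling(window).apply(
            lambda w: w.autocorr(lag=1), raw=False),
    })

# Synthetic grade trajectory that slowly destabilises before withdrawal
rng = np.random.default_rng(0)
grades = 75 + np.cumsum(rng.normal(0, np.linspace(0.5, 4.0, 40)))
print(early_warning_indicators(grades, window=10).tail())
```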
Procedia PDF Downloads 185
3146 Dynamic Control Theory: A Behavioral Modeling Approach to Demand Forecasting amongst Office Workers Engaged in a Competition on Energy Shifting
Authors: Akaash Tawade, Manan Khattar, Lucas Spangher, Costas J. Spanos
Abstract:
Many grids are increasing the share of renewable energy in their generation mix, which is causing the energy generation to become less controllable. Buildings, which consume nearly 33% of all energy, are a key target for demand response: i.e., mechanisms for demand to meet supply. Understanding the behavior of office workers is a start towards developing demand response for one sector of building technology. The literature notes that dynamic computational modeling can be predictive of individual action, especially given that occupant behavior is traditionally abstracted from demand forecasting. Recent work founded on Social Cognitive Theory (SCT) has provided a promising conceptual basis for modeling behavior, personal states, and environment using control theoretic principles. Here, an adapted linear dynamical system of latent states and exogenous inputs is proposed to simulate energy demand amongst office workers engaged in a social energy shifting game. The energy shifting competition is implemented in an office in Singapore that is connected to a minigrid of buildings with a consistent 'price signal.' This signal is translated into a 'points signal' by a reinforcement learning (RL) algorithm to influence participant energy use. The dynamic model functions at the intersection of the points signals, baseline energy consumption trends, and SCT behavioral inputs to simulate future outcomes. This study endeavors to analyze how the dynamic model trains an RL agent and, subsequently, the degree of accuracy to which load deferability can be simulated. The results offer a generalizable behavioral model for energy competitions that provides the framework for further research on transfer learning for RL and, more broadly, transactive control.
Keywords: energy demand forecasting, social cognitive behavioral modeling, social game, transfer learning
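Illustrative sketch (not part of the abstract): a minimal Python simulation of a linear dynamical system with latent states and exogenous inputs of the kind described above; the state matrices and the points signal are hypothetical.

```python
import numpy as np

def simulate_lds(A, B, C, x0, inputs):
    """Simulate a discrete linear dynamical system
        x[t+1] = A x[t] + B u[t],   y[t] = C x[t]
    where x holds latent behavioural states and u the exogenous inputs
    (e.g. the points signal)."""
    x = np.asarray(x0, dtype=float)
    outputs = []
    for u in inputs:
        outputs.append(C @ x)
        x = A @ x + B @ np.atleast_1d(u)
    return np.array(outputs)

# Hypothetical 2-state model: habit strength and responsiveness to points
A = np.array([[0.95, 0.02],
              [0.00, 0.80]])
B = np.array([[0.00],
              [0.10]])
C = np.array([[1.0, -0.5]])        # observed demand deviation
points_signal = np.sin(np.linspace(0, 4 * np.pi, 48))   # hourly incentive signal
demand = simulate_lds(A, B, C, x0=[1.0, 0.0], inputs=points_signal)
print(demand[:5].ravel())
```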
Procedia PDF Downloads 109
3145 Characterization of Kevlar 29 for Multifunction Applications
Authors: Doaa H. Elgohary, Dina M. Hamoda, S. Yahia
Abstract:
Technical textiles refer to textile materials that are engineered and designed to have specific functionalities and performance characteristics beyond their traditional use as apparel or upholstery fabrics. These textiles are usually developed for their unique properties, such as strength, durability, flame retardancy, chemical resistance, waterproofing, insulation and other special properties. The development and use of technical textiles are constantly evolving, driven by advances in materials science, manufacturing technologies and the demand for innovative solutions in various industries. Kevlar 29 is a type of aramid fiber developed by DuPont. It is a high-performance material known for its exceptional strength and resistance to impact, abrasion, and heat. Kevlar 29 belongs to the Kevlar family, which includes different types of aramid fibers. Kevlar 29 is primarily used in applications that require strength and durability, such as ballistic protection and body armor for military and law enforcement personnel. It is also used in the aerospace and automotive industries to reinforce composite materials, as well as in various industrial applications. Two different Kevlar samples coated with copper lithium silicate (CLS) were used; ten different mechanical and physical properties (weight, thickness, tensile strength, elongation, stiffness, air permeability, puncture resistance, thermal conductivity, stiffness, and spray test) were evaluated to assess their functional performance efficiency. The influence of the different mechanical properties was statistically analyzed using an independent t-test with a significance level of P-value = 0.05. The radar plot was calculated and evaluated to determine the best-performing samples. The results of the independent t-test showed that all variables were significantly affected by yarn count except water permeability, which showed no significant effect. All properties were evaluated for samples 1 and 2, and a radar chart was used to determine the best-performing sample. The radar chart area was calculated, which shows that sample 1 recorded the best performance, followed by sample 2. The surface morphology of all samples and the coating materials was determined using a scanning electron microscope (SEM), and Fourier transform infrared spectroscopy measurements were carried out for the two samples.
Keywords: copper lithium silicate, independent t-test, kevlar, technical textiles
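Illustrative sketch (not part of the abstract): the radar-chart area comparison and the independent t-test in Python; the property scores and replicate values are hypothetical, not the study's measurements.

```python
import numpy as np
from scipy import stats

def radar_area(scores):
    """Area of a radar (spider) chart with equally spaced spokes:
    the sum of the triangles formed by consecutive normalised scores."""
    r = np.asarray(scores, dtype=float)
    theta = 2 * np.pi / len(r)
    return 0.5 * np.sin(theta) * np.sum(r * np.roll(r, -1))

# Hypothetical normalised scores for ten properties of two Kevlar samples
sample1 = np.array([0.9, 0.8, 0.85, 0.7, 0.9, 0.75, 0.8, 0.9, 0.7, 0.85])
sample2 = np.array([0.8, 0.7, 0.8, 0.65, 0.85, 0.7, 0.75, 0.8, 0.6, 0.8])
print("radar areas:", radar_area(sample1), radar_area(sample2))

# Independent t-test on replicate measurements of one property (alpha = 0.05)
tensile_s1 = [3.05, 3.10, 2.98, 3.12, 3.07]
tensile_s2 = [2.80, 2.85, 2.78, 2.90, 2.83]
t, p = stats.ttest_ind(tensile_s1, tensile_s2)
print("t = %.2f, p = %.4f -> %s" % (t, p, "significant" if p < 0.05 else "not significant"))
```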
Procedia PDF Downloads 81
3144 Design of Enhanced Adaptive Filter for Integrated Navigation System of FOG-SINS and Star Tracker
Authors: Nassim Bessaad, Qilian Bao, Zhao Jiangkang
Abstract:
The fiber optic gyroscope in the strap-down inertial navigation system (FOG-SINS) suffers from precision degradation due to the influence of random errors. In this work, an enhanced Allan variance (AV) stochastic modeling method combined with discrete wavelet transform (DWT) signal denoising is implemented to estimate the random process in the FOG signal. Furthermore, we devise a measurement-based iterative adaptive Sage-Husa nonlinear filter with augmented states to integrate a star tracker sensor with the SINS. The proposed filter adapts the measurement noise covariance matrix based on the available data. Moreover, the enhanced stochastic modeling scheme is used to tune the process noise covariance matrix and the augmented-state Gauss-Markov process parameters. Finally, the effectiveness of the proposed filter is investigated by employing data collected in laboratory conditions. The results show the filter's improved accuracy in comparison with the conventional Kalman filter (CKF).
Keywords: inertial navigation, adaptive filtering, star tracker, FOG
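Illustrative sketch (not part of the abstract): a minimal non-overlapping Allan variance computation in Python on a synthetic gyro rate signal; the paper's enhanced AV scheme and DWT denoising are not reproduced here.

```python
import numpy as np

def allan_variance(rate, fs, taus):
    """Non-overlapping Allan variance of a gyro rate signal.

    rate : 1-D array of angular-rate samples
    fs   : sampling frequency in Hz
    taus : iterable of averaging times (s) at which to evaluate AVAR
    """
    rate = np.asarray(rate, dtype=float)
    out = []
    for tau in taus:
        m = int(round(tau * fs))            # samples per cluster
        k = len(rate) // m                  # number of clusters
        if k < 2:
            out.append(np.nan)
            continue
        means = rate[: k * m].reshape(k, m).mean(axis=1)
        out.append(0.5 * np.mean(np.diff(means) ** 2))
    return np.array(out)

# Synthetic FOG-like signal: white angle-random-walk noise plus a slow drift
fs = 100.0
t = np.arange(0, 600, 1 / fs)
rate = 0.02 * np.random.default_rng(0).standard_normal(t.size) + 1e-4 * t
taus = np.logspace(-1, 2, 10)
print(np.sqrt(allan_variance(rate, fs, taus)))   # Allan deviation per tau
```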
Procedia PDF Downloads 80
3143 Investigating Income Diversification Strategies into Off-Farm Activities Among Rural Households in Ethiopia
Authors: Kibret Berhanu Getinet
Abstract:
Off-farm income diversification by rural farm households has gained the attention of researchers and policymakers because agriculture alone has failed to meet the needs of people in developing countries like Ethiopia. The objective of this study was to investigate income diversification strategies into off-farm activities among rural households in Hawassa Zuria Woreda, Sidama National Regional State, Ethiopia. The study used primary and secondary data sources; for primary data collection, a questionnaire was employed as the data collection instrument. A multistage sampling technique was used to collect data from a total of 197 sample households in four kebeles of the study area. Descriptive statistics as well as econometric methods of data analysis were employed. The descriptive statistics indicate that the majority of sample rural households (68.53%) engaged in off-farm income diversification activities, while the remaining 31.47% of households did not participate in diversification in the study area. The choice of participants among the strategies indicates that 6.60% of respondents participated in off-farm wage employment, 30.46% participated in off-farm self-employment, and about 31.47% of them participated in both off-farm wage and self-employment. The study revealed that the share of off-farm income in the total annual earnings of households was about 48.457%, and thus off-farm diversification contributes significantly to rural household income. Moreover, binary and multinomial logistic regression models were employed to identify factors that affect participation in, and the choice of, the off-farm income diversification strategies, respectively. The binary logit model result indicated that agro-ecological zone, education status of the household, available technical skills of the household, household saving, total livestock owned by the household, access to electricity, road access and the household head being married were significant and positively affected the chance of diversification into off-farm activities, while the on-farm income of households negatively affected the chance of diversification. Similarly, the multinomial logistic regression model estimates revealed that agro-ecological zone, on-farm income, available technical skills, household savings, and access to electricity are positively related and significantly influenced the household's choice of off-farm wage employment. The off-farm self-employment diversification choice is significantly influenced by on-farm income, available technical skills, household savings, total livestock owned, and access to electricity. Moreover, the results showed that the factors that affect the choice of farm households to engage in both off-farm wage and self-employment are ecological zone, education status, on-farm income, available technical skills, household own saving, market access, total livestock owned, access to electricity and road access. Thus, due attention should be given to addressing the demographic, socio-economic, and institutional constraints in order to strengthen off-farm income diversification strategies and improve the income of rural households.
Keywords: off-farm, income, diversification, logit model
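Illustrative sketch (not part of the abstract): a minimal binary logit estimation with statsmodels on synthetic household data; the variables and coefficients are hypothetical stand-ins for the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic household data standing in for the survey (n = 197 in the study)
rng = np.random.default_rng(0)
n = 197
df = pd.DataFrame({
    "education_years": rng.integers(0, 12, n),
    "on_farm_income": rng.normal(20, 6, n),          # in thousands
    "has_technical_skill": rng.integers(0, 2, n),
    "livestock_tlu": rng.normal(3, 1.2, n),
})
# Participation probability rises with education/skills, falls with farm income
logit_p = (-1.0 + 0.15 * df.education_years + 0.8 * df.has_technical_skill
           - 0.05 * df.on_farm_income + 0.1 * df.livestock_tlu)
df["participates"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df.drop(columns="participates"))
model = sm.Logit(df["participates"], X).fit(disp=0)
print(model.summary().tables[1])       # coefficients and significance
```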
Procedia PDF Downloads 55
3141 Design of Regular Communication Area for Infrared Electronic-Toll-Collection Systems
Authors: Wern-Yarng Shieh, Chao Qian, Bingnan Pei
Abstract:
A design of the communication area for infrared electronic-toll-collection systems is proposed that provides an extended communication interval in the vehicle traveling direction and a regular boundary between contiguous traffic lanes. By utilizing two typical low-cost commercial infrared LEDs with different half-intensity angles Φ1/2 = 22° and 10°, the radiation pattern of the emitter is designed to properly adjust the spatial distribution of the signal power. The aforementioned purpose can be achieved with an LED array in a three-piece structure with appropriate mounting angles. With this emitter, the influence of the mounting parameters, including the mounting height and mounting angles of the on-board unit and road-side unit, on the system performance in terms of the received signal strength and communication area is investigated. The results reveal that, for the emitter proposed in this paper, the ideal "long-and-narrow" characteristic of the communication area is very little affected by these mounting parameters. An optimum mounting configuration is also suggested.
Keywords: dedicated short-range communication (DSRC), electronic toll collection (ETC), infrared communication, intelligent transportation system (ITS), multilane free flow
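Illustrative sketch (not part of the abstract): assuming a generalized Lambertian LED model, the half-intensity angle fixes the radiation pattern as shown in this short Python example; the actual three-piece emitter design is not reproduced here.

```python
import numpy as np

def lambertian_order(half_intensity_deg):
    """Lambertian order m of an LED from its half-intensity angle:
    I(theta) = I0 * cos(theta)**m with I(phi_1/2) = I0 / 2."""
    phi = np.radians(half_intensity_deg)
    return np.log(0.5) / np.log(np.cos(phi))

def relative_intensity(theta_deg, half_intensity_deg):
    m = lambertian_order(half_intensity_deg)
    return np.cos(np.radians(theta_deg)) ** m

for phi_half in (22, 10):      # the two LED types used in the emitter design
    m = lambertian_order(phi_half)
    print("phi_1/2 = %2d deg -> m = %5.1f, I(15 deg)/I0 = %.3f"
          % (phi_half, m, relative_intensity(15, phi_half)))
```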
Procedia PDF Downloads 337
3140 An Innovative Auditory Impulsed EEG and Neural Network Based Biometric Identification System
Authors: Ritesh Kumar, Gitanjali Chhetri, Mandira Bhatia, Mohit Mishra, Abhijith Bailur, Abhinav
Abstract:
The prevalence of the internet and technology in our day-to-day lives is creating more security issues than ever. The need to protect and provide secure access to private and business data has led to the development of many security systems. One of the potential solutions is to employ a biometric authentication technique. In this paper, we present an innovative biometric authentication method that utilizes a person's EEG signal, which is acquired in response to an auditory stimulus and transferred wirelessly to a computer running the necessary ANN algorithm, a multilayer perceptron neural network, chosen for its ability to differentiate between information that is not linearly separable. In order to determine the weights of the hidden layer, we use Gaussian random weight initialization. The MLP utilizes a supervised learning technique called backpropagation for training the network. The complex algorithm used for EEG classification reduces the chances of intrusion into the protected public or private data.
Keywords: EEG signal, auditory evoked potential, biometrics, multilayer perceptron neural network, back propagation rule, Gaussian random weight initialization
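Illustrative sketch (not part of the abstract): a tiny multilayer perceptron with Gaussian random weight initialization and plain backpropagation in Python, trained on a synthetic non-linearly-separable dataset standing in for EEG-derived features.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, hidden=8, lr=0.5, epochs=2000):
    """One-hidden-layer MLP: Gaussian weight init, sigmoid units,
    backpropagation of the squared error."""
    n = X.shape[1]
    W1 = rng.normal(0, 0.5, (n, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)               # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)    # output-layer error term
        d_h = (d_out @ W2.T) * h * (1 - h)     # hidden-layer error term
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)
    return lambda Xq: sigmoid(sigmoid(Xq @ W1 + b1) @ W2 + b2)

# Synthetic, non-linearly-separable stand-in for per-subject EEG features
X = rng.normal(size=(200, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(float)      # XOR-like labelling
predict = train_mlp(X, y)
acc = ((predict(X) > 0.5).ravel() == y.astype(bool)).mean()
print("training accuracy: %.2f" % acc)
```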
Procedia PDF Downloads 413
3139 The Capacity of Mel Frequency Cepstral Coefficients for Speech Recognition
Authors: Fawaz S. Al-Anzi, Dia AbuZeina
Abstract:
Speech recognition makes an important contribution to promoting new technologies in human-computer interaction. Today, there is a growing need to employ speech technology in daily life and business activities. However, speech recognition is a challenging task that requires different stages before obtaining the desired output. Among the automatic speech recognition (ASR) components is the feature extraction process, which parameterizes the speech signal to produce the corresponding feature vectors. The feature extraction process aims at approximating the linguistic content that is conveyed by the input speech signal. In the speech processing field, there are several methods to extract speech features; however, Mel Frequency Cepstral Coefficients (MFCC) are the most popular technique. It has long been observed that MFCC is dominantly used in well-known recognizers such as the Carnegie Mellon University (CMU) Sphinx and the Hidden Markov Model Toolkit (HTK). Hence, this paper focuses on the MFCC method as the standard choice to identify the different speech segments in order to obtain the language phonemes for further training and decoding steps. Owing to MFCC's good performance, previous studies show that MFCC dominates Arabic ASR research. In this paper, we demonstrate MFCC as well as the intermediate steps that are performed to obtain these coefficients using the HTK toolkit.
Keywords: speech recognition, acoustic features, mel frequency, cepstral coefficients
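Illustrative sketch (not part of the abstract): MFCC extraction in Python using librosa (an assumption; the paper itself works with the HTK toolkit), with a synthetic tone standing in for a real speech recording.

```python
import numpy as np
import librosa

sr = 16000
# A synthetic tone stands in for a real speech recording; any mono file
# loaded with librosa.load(path, sr=16000) would work the same way.
y = librosa.tone(440, sr=sr, duration=2.0)

# 13 MFCCs per ~25 ms frame with a 10 ms hop, a typical ASR front-end setup
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13,
                            n_fft=int(0.025 * sr), hop_length=int(0.010 * sr))
# Delta and delta-delta coefficients are usually appended to capture dynamics
delta = librosa.feature.delta(mfcc)
delta2 = librosa.feature.delta(mfcc, order=2)
features = np.vstack([mfcc, delta, delta2])              # (39, n_frames)
print("feature matrix shape:", features.shape)
```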
Procedia PDF Downloads 260
3138 The Immunology Evolutionary Relationship between Signal Transducer and Activator of Transcription Genes from Three Different Shrimp Species in Response to White Spot Syndrome Virus Infection
Authors: T. C. C. Soo, S. Bhassu
Abstract:
Unlike vertebrates, which possess both innate and adaptive immunity, crustaceans, in particular shrimps, have been discovered to possess only innate immunity. This further emphasizes the importance of innate immunity within shrimps for pathogenic resistance. Under pathogenic immune challenge, different shrimp species exhibit varying degrees of immune resistance towards the same pathogen. Furthermore, even within the same shrimp species, different batches of challenged shrimps can have different strengths of immune defence. Several important pathways are activated within shrimps during pathogenic infection. One of them is the JAK-STAT pathway, which is activated during bacterial, viral and fungal infections and whose core element is the STAT (Signal Transducer and Activator of Transcription) gene. Based on the central dogma, genomic information is transmitted in the order of DNA, RNA and protein. This study is focused on uncovering the important evolutionary patterns present within the DNA (non-coding region) and RNA (coding region). The three shrimp species involved are Macrobrachium rosenbergii, Penaeus monodon and Litopenaeus vannamei, all of which possess commercial significance. The shrimp species were challenged with a well-known penaeid shrimp virus, white spot syndrome virus (WSSV), which can cause serious lethality. Tissue samples were collected at time intervals of 0h, 3h, 6h, 12h, 24h, 36h and 48h. The DNA and RNA samples were then extracted from the hepatopancreas tissue samples using conventional kits. The PCR technique, together with designed STAT gene conserved primers, was utilized for identification of the STAT coding sequences using RNA-converted cDNA samples, and subsequent characterization was carried out using various bioinformatics approaches including the Ramachandran plot, ProtParam and SWISS-MODEL. The varying levels of immune STAT gene activation for the three shrimp species during WSSV infection were confirmed using the qRT-PCR technique. For each sample, three biological replicates with three technical replicates each were used for qRT-PCR. On the other hand, the DNA samples were important for uncovering the structural variations within the genomic region of the STAT gene, which would greatly assist in understanding the STAT protein functional variations. The partially-overlapping primers technique was used for the genomic region sequencing. The evolutionary inferences and event predictions were then conducted through the Bayesian inference method using all the acquired coding and non-coding sequences. This was supplemented by the construction of conventional phylogenetic trees using the maximum likelihood method. The results showed that adaptive evolution caused STAT gene sequence mutations between different shrimp species, which led to evolutionary divergence events. Subsequently, the divergent sites were correlated with the differing expression of the STAT gene. Ultimately, this study assists in understanding innate immune variability among shrimp species and in selecting disease-resistant shrimps for breeding purposes. A deeper understanding of STAT gene evolution from the perspective of both purifying and adaptive selection can not only provide better immunological insight among shrimp species but also serve as a good reference for immunological studies in humans or other model organisms.
Keywords: gene evolution, JAK-STAT pathway, immunology, STAT gene
Procedia PDF Downloads 151
3138 Course Outcomes to Programme Outcomes Mapping: A Methodology Based on Key Elements
Authors: Twarakavi Venkata Suresh Kumar, Sailaja Kumar, B. Eswara Reddy
Abstract:
In a world of tremendous technical developments, effective and efficient higher education has always been a major challenge. The rising number of educational institutions has made healthy competition among them inevitable. To evaluate the qualitative competence of these educational institutions in engineering, technology and related disciplines, an efficient internal and external quality assessment technique has to be followed. To achieve this, the curriculum is developed into courses, and each course is presented in the form of a teaching lesson plan consisting of topics and session outcomes, known as Course Outcomes (COs), that map easily into different Programme Outcomes (POs). The major objective of these methodologies is to provide quality technical education to the students. Assigning detailed, clear weightage in the CO-PO mapping helps make the COs properly measurable, and devising the PO attainment from them is an important issue. This assists the achievement of the POs with proper weightage and also improves successive curriculum development. In this paper, we present a methodology for mapping COs to POs considering the key elements supported by each PO. This approach is useful in evaluating the attainment of POs based on the attainment of COs, using existing data from students' marks on various test items. Such direct assessment tools are used to measure the degree to which each student has achieved each course learning outcome by the completion of the course. Hence, these results are also useful in measuring PO attainment and in improving the programme vision and mission.
Keywords: attainment, course outcomes, programme outcomes, educational institutions
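Illustrative sketch (not part of the abstract): one common way to roll CO attainment up to PO attainment as a weighted average over a CO-PO mapping matrix; the attainment values and weights are hypothetical.

```python
import numpy as np

# CO attainment (0-3 scale) computed from test-item marks for one course,
# and a CO-PO mapping matrix whose entries are the correlation weights
# (3 = strong, 2 = moderate, 1 = weak, 0 = not addressed).
co_attainment = np.array([2.6, 2.1, 2.8, 1.9])              # CO1..CO4
co_po_weights = np.array([                                  # rows: COs, cols: POs
    [3, 2, 0, 1],
    [2, 3, 1, 0],
    [0, 1, 3, 2],
    [1, 0, 2, 3],
], dtype=float)

# PO attainment as the weighted average of the COs mapped to each PO
weighted = co_attainment @ co_po_weights
po_attainment = weighted / co_po_weights.sum(axis=0)
for i, val in enumerate(po_attainment, start=1):
    print("PO%d attainment: %.2f" % (i, val))
```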
Procedia PDF Downloads 468