Search results for: robust filtering
873 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and that come from a variety of sources. Public administrations use (big) data, implement base registries, and enforce data sharing across the entire government to deliver (big) data-related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors are distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of its actors and their roles. We present a graphical view of actors, roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We found few published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from several areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.
Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review
Procedia PDF Downloads 163
872 Comparison of Unit Hydrograph Models to Simulate Flood Events at the Field Scale
Authors: Imene Skhakhfa, Lahbaci Ouerdachi
Abstract:
To ensure the overall coherence of simulated results, it is necessary to develop a robust validation process. In many applications, it is no longer sufficient to calibrate and validate the model only against the hydrograph measured at the outlet; instead, we try to better simulate the functioning of the watershed in space. Model performance is therefore also assessed against other variables, such as water-level measurements at intermediate stations or groundwater levels. In this work, we limit ourselves to modeling floods of short duration, for which the process of evapotranspiration is negligible. The main parameters to identify are those of the unit hydrograph (UH) method. Three different models were tested: SNYDER, CLARK, and SCS. These models differ in their mathematical structure and in the parameters to be calibrated, while the hydrological data are the same: the initial water content and precipitation. The models are compared on the basis of their performance in terms of six objective criteria: three global criteria and three criteria representing volume, peak flow, and the mean square error. The first type of criteria gives more weight to strong events, whereas the second considers all events to be of equal weight. The results show that the calibrated parameter values are interdependent and also highlight the problems associated with simulating low-flow events and intermittent precipitation.
Keywords: model calibration, intensity, runoff, hydrograph
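As a minimal illustration of the kind of objective criteria used in such comparisons (the abstract does not spell out its exact six criteria, so RMSE, Nash-Sutcliffe efficiency, relative volume error, and relative peak-flow error stand in here as typical choices, with hypothetical data):

```python
import numpy as np

def hydrograph_criteria(q_obs, q_sim):
    """Typical goodness-of-fit criteria for observed vs. simulated hydrographs."""
    q_obs, q_sim = np.asarray(q_obs, float), np.asarray(q_sim, float)
    rmse = np.sqrt(np.mean((q_sim - q_obs) ** 2))          # mean square error criterion
    nse = 1.0 - np.sum((q_sim - q_obs) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)
    vol_err = (q_sim.sum() - q_obs.sum()) / q_obs.sum()    # relative volume error
    peak_err = (q_sim.max() - q_obs.max()) / q_obs.max()   # relative peak-flow error
    return {"RMSE": rmse, "NSE": nse, "volume_error": vol_err, "peak_error": peak_err}

# Hypothetical short flood event (discharge in m^3/s)
obs = [0.2, 1.5, 4.8, 3.1, 1.2, 0.4]
sim = [0.3, 1.2, 4.2, 3.5, 1.0, 0.5]
print(hydrograph_criteria(obs, sim))
```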
Procedia PDF Downloads 486
871 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques for human and outer-shape detection. Next, we extract valuable information in the form of cues, using two distinct features: fuzzy local binary patterns and sequence representation. We then apply a greedy randomized adaptive search procedure for data optimization and dimension reduction and use a random forest for classification. We tested our model on two benchmark datasets, AAMAZ and the KTH Multi-view Football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches, achieving recognition rates of 91% and 89.6% on the AAMAZ and KTH Multi-view Football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
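The fuzzy local binary pattern feature named above is a fuzzified variant of the classical LBP texture descriptor. A minimal sketch of the plain (non-fuzzy) LBP histogram, computed with scikit-image on a stand-in frame, illustrates the kind of cue extracted:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_image, points=8, radius=1):
    """Uniform LBP histogram as a texture feature vector for one frame."""
    codes = local_binary_pattern(gray_image, points, radius, method="uniform")
    n_bins = points + 2  # uniform patterns plus one bin for non-uniform codes
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

frame = np.random.randint(0, 256, (64, 64))  # stand-in for a detected human region
print(lbp_histogram(frame).round(3))
```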
Procedia PDF Downloads 42
870 A Joint Possibilistic-Probabilistic Tool for Load Flow Uncertainty Assessment-Part II: Case Studies
Authors: Morteza Aien, Masoud Rashidinejad, Mahmud Fotuhi-Firuzabad
Abstract:
Power systems are innately uncertain systems, and robust uncertainty assessment tools are needed to deal with them. This paper examines, through several case studies, the uncertainty assessment formulation of the load flow (LF) problem considering different kinds of uncertainties, developed in its companion paper. The proposed methodology is based on evidence theory and the joint propagation of possibilistic and probabilistic uncertainties. The load and wind power generation are treated as probabilistic uncertain variables, while the electric vehicles (EVs) and gas turbine distributed generation (DG) units are treated as possibilistic uncertain variables. The cumulative distribution function (CDF) of the system output parameters obtained by the pure probabilistic method lies within the belief and plausibility functions obtained by the joint propagation approach. Furthermore, the imprecision in the DG parameters is explicitly reflected by the gap between the belief and plausibility functions. This gap, due to the epistemic uncertainty in the DG resource parameters, grows as the penetration level increases.
Keywords: electric vehicles, joint possibilistic-probabilistic uncertainty modeling, uncertain load flow, wind turbine generator
Procedia PDF Downloads 432
869 Optimizing Pediatric Pneumonia Diagnosis with Lightweight MobileNetV2 and VAE-GAN Techniques in Chest X-Ray Analysis
Authors: Shriya Shukla, Lachin Fernando
Abstract:
Pneumonia, a leading cause of mortality in young children globally, presents significant diagnostic challenges, particularly in resource-limited settings. This study presents an approach to diagnosing pediatric pneumonia using Chest X-Ray (CXR) images, employing a lightweight MobileNetV2 model enhanced with synthetic data augmentation. Addressing the challenge of dataset scarcity and imbalance, the study used a Variational Autoencoder-Generative Adversarial Network (VAE-GAN) to generate synthetic CXR images, improving the representation of normal cases in the pediatric dataset. This approach not only addresses the issues of data imbalance and scarcity prevalent in medical imaging but also provides a more accessible and reliable diagnostic tool for early pneumonia detection. The augmented data improved the model’s accuracy and generalization, achieving an overall accuracy of 95% in pneumonia detection. These findings highlight the efficacy of the MobileNetV2 model, offering a computationally efficient yet robust solution well-suited for resource-constrained environments such as mobile health applications. This study demonstrates the potential of synthetic data augmentation in enhancing medical image analysis for critical conditions like pediatric pneumonia.
Keywords: pneumonia, MobileNetV2, image classification, GAN, VAE, deep learning
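A minimal sketch of the MobileNetV2 transfer-learning setup described above (Keras API; the input size, frozen backbone, and binary pneumonia/normal head are illustrative assumptions, and `train_ds` is a placeholder for the VAE-GAN-augmented dataset):

```python
import tensorflow as tf

# Lightweight MobileNetV2 backbone with a binary head (pneumonia vs. normal).
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone; optionally fine-tune top layers later

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds mixes real + synthetic CXRs
```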
Procedia PDF Downloads 127
868 ECG Based Reliable User Identification Using Deep Learning
Authors: R. N. Begum, Ambalika Sharma, G. K. Singh
Abstract:
Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Automatic biometric recognition systems are therefore the need of the hour, and ECG-based systems are unquestionably a strong choice due to their appealing inherent characteristics. CNNs are the recent state-of-the-art technique for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using a dense CNN learning framework. The proposed research explicitly explores the calibre of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs, trained on a dataset of recordings collected from eight popular ECG databases. With a maximum FAR of 0.04% and a maximum FRR of 5%, the best performing network achieved an identification accuracy of 99.94%. The best network was also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient; thus, they might also be implemented in real-time ECG-based human recognition systems.
Keywords: biometrics, dense networks, identification rate, train/test split ratio
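The FAR and FRR quoted above are computed from genuine and impostor match scores at a chosen decision threshold; a minimal sketch with hypothetical scores:

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    """FAR: impostors wrongly accepted; FRR: genuine users wrongly rejected."""
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    far = np.mean(impostor >= threshold)  # false acceptance rate
    frr = np.mean(genuine < threshold)    # false rejection rate
    return far, frr

# Hypothetical match scores in [0, 1]
gen = np.array([0.91, 0.88, 0.95, 0.42, 0.97])
imp = np.array([0.10, 0.35, 0.22, 0.48, 0.05])
print(far_frr(gen, imp, threshold=0.5))  # -> (0.0, 0.2)
```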
Procedia PDF Downloads 164
867 Suitability of the Sport Motivation Scale–II for Use in Jr. High Physical Education: A Confirmatory Factor Analysis
Authors: Keven A. Prusak, William F. Christensen, Zack Beddoes
Abstract:
Background: For more than a decade, the Sport Motivation Scale (SMS) has been used to measure contextual motivation across a variety of sporting and physical education (PE) settings, but not without criticism as to its tenability. Consequently, a new version of the scale (SMS-II) was created to address certain weaknesses of the original SMS. Purpose: The purpose of this study is to assess the suitability of the SMS-II in the secondary PE setting. Methods: Three hundred and twenty students (204 females and 116 males; grades 7-9) completed the 18-item, six-subscale SMS-II at the end of a required PE class. Factor means, standard deviations, and correlations were calculated and further examined via confirmatory factor analysis (CFA). Model parameters were estimated using the maximum likelihood function. Results: The results indicate that participants held generally positive perceptions toward PE as a context (more so for males than females). Reliability analysis yielded adequate alphas (rα = 0.71 to 0.87, Mα = 0.78), with the exception of the AM subscale (αAM = 0.64). Correlation analysis indicated some support for the SIMPLEX pattern, but the distal ends of the motivation continuum displayed no relationship. CFA yielded robust fit indices for the proposed structure of the SMS-II for PE. A small but significant variance across genders was noted and discussed. Conclusions: In all, the SMS-II suitably assesses PE context-specific motivational indices within Jr. High PE.
Keywords: motivation, self-determination theory, physical education, confirmatory factor analysis
Procedia PDF Downloads 333
866 An Alternative Framework of Multi-Resolution Nested Weighted Essentially Non-Oscillatory Schemes for Solving Euler Equations with Adaptive Order
Authors: Zhenming Wang, Jun Zhu, Yuchen Yang, Ning Zhao
Abstract:
In the present paper, an alternative framework is proposed to construct a class of finite difference multi-resolution nested weighted essentially non-oscillatory (WENO) schemes with increasingly higher orders of accuracy for solving the inviscid Euler equations. These WENO schemes first obtain a set of reconstruction polynomials from a hierarchy of nested central spatial stencils and then recursively achieve a higher-order approximation through the lower-order WENO schemes. The linear weights of such WENO schemes can be set to any positive numbers, with the sole requirement that they sum to one; they do not pollute the optimal order of accuracy in smooth regions and simultaneously suppress spurious oscillations near discontinuities. The numerical results indicate that these alternative finite-difference multi-resolution nested WENO schemes of different accuracies are very robust, with low dissipation, and use as few reconstruction stencils as possible while maintaining the same efficiency, achieving the high-resolution property without any equivalent multi-resolution representation. Besides, the finite volume form is easier to implement on unstructured grids.
Keywords: finite-difference, WENO schemes, high order, inviscid Euler equations, multi-resolution
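For reference, the normalized nonlinear weights of a classical WENO reconstruction take the form

$$\omega_k=\frac{\alpha_k}{\sum_j \alpha_j},\qquad \alpha_k=\frac{d_k}{(\epsilon+\beta_k)^2},\qquad d_k>0,\quad \sum_k d_k=1,$$

where the $d_k$ are the linear weights (which, as the abstract notes, can here be any positive numbers summing to one), the $\beta_k$ are smoothness indicators of the candidate stencils, and $\epsilon$ (e.g., $10^{-6}$) prevents division by zero. This is the classical WENO-JS form; the multi-resolution nested schemes of the paper recombine lower-order WENO reconstructions recursively using weights of this type.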
Procedia PDF Downloads 146
865 Asset Pricing Model: A Quality Paradigm
Authors: Urmi Khatri
Abstract:
The capital asset pricing model (CAPM) draws a direct relationship between risk and the expected rate of return. Beta and the assumptions of the CAPM have been criticized as not applicable in the real world. The Fama-French three-factor and five-factor models introduced additional factors that affect the return of an asset, such as size, value, investment, and profitability. This study proposes to view the capital asset pricing model through the lens of quality. Six factors are studied: the Fama-French five factors plus an added quality dimension, where Graham's seven quality and quantity criteria are measured to determine the score of the sample firms. The study thus checks the model fit: the beta coefficient of the quality dimension and the R-squared value are examined to determine the validity of the proposed model. The sample is drawn from nonfinancial firms listed on the Bombay Stock Exchange (BSE) in India, over the period from January 1999 to December 2019. Hence, the primary objective of the study is to check how robust the model becomes after adding the quality dimension to the capital asset pricing model in addition to size, value, profitability, and investment.
Keywords: asset pricing model, CAPM, Graham's score, G-score, multifactor model, quality
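The proposed model amounts to augmenting the Fama-French five-factor regression with a quality factor; in a generic form (the exact construction of the quality factor from Graham's criteria is the paper's own, so the factor name below is illustrative):

$$R_{i,t}-R_{f,t}=\alpha_i+\beta_i\,(R_{m,t}-R_{f,t})+s_i\,\mathrm{SMB}_t+h_i\,\mathrm{HML}_t+r_i\,\mathrm{RMW}_t+c_i\,\mathrm{CMA}_t+q_i\,\mathrm{QUAL}_t+\varepsilon_{i,t},$$

where $\mathrm{QUAL}_t$ denotes the Graham-score-based quality factor and $q_i$ is the quality beta whose significance, together with the $R^2$, indicates the model fit.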
Procedia PDF Downloads 160
864 Multi-Tooled Robotic Hand for Tele-Operation of Explosive Devices
Authors: Faik Derya Ince, Ugur Topgul, Alp Gunay, Can Bayoglu, Dante J. Dorantes-Gonzalez
Abstract:
Explosive attacks are arguably the most lethal threat that may occur in terrorist attacks. To counteract this issue, explosive ordnance disposal operators put their lives on the line to dispose of possible improvised explosive devices. Robots can make the disposal process more accurate while saving human lives. For this purpose, there is a demand for more accurate and dexterous manipulating robotic hands that can be teleoperated from a distance. The aim of this project is to design a robotic hand that contains two active and two passive DOF for each finger, as well as a minimum set of tools for mechanical cutting and screw driving within the same robotic hand. Both the hand and the toolset are teleoperated from a distance via a haptic robotic glove in order to manipulate dangerous objects such as improvised explosive devices. SolidWorks® computer-aided design, computerized dynamic simulation, and MATLAB® kinematic and static analyses were used for the design of the robotic hand and toolset. Novel, dexterous, and robust solutions for the fingers were obtained, and six servo motors are used in total to remotely control the multi-tooled robotic hand. This project is still ongoing; current results and future research steps are presented.
Keywords: explosive manipulation, robotic hand, tele-operation, tool integration
Procedia PDF Downloads 143
863 Conformity and Differentiation in CSR Practices on Capital Market Performance: Empirical Evidence from Stock Liquidity and Price Crash Risk
Authors: Jie Zhang, Chaomin Zhang, Jihua Zhang, Haitong Li
Abstract:
Using the theory of optimal distinctiveness, this study examines the effects of conformity and differentiation within corporate social responsibility (CSR) practices on capital market performance. Analysing data from Chinese A-share listed firms from 2007 to 2022, this paper demonstrates that when firms conform to the expected scope of CSR, such behaviour enhances investor attention and market acceptance, thereby boosting stock liquidity. Conversely, emphasising differentiation in CSR practices more effectively mitigates stock price crash risk by addressing principal–agent problems and decreasing information asymmetry. This paper also investigates how organisational and environmental factors moderate the relationship between conformity and differentiation in CSR practices and their impact on capital market performance. The results also show that the influence of conformity on stock liquidity is accentuated in smaller firms and environments with stringent legal oversight. By contrast, the benefits of differentiation in reducing stock price crash risk are amplified in firms with robust corporate governance and markets characterised by high uncertainty.
Keywords: corporate social responsibility, social responsibility practices, capital market performance, optimal distinctiveness
Procedia PDF Downloads 23
862 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices
Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner
Abstract:
Biometric tools such as fingerprint and iris recognition are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several worries about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of higher quality and less noisy than those extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device, i.e., a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes a selected set of individuals from non-selected individuals (others). In preliminary results, the binary classifier obtained 90% accuracy on balanced data, and the multi-class approach reported a log loss of 0.05.
Keywords: biometrics, electrocardiographic, machine learning, signals processing
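A minimal sketch contrasting the two approaches (the paper does not name its classifier, so a random forest stands in, and the features and subject IDs below are synthetic placeholders for ECG-derived features):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, log_loss
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row of ECG-derived features per heartbeat.
X = np.random.randn(1000, 32)
subject_ids = np.random.randint(0, 10, 1000)

# Approach 1: per-person binary classifier ("is this person k or not?")
y_binary = (subject_ids == 3).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y_binary, stratify=y_binary, random_state=0)
clf_bin = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
print("binary accuracy:", accuracy_score(yte, clf_bin.predict(Xte)))

# Approach 2: one-for-all multi-class classifier over the enrolled set
Xtr, Xte, ytr, yte = train_test_split(X, subject_ids, stratify=subject_ids, random_state=0)
clf_multi = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
print("multi-class log loss:", log_loss(yte, clf_multi.predict_proba(Xte)))
```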
Procedia PDF Downloads 142
861 Diagrid Structural System
Authors: K. Raghu, Sree Harsha
Abstract:
The interrelationship between the technology and architecture of tall buildings is investigated from the emergence of tall buildings in the late 19th century to the present. Early designs of tall buildings in the late 19th century recognized the effectiveness of diagonal bracing members in resisting lateral forces; most of the structural systems deployed for early tall buildings were steel frames with diagonal bracings of various configurations such as X, K, and eccentric. Through this historical research, a filtering concept of original and remedial technology is developed, through which one can clearly understand the interrelationship between technical evolution, architectural aesthetics, and the stylistic transition of buildings. Diagonalized grid structures - "diagrids" - have emerged as one of the most innovative and adaptable approaches to structuring buildings in this millennium. Variations of the diagrid system have evolved to the point of making its use non-exclusive to the tall building; diagrid construction is also found in a range of innovative mid-rise steel projects. Contemporary design practice for tall buildings is reviewed, and design guidelines are provided for new design trends. The behavioral characteristics and design methodology of diagrid structures, which have emerged as a new direction in the design of tall buildings with their powerful structural rationale and symbolic architectural expression, are investigated in depth. Moreover, new technologies for tall building structures and facades are developed for performance enhancement through design integration, and their architectural potential is explored. On this basis, the analysis and design of 40-100 storey diagrid steel buildings is carried out using E-TABS software, with diagrids of various angles considered for the entire building, which helps reduce the steel requirement of the structure. The project undertakes wind and seismic analyses for the lateral loads acting on the structure due to wind and earthquakes, together with gravity loads. All structural members are designed as per IS 800-2007, considering all load combinations. Results are compared in terms of time period, top-storey displacement, and inter-storey drift. Secondary effects such as temperature variations are not considered in the design, assuming small variations.
Keywords: diagrid, bracings, structural, building
Procedia PDF Downloads 387
860 Transformative Digital Trends in Supply Chain Management: The Role of Artificial Intelligence
Authors: Srinivas Vangari
Abstract:
With technological advancements around the globe, artificial intelligence (AI) has boosted supply chain management (SCM) by improving efficiency, sensitivity, and promptness. AI-based SCM provides comprehensive perceptions of consumer behavior in dynamic market situations and trends, accurately forecasting demand. It reduces overproduction and stockouts while optimizing production planning and streamlining operations. Consequently, AI-driven SCM produces a customer-centric supply with resilient and robust operations. Intending to delve into the transformative significance of AI in SCM, this study focuses on improving efficiency in SCM with the integration of AI, understanding production demand, accurate forecasting, and specific production planning. The study employs a mixed-method approach and expert survey insights to explore the challenges and benefits of AI applications in SCM. Further, a case analysis is incorporated to identify best practices and potential challenges, along with the critical success features of AI-driven SCM. Key findings of the study indicate the significant advantages of AI-integrated SCM, including optimized inventory management, improved transportation and logistics management, cost optimization, and advanced decision-making, positioning AI as a pivotal force in the future of supply chain management.
Keywords: artificial intelligence, supply chain management, accurate forecast, accurate planning of production, understanding demand
Procedia PDF Downloads 23
859 Development of Prediction Models of Day-Ahead Hourly Building Electricity Consumption and Peak Power Demand Using the Machine Learning Method
Authors: Dalin Si, Azizan Aziz, Bertrand Lasternas
Abstract:
To encourage building owners to purchase electricity on the wholesale market and to reduce building peak demand, this study aims to develop models that predict day-ahead hourly electricity consumption and demand using an artificial neural network (ANN) and a support vector machine (SVM). All prediction models are built in Python with the Scikit-learn and PyBrain tools. The input data for both consumption and demand prediction are the time stamp, outdoor dry-bulb temperature, relative humidity, air handling unit (AHU) supply air temperature, and solar radiation. Solar radiation, which is unavailable a day ahead, is predicted first, and this estimate is then used as an input to predict consumption and demand. Models to predict consumption and demand are trained with both SVM and ANN and depend on cooling or heating and on weekdays or weekends. The results show that the ANN is the better option for both consumption and demand prediction. It achieves a coefficient of variation of the root mean square error (CVRMSE) of 15.50% to 20.03% for consumption prediction and of 22.89% to 32.42% for demand prediction, respectively. To conclude, the presented models have the potential to help building owners purchase electricity on the wholesale market, but they are not robust when used in demand response control.
Keywords: building energy prediction, data mining, demand response, electricity market
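The CVRMSE metric quoted above normalizes the RMSE by the mean of the measured series; a minimal sketch with hypothetical hourly energy values:

```python
import numpy as np

def cvrmse(y_true, y_pred):
    """Coefficient of variation of the RMSE, in percent (ASHRAE-style metric)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
    return 100.0 * rmse / y_true.mean()

actual_kwh = [120.0, 135.0, 150.0, 142.0, 128.0]     # hypothetical measurements
predicted_kwh = [118.0, 140.0, 145.0, 150.0, 125.0]  # hypothetical model output
print(f"CVRMSE = {cvrmse(actual_kwh, predicted_kwh):.2f}%")
```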
Procedia PDF Downloads 317
858 Superordinated Control for Increasing Feed-in Capacity and Improving Power Quality in Low Voltage Distribution Grids
Authors: Markus Meyer, Bastian Maucher, Rolf Witzmann
Abstract:
The ever-increasing amount of distributed generation in low voltage distribution grids (mainly PV and micro-CHP) can lead to reverse load flows from the low to the medium/high voltage levels at times of high feed-in. Reverse load flow leads to rising voltages that may even exceed the limits specified in the grid codes. Furthermore, the share of electrical loads connected to low voltage distribution grids via switched power supplies continuously increases. In combination with inverter-based feed-in, this results in high harmonic levels, reducing overall power quality. Especially high levels of third-order harmonic currents can lead to neutral conductor overload, which is even more critical if lines with reduced neutral conductor cross-sections are used. This paper illustrates a possible concept for smart grids to increase the feed-in capacity, improve power quality, and ensure safe operation of low voltage distribution grids at all times. The key feature of the concept is a hierarchically structured control strategy run on a superordinated controller, which is connected to several distributed grid analyzers and inverters via broadband powerline (BPL). The strategy is devised to ensure both quick response times and the technically and economically reasonable use of the available inverters in the grid (PV inverters, batteries, stepless line voltage regulators). These inverters are provided with standard features for voltage control, e.g., voltage-dependent reactive power control, and can additionally receive reactive power set points transmitted by the superordinated controller. To further improve power quality, the inverters are capable of active harmonic filtering as well as voltage balancing, whereas the latter is primarily done by the stepless line voltage regulators. By additionally connecting the superordinated controller to the control center of the grid operator, supervisory control and data acquisition capabilities for the low voltage distribution grid are enabled, which allows easy monitoring and manual input. Such a low voltage distribution grid can also be used as a virtual power plant.
Keywords: distributed generation, distribution grid, power quality, smart grid, virtual power plant, voltage control
Procedia PDF Downloads 267
857 Removal of Heavy Metal, Dye and Salinity from Industrial Wastewaters by Banana Rachis Cellulose Micro Crystal-Clay Composite
Authors: Mohd Maniruzzaman, Md. Monjurul Alam, Md. Hafezur Rahaman, Anika Amir Mohona
Abstract:
The consumption of water by various industries is increasing day by day, and the volume of wastewater from them is increasing as well. These wastewaters contain various kinds of color, dissolved solids, toxic heavy metals, residual chlorine, and other non-degradable organic materials. If released directly into the environment, they are hazardous to the environment and to human health, so it is necessary to treat them before discharge. In this research, we demonstrate the successful processing and utilization of a fully bio-based cellulose micro crystal (CMC) composite for the removal of heavy metals, dyes, and salinity from industrial wastewaters. Banana rachis micro-cellulose was prepared by acid hydrolysis (H₂SO₄) of banana (Musa acuminata L.) rachis fiber, and raw clay from Bijoypur (in the north-east of Bangladesh) was treated with the organic solvent triethylamine. Composites were prepared with varying compositions of banana rachis cellulose and modified Bijoypur clay. After the successful characterization of the cellulose micro crystal (CMC) and the modified clay, the targeted filter was fabricated with different compositions of cellulose micro crystal and clay in a locally fabricated packing column with a composite-fraction thickness of 7.5 cm. Wastewater was collected from local small textile industries, containing basic yellow 2 as dye and lead (II) nitrate [Pb(NO₃)₂] and chromium (III) nitrate [Cr(NO₃)₃] as heavy metals, and saline water was collected from Khulna, to test the efficiency of the banana rachis cellulose micro crystal-clay composite in removing the above impurities. The filtering efficiency was characterized by Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), thermogravimetric analysis (TGA), atomic absorption spectrometry (AAS), and scanning electron microscopy (SEM) analyses. All characterization data show very promising results for the industrial application of the fabricated filter.
Keywords: banana rachis, bio-based filter, cellulose micro crystal-clay composite, wastewaters, synthetic dyes, heavy metal, water salinity
Procedia PDF Downloads 129
856 A PHREEQC Reactive Transport Simulation for Simply Determining Scaling during Desalination
Authors: Andrew Freiburger, Sergi Molins
Abstract:
Freshwater is a vital resource; yet the supply of clean freshwater is diminishing as a consequence of melting snow and ice from global warming, pollution from industry, and increasing demand from human population growth. This unsustainable trajectory of diminishing water resources is projected to jeopardize water security for billions of people in the 21st century. Membrane desalination technologies may resolve the growing discrepancy between supply and demand by filtering arbitrary feed water into a fraction of renewable, clean water and a fraction of highly concentrated brine. The leading hindrance of membrane desalination is fouling, whereby the highly concentrated brine solution encourages micro-organismal colonization and/or the precipitation of occlusive minerals (i.e., scale) upon the membrane surface. An understanding of brine formation is thus necessary to mitigate membrane fouling and to develop efficacious desalination technologies that can bolster the supply of available freshwater. This study presents a reactive transport simulation of brine formation and scale deposition during reverse osmosis (RO) desalination. The simulation conceptually represents the RO module as a one-dimensional domain, where feed water enters the domain with a prescribed fluid velocity and is iteratively concentrated in the immobile layer of a dual-porosity model. The geochemical code PHREEQC numerically evaluated the conceptual model with parameters for the BW30-400 RO module and for real feed water sources - e.g., the Red and Mediterranean seas, and produced waters from American oil wells - based upon peer-reviewed data. The presented simulation is computationally simpler, and hence less resource-intensive, than existing and more rigorous simulations of desalination phenomena, like TOUGHREACT. The end-user may readily prepare input files and execute simulations on a personal computer with open-source software. The graphical results of fouling potential and brine characteristics may therefore be particularly useful as an initial tool for screening candidate feed water sources and/or informing the selection of an RO module.
Keywords: desalination, PHREEQC, reactive transport, scaling
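A conceptual sketch of the underlying idea only (PHREEQC performs the full aqueous speciation; here a seawater-like feed is simply concentrated by the recovery ratio, and gypsum scaling is flagged via a saturation index with assumed, fixed activity coefficients and an approximate Ksp):

```python
import numpy as np

LOG_KSP_GYPSUM = -4.58            # approximate Ksp of CaSO4·2H2O at 25 °C
GAMMA_CA, GAMMA_SO4 = 0.20, 0.12  # rough seawater activity coefficients, held fixed

def gypsum_si(ca_molal, so4_molal):
    """Saturation index SI = log10(IAP/Ksp); SI > 0 indicates scaling risk."""
    iap = (GAMMA_CA * ca_molal) * (GAMMA_SO4 * so4_molal)
    return np.log10(iap) - LOG_KSP_GYPSUM

ca0, so40 = 0.0103, 0.0282        # mol/kg, roughly seawater
for recovery in (0.3, 0.5, 0.7, 0.9):
    cf = 1.0 / (1.0 - recovery)   # concentration factor of the reject brine
    print(f"recovery {recovery:.0%}: CF={cf:.1f}, "
          f"gypsum SI={gypsum_si(ca0 * cf, so40 * cf):+.2f}")
```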
Procedia PDF Downloads 136
855 Study of Launch Recovery Control Dynamics of Retro Propulsive Reusable Rockets
Authors: Pratyush Agnihotri
Abstract:
Space missions are very costly because transportation to space is highly expensive, and there is therefore a need to achieve complete reusability of our launch vehicles to make missions far more economic through the recovery of hardware. Launcher reusability is the most effective approach to reducing the cost of access to space, yet it remains a great technical hurdle for the aerospace industry. A major part of the difficulty lies in the guidance and control procedures and algorithms, specifically those of the controlled landing phase, which must enable a precise landing with low fuel margins. Although state-of-the-art approaches for navigation and control exist, viz. hybrid navigation and robust control, the powered descent and landing of the first stage of a launch vehicle requires guidance and control that enable onboard optimization. First, a CAD model of the launch vehicle, i.e., the SpaceX Falcon 9 rocket, is presented for a better understanding of the architecture that must be identified for the guidance and control solution for recovery of the launcher. The focus is on providing the landing-phase guidance scheme for the recovery and reusability of the first stage using retro propulsion. After reviewing various GNC solutions, online convex and successive optimization are explored as guidance schemes to achieve the required landing accuracy.
Keywords: guidance, navigation, control, retro propulsion, reusable rockets
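The convex-optimization guidance mentioned above typically targets the classic fuel-optimal soft-landing problem; a generic formulation (in the spirit of lossless convexification, not necessarily the paper's exact statement) is

$$\min_{T_c(\cdot)}\ \int_0^{t_f}\lVert T_c(t)\rVert\,dt
\quad\text{s.t.}\quad
\ddot{\mathbf r}=\mathbf g+\frac{T_c}{m},\qquad
\dot m=-\frac{\lVert T_c\rVert}{I_{sp}\,g_0},\qquad
0<\rho_1\le\lVert T_c\rVert\le\rho_2,\qquad
\mathbf r(t_f)=\dot{\mathbf r}(t_f)=\mathbf 0,$$

where $\mathbf r$ is the position relative to the landing site, $T_c$ the commanded thrust, $m$ the vehicle mass, and $\rho_1,\rho_2$ the throttle bounds; the non-convex lower thrust bound is what the convexification and successive-optimization techniques are designed to handle onboard.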
Procedia PDF Downloads 93
854 The Osteocutaneous Distal Tibia Turn-over Fillet Flap: A Novel Spare-parts Orthoplastic Surgery Option for Functional Below-knee Amputation
Authors: Harry Burton, Alexios Dimitrios Iliadis, Neil Jones, Aaron Saini, Nicola Bystrzonowski, Alexandros Vris, Georgios Pafitanis
Abstract:
This article portrays the authors' experience with a complex lower limb bone and soft tissue defect, following chronic osteomyelitis and pathological fracture, which was managed by the multidisciplinary orthoplastic team. The decision between functional amputation and limb salvage was deemed necessary, guided by the principles of "spare parts" reconstructive microsurgery. This case describes the successful use of the osteocutaneous distal tibia turn-over fillet flap, which allowed lowering the level of amputation from a through-knee to the conventional level of a below-knee amputation, preserving knee joint function. This case demonstrates the value of spare-parts surgery principles and how these concepts refine complex orthoplastic approaches to enhance function when limb salvage is not possible. The osteocutaneous distal tibia turn-over fillet flap is a robust technique for modified BKA reconstructions that provides sufficient bone length to achieve a tough, sensate stump and a functional knee joint.
Keywords: osteocutaneous flap, fillet flap, spare-parts surgery, below-knee amputation
Procedia PDF Downloads 168
853 The Application of Insects in Forensic Investigations
Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani
Abstract:
Forensic entomology is the science of studying and analyzing insect evidence to aid criminal investigation. Awareness of the distribution, biology, ecology, and behavior of the insects found at a crime scene can provide information about when, where, and how the crime was committed, and it has many applications in criminal investigations. Its main use is the estimation of the minimum time since death in suspicious deaths. The close association between insects and corpses, and the use of insects in criminal investigations, is the subject of forensic entomology: insects colonize a decomposing corpse and lay eggs on it from the initial stages, so forensic scientists can estimate the post-mortem interval by studying the insect populations and the developing larval stages. In addition, toxicological and molecular studies of these insects can reveal the cause of death or even the identity of a victim. Entomology can also be used to detect drugs and poisons and to determine the location of an incident. Recent techniques make it possible for experts to gather robust entomological evidence, which can provide vital information about death, corpse movement or burial, the submersion interval, the time of decapitation, the identification of specific sites of trauma, post-mortem artefacts on the body, the use of drugs, linking a suspect to the scene of a crime, sexual molestation, and the identification of suspects.
Keywords: forensic entomology, post-mortem interval, insects, larvae
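One standard way the minimum post-mortem interval is estimated from larval development is the accumulated degree hours (ADH) method; a minimal sketch, with the developmental threshold and ADH requirement as hypothetical species-specific values taken from rearing data:

```python
# Accumulated degree hours: sum thermal energy above a species' developmental
# threshold, then compare against the thermal budget the recovered stage needs.
BASE_TEMP_C = 10.0  # assumed developmental threshold for the species

def accumulated_degree_hours(hourly_temps_c, base=BASE_TEMP_C):
    """Thermal energy accumulated above the developmental threshold."""
    return sum(max(t - base, 0.0) for t in hourly_temps_c)

required_adh = 2400.0        # hypothetical requirement for the recovered larval stage
scene_temps = [18.0] * 240   # hypothetical 10 days of hourly scene temperatures
adh = accumulated_degree_hours(scene_temps)
print(f"ADH over record: {adh:.0f} (requirement {required_adh:.0f})")
# At 18 °C the corpse accrues 8 degree-hours per hour, so 2400 ADH
# corresponds to about 300 h, i.e. a minimum PMI of roughly 12.5 days.
```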
Procedia PDF Downloads 503
852 Social Entrepreneurship and Inclusive Growth
Authors: Sudheer Gupta
Abstract:
Approximately 4 billion citizens of the world live on the equivalent of less than $8 a day. This segment constitutes a $5 trillion global market that remains under-served. Multinational corporations have historically tended to focus their innovation efforts on the upper segments of the economic pyramid. The academic literature has also been dominated by theories and frameworks of innovation that are valid when applied to the developed markets and consumer segments, but fail to adequately account for the challenges and realities of new product and service creation for the poor. Theories of entrepreneurship developed in the context of developed markets similarly ignore the challenges and realities of operating in developing economies that can be characterized by missing institutions, missing markets, information and infrastructural challenges, and resource constraints. Social entrepreneurs working in such contexts develop solutions differently. In this talk, we summarize lessons learnt from a long-term research project that involves data collection from a broad range of social entrepreneurs in developing countries working towards solutions to alleviate poverty, and grounded theory-building efforts. We aim to develop a better understanding of consumers, producers, and other stakeholder involvement, thus laying the foundation to build a robust theory of innovation and entrepreneurship for the poor.
Keywords: poverty alleviation, social enterprise, social innovation, development
Procedia PDF Downloads 401
851 Brain Tumor Detection and Classification Using Pre-Trained Deep Learning Models
Authors: Aditya Karade, Sharada Falane, Dhananjay Deshmukh, Vijaykumar Mantri
Abstract:
Brain tumours pose a significant challenge in healthcare due to their complex nature and impact on patient outcomes. The application of deep learning (DL) algorithms in medical imaging has shown promise for accurate and efficient brain tumour detection. This paper explores the performance of various pre-trained DL models - ResNet50, Xception, InceptionV3, EfficientNetB0, DenseNet121, NASNetMobile, VGG19, VGG16, and MobileNet - on a brain tumour dataset sourced from Figshare. The dataset consists of MRI scans categorizing different types of brain tumours, including meningioma, pituitary, glioma, and no tumour. The study involves a comprehensive evaluation of these models' accuracy and effectiveness in classifying brain tumour images. Data preprocessing, augmentation, and fine-tuning techniques are employed to optimize model performance. Among the evaluated deep learning models, ResNet50 emerges as the top performer with an accuracy of 98.86%, followed closely by Xception with a strong accuracy of 97.33%; these models showcase robust capabilities in accurately classifying brain tumour images. At the other end of the spectrum, VGG16 trails with the lowest accuracy, at 89.02%.
Keywords: brain tumour, MRI image, detecting and classifying tumour, pre-trained models, transfer learning, image segmentation, data augmentation
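A minimal sketch of the backbone-comparison setup (Keras API; only three of the nine backbones are shown, and the input size, frozen-backbone choice, and `train_ds` placeholder are illustrative assumptions):

```python
import tensorflow as tf

# Swap one ImageNet backbone for another and attach a 4-way head
# (meningioma, pituitary, glioma, no tumour).
BACKBONES = {
    "ResNet50": tf.keras.applications.ResNet50,
    "Xception": tf.keras.applications.Xception,
    "MobileNet": tf.keras.applications.MobileNet,
}

def build_classifier(name, input_shape=(224, 224, 3), n_classes=4):
    base = BACKBONES[name](weights="imagenet", include_top=False,
                           input_shape=input_shape)
    base.trainable = False  # freeze, then unfreeze top layers for fine-tuning
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_classifier("ResNet50")
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```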
Procedia PDF Downloads 74
850 Wearable Interface for Telepresence in Robotics
Authors: Uriel Martinez-Hernandez, Luke W. Boorman, Hamideh Kerdegari, Tony J. Prescott
Abstract:
In this paper, we present an architecture for the study of telepresence, immersion, and human-robot interaction. The architecture is built around a wearable interface, developed here, that provides the human with visual, audio, and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, such as vision with gaze control and tactile feedback; this allows for a straightforward integration of multiple sensory modalities and also offers a more complete immersion experience for the human. These systems are integrated, controlled, and synchronised by an architecture developed for telepresence and human-robot interaction. Our wearable interface allows human participants to observe and explore a remote location while also communicating verbally with humans located in the remote environment. Our approach has been tested from local, domestic, and business venues, using wired, wireless, and Internet-based connections. This has involved the implementation of data compression to maintain data quality and improve the immersion experience. Initial testing has shown the wearable interface to be robust. The system will endow humans with the ability to explore and interact with other humans at remote locations using multiple sensing modalities.
Keywords: telepresence, telerobotics, human-robot interaction, virtual reality
Procedia PDF Downloads 290
849 Robust Quantum Image Encryption Algorithm Leveraging 3D-BNM Chaotic Maps and Controlled Qubit-Level Operations
Authors: Vivek Verma, Sanjeev Kumar
Abstract:
This study presents a novel quantum image encryption algorithm, using a 3D chaotic map and controlled qubit-level scrambling operations. The newly proposed 3D-BNM chaotic map effectively reduces the degradation of chaotic dynamics resulting from the finite word length effect. It facilitates the generation of highly unpredictable random sequences and enhances chaotic performance. The system’s efficacy is additionally enhanced by the inclusion of a SHA-256 hash function. Initially, classical plain images are converted into their quantum equivalents using the Novel Enhanced Quantum Representation (NEQR) model. The Generalized Quantum Arnold Transformation (GQAT) is then applied to disrupt the coordinate information of the quantum image. Subsequently, to diffuse the pixel values of the scrambled image, XOR operations are performed using pseudorandom sequences generated by the 3D-BNM chaotic map. Furthermore, to enhance the randomness and reduce the correlation among the pixels in the resulting cipher image, a controlled qubit-level scrambling operation is employed. The encryption process utilizes fundamental quantum gates such as C-NOT and CCNOT. Both theoretical and numerical simulations validate the effectiveness of the proposed algorithm against various statistical and differential attacks. Moreover, the proposed encryption algorithm operates with low computational complexity.
Keywords: 3D chaotic map, SHA-256, quantum image encryption, qubit-level scrambling, NEQR
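A classical emulation of the diffusion stage sketched above: on NEQR-encoded images, CNOT gates controlled by a key sequence amount to a bitwise XOR of pixel values with a chaotic keystream. The 3D-BNM map is the paper's own construction and is not reproduced here; a generic 3D coupled logistic map stands in for it, seeded via SHA-256 as the abstract describes:

```python
import hashlib
import numpy as np

def chaotic_keystream(key: bytes, n: int) -> np.ndarray:
    """Byte keystream from a stand-in 3D chaotic map seeded by SHA-256(key)."""
    h = hashlib.sha256(key).digest()  # SHA-256 seeds the initial state
    x, y, z = h[0] / 255.0 + 0.001, h[1] / 255.0 + 0.002, h[2] / 255.0 + 0.003
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        # Three weakly coupled logistic maps (illustrative, NOT the 3D-BNM map)
        x = (3.99 * x * (1 - x) + 1e-3 * y) % 1.0
        y = (3.98 * y * (1 - y) + 1e-3 * z) % 1.0
        z = (3.97 * z * (1 - z) + 1e-3 * x) % 1.0
        out[i] = int((x + y + z) * 1e6) % 256
    return out

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)  # stand-in plain image
ks = chaotic_keystream(b"secret-key", img.size).reshape(img.shape)
cipher = img ^ ks                         # diffusion step
assert np.array_equal(cipher ^ ks, img)   # XOR is self-inverse: decryption works
```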
Procedia PDF Downloads 14
848 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data, and feature extraction is then adopted to abstract the main cues from the data; the SelectKBest strategy is used to select the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to accomplish precise locomotion classification and evaluate the performance of the system. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
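A minimal sketch of the windowing, feature extraction, SelectKBest, and classification stages (the window length, statistical features, and synthetic IMU stream are illustrative assumptions, not the paper's exact configuration):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

def window_features(signal, fs=50, win_s=2.0):
    """Slice a 3-axis IMU stream into windows and extract simple statistical cues."""
    step = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - step + 1, step):
        w = signal[start:start + step]
        feats.append(np.hstack([w.mean(0), w.std(0), w.min(0), w.max(0)]))
    return np.array(feats)

acc = np.random.randn(6000, 3)        # stand-in accelerometer stream
X = window_features(acc)
y = np.random.randint(0, 4, len(X))   # stand-in activity labels

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=8)),  # keep the k most informative cues
    ("clf", GradientBoostingClassifier()),
])
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```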
Procedia PDF Downloads 13
847 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data
Authors: Ruchika Malhotra, Megha Khanna
Abstract:
The development of change prediction models can help software practitioners plan testing and inspection resources in the early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. A dataset with very few minority outcome categories leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change prone whereas a majority of classes may be non-change prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics while dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics
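A minimal sketch of the resampling-plus-robust-metrics idea (SMOTE from imbalanced-learn stands in for the paper's sampling methods, a random forest for its learners, and the dataset is synthetic):

```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced "change-proneness" data: ~10% of classes are change prone.
X, y = make_classification(n_samples=1000, weights=[0.9], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# Resample only the training split, then evaluate with imbalance-aware metrics.
Xres, yres = SMOTE(random_state=0).fit_resample(Xtr, ytr)
clf = RandomForestClassifier(random_state=0).fit(Xres, yres)
proba = clf.predict_proba(Xte)[:, 1]
print("AUC:", roc_auc_score(yte, proba))
print("balanced accuracy:", balanced_accuracy_score(yte, clf.predict(Xte)))
```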
Procedia PDF Downloads 418
846 Facile Synthesis and Structure Characterization of Europium (III) Tungstate Nanoparticles
Authors: Mehdi Rahimi-Nasrabadi, Seied Mahdi Pourmortazavi
Abstract:
Taguchi robust design, a statistical method, was applied to optimize the process parameters for a tunable, simple, and fast synthesis of europium (III) tungstate nanoparticles. Europium (III) tungstate nanoparticles were synthesized by a chemical precipitation reaction involving the direct addition of an aqueous europium ion solution to the tungstate reagent dissolved in aqueous media. The effects of several synthesis variables, i.e., the europium and tungstate concentrations, the flow rate of cation reagent addition, and the temperature of the reaction reactor, on the particle size of europium (III) tungstate nanoparticles were studied experimentally in order to tune the particle size. Analysis of variance shows the importance of controlling the tungstate concentration, the cation feeding flow rate, and the temperature in the preparation of europium (III) tungstate nanoparticles by the proposed chemical precipitation reaction. Finally, europium (III) tungstate nanoparticles were synthesized at the optimum conditions of the proposed method, and the morphology and chemical composition of the prepared nano-material were characterized by means of X-ray diffraction, scanning electron microscopy, transmission electron microscopy, FT-IR spectroscopy, and fluorescence.
Keywords: europium (III) tungstate, nano-material, particle size control, procedure optimization
Procedia PDF Downloads 395
845 Simultaneous Quantification of Glycols in New and Recycled Anti-Freeze Liquids by GC-MS
Authors: George Madalin Danila, Mihaiella Cretu, Cristian Puscasu
Abstract:
Glycol-based anti-freeze liquids, commonly composed of ethylene glycol or propylene glycol, have important uses in automotive cooling, but they must be handled with care: ethylene glycol is highly toxic to humans and animals. A fast, accurate, precise, and robust method was developed for the simultaneous quantification of the 7 most important glycols and their isomers. Glycols were analyzed from diluted sample solutions of coolants using gas chromatography coupled with mass spectrometry in single-ion-monitoring mode. Results: The method was developed and validated for 7 individual glycols (ethylene glycol, diethylene glycol, triethylene glycol, tetraethylene glycol, propylene glycol, dipropylene glycol, and tripropylene glycol). The limits of detection (1-2 μg/mL) and limit of quantification (10 μg/mL) obtained were appropriate. The method was applied to the determination of glycols in 10 different anti-freeze liquids commercially available on the Romanian market, proving to be reliable. In summary, a method requiring only a two-step dilution of anti-freeze samples combined with direct liquid injection GC-MS was validated for the simultaneous quantification of 7 glycols (and their isomers) in 10 different types of anti-freeze liquids, and the results of the validation procedure proved that the GC-MS method is sensitive and precise for the quantification of glycols.
Keywords: glycols, anti-freeze, gas chromatography, mass spectrometry, validation, recycle
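Detection and quantification limits such as those quoted above are conventionally derived from calibration data; under the ICH convention (one common choice; the paper may instead have used a signal-to-noise approach):

$$\mathrm{LOD}=\frac{3.3\,\sigma}{S},\qquad \mathrm{LOQ}=\frac{10\,\sigma}{S},$$

where $\sigma$ is the standard deviation of the response (e.g., of the blank or of the calibration intercept) and $S$ is the slope of the calibration curve.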
Procedia PDF Downloads 67
844 The Regulation of Reputational Information in the Sharing Economy
Authors: Emre Bayamlıoğlu
Abstract:
This paper aims to provide an account of the legal and regulatory aspects of algorithmic reputation systems, with a special emphasis on the sharing economy (i.e., Uber, Airbnb, Lyft) business model. The first section starts with an analysis of the legal and commercial nature of the tripartite relationship among the parties, namely the host platform, the individual sharers/service providers, and the consumers/users. The section further examines to what extent an algorithmic system of reputational information could serve as an alternative to legal regulation. Shortcomings are explained and analyzed with specific examples from the Airbnb platform, a pioneering success in the sharing economy. The following section focuses on the issue of governance and control of reputational information. It first analyzes the legal consequences of algorithmic filtering systems that detect undesired comments and how a delicate balance could be struck between competing interests such as freedom of speech, privacy, and the integrity of commercial reputation. The third section deals with the problem of manipulation by users. Indeed, many sharing economy businesses employ techniques of data mining and natural language processing to verify the consistency of feedback, while software agents referred to as "bots" are employed by users to "produce" fake reputation values; such automated techniques are deceptive, with significant negative effects that undermine the trust upon which the reputational system is built. The fourth section is devoted to exploring the concerns regarding data mobility, data ownership, and privacy: reputational information provided by consumers in the form of textual comments may be regarded as a writing eligible for copyright protection, and algorithmic reputational systems also contain personal data pertaining to both the individual entrepreneurs and the consumers. The final section starts with an overview of the notion of reputation as a communitarian and collective form of referential trust and further evaluates the above legal arguments from the perspective of the public interest in the integrity of reputational information. The paper concludes with guidelines and design principles for algorithmic reputation systems to address the legal implications raised above.
Keywords: sharing economy, design principles of algorithmic regulation, reputational systems, personal data protection, privacy
Procedia PDF Downloads 467