Search results for: automatic selective door operations
1566 Development of Automatic Laser Scanning Measurement Instrument
Authors: Chien-Hung Liu, Yu-Fen Chen
Abstract:
This study used a triangulation laser probe mounted on a three-axis mobile platform for surface measurement, programmed the system, and applied it to real-time statistical analysis of the measured data. The system integration program was designed as follows: the triangulation laser probe performs non-contact measurement by scattering or reflection, the captured signals are transferred to the computer through RS-232, and RS-485 is used to control the three-axis platform for wide-range measurement. The data captured by the laser probe are assembled into a 3D surface. The optical measurement application program was built with a visual programming language: signals are first transmitted to the computer through RS-232/RS-485, then stored and plotted in the graphical interface in real time. The program analyzes the incoming messages, renders appropriate graphs, and processes the data, providing users with a friendly graphical interface, monitoring of the data-processing state, and a graphical indication of whether the current data are normal. The major functions of the measurement system developed in this study are thickness measurement, statistical process control (SPC), surface smoothness analysis, and trend-line calculation; a results report can be generated and printed promptly. The study successfully measured different heights and surfaces, performed on-line data analysis and processing effectively, and developed a man-machine interface for users.
Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW
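To make the acquire-then-analyze loop concrete, here is a minimal Python sketch (not the authors' LabVIEW code) of reading probe values over a serial link and computing Shewhart SPC limits; the port name, baud rate, and one-reading-per-line frame format are all assumptions.

```python
# Minimal sketch (not the authors' LabVIEW code) of the acquire-then-SPC loop.
# Assumptions: the probe sends one height reading per line over RS-232, and
# the port name and baud rate below are placeholders.
import serial       # pyserial
import statistics

def acquire_heights(port="COM1", baud=9600, n_samples=100):
    """Read height values, one per line, from the triangulation probe."""
    heights = []
    with serial.Serial(port, baud, timeout=1) as link:
        while len(heights) < n_samples:
            raw = link.readline().decode(errors="ignore").strip()
            if raw:
                heights.append(float(raw))
    return heights

def spc_limits(samples, sigma_mult=3.0):
    """Classic Shewhart control limits: mean +/- 3 sigma."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean - sigma_mult * sd, mean, mean + sigma_mult * sd
```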
1565 Efficiency on the Enteric Viral Removal in Four Potable Water Treatment Plants in Northeastern Colombia
Authors: Raquel Amanda Villamizar Gallardo, Oscar Orlando Ortíz Rodríguez
Abstract:
Enteric viruses are cosmopolitan agents present in several environments, including water. They can cause diseases including gastroenteritis, hepatitis, conjunctivitis, and respiratory problems, among others. Although Colombia has no regulations requiring routine viral analysis of drinking water, a better understanding of viral pollution and resistance to treatment is desirable in order to assure safe water for the population. Viral detection is often complex, requiring specialized and time-consuming procedures, and viruses are highly diluted in water, which is a drawback from the analytical point of view. To this end, a fast and selective method for detecting enteric viruses (Hepatitis A and Rotavirus) was applied. Micro-magnetic particles were functionalized with monoclonal antibodies against Hepatitis A and Rotavirus and used to capture, concentrate, and separate whole viral particles in raw and drinking water samples from four treatment plants, identified as CAR-01, MON-02, POR-03, and TON-04, located in northeastern Colombia. Viruses were detected molecularly using One Step SuperScript III RT-PCR. Each plant was sampled at the inlet and outlet in order to determine the initial presence and eventual reduction of Hepatitis A and Rotavirus after disinfection. The results revealed the presence of both enteric viruses in 100% of the raw water analyzed at all plants. This represents a potential health hazard, especially for people who use this water for agricultural purposes. In the drinking water analysis, however, only CAR-01 tested positive, where Rotavirus was found. In conclusion, the results confirm Rotavirus as the best indicator for evaluating the efficacy of potable water treatment plants in eliminating viruses. The CAR plant should improve its disinfection process in order to remove enteric viruses efficiently.
Keywords: drinking water, hepatitis A, rotavirus, virus removal
1564 Experimental Investigation for Reducing Emissions in Maritime Industry
Authors: Mahmoud Ashraf Farouk
Abstract:
Shipping is the most important mode of transportation in global logistics; at present, more than two-thirds of total world trade volume is carried by sea. Ships used for marine transportation are fitted with large-power diesel engines whose exhaust contains nitrogen oxides (NOx), sulfur oxides (SOx), carbon dioxide (CO₂), particulate matter (PM10), hydrocarbons (HC), and carbon monoxide (CO), the most dangerous contaminants found in ship exhaust gas. Ships emitting large amounts of exhaust gases have become a significant cause of air pollution in coastal areas, harbors, and oceans; the International Maritime Organization (IMO) has therefore established rules to reduce these emissions. This experiment measured the exhaust gases emitted from the main engine of the ship Aida IV running on marine diesel oil (MDO). The measurement was taken with the Sensonic2000 device at 85% load, the main sailing load. The paper then studies two emission-reduction options: an alternative fuel, liquefied natural gas (LNG), applied to the system, and a reduction technology, selective catalytic reduction, added to the marine diesel oil system (MDO+SCR). The experiment quantified NOx, SOx, CO₂, PM10, HC, and CO because they have the greatest effect on the environment, and the reduction technologies were applied to the same ship engine at the same load. The study found MDO+SCR to be the more suitable technology for the Aida IV, a training and supply ship, due to its low consumption and because the engine needs no modification: the SCR system is simply added to the exhaust line, which is easy and cheap, while the differences in emissions between the two options are not large.
Keywords: marine, emissions, reduction, shipping
1563 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach
Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed
Abstract:
Sign languages (SL) are the most accomplished forms of gestural communication, so their automatic analysis is a real challenge, one that is closely tied to their lexical and syntactic organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for visual recognition of complex, structured hand gestures such as those found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on Type-2 Fuzzy HMMs (T2FHMMs) are presented. The features used as observables in both the training and recognition phases are based on Singular Value Decomposition (SVD). SVD extends eigendecomposition to non-square matrices and is used here to reduce multi-attribute hand gesture data to feature vectors; it optimally exposes the geometric structure of a matrix. In our approach, the basic HMM arithmetic operators are replaced by adequate Type-2 fuzzy operators, which relax the additive constraint of probability measures. T2FHMMs are therefore able to handle both the random and the fuzzy uncertainties that exist universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals and give better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov model
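As an illustration of the SVD feature-extraction step described above, here is a short sketch under our own assumptions; the image size and the number of retained singular values are illustrative choices, not taken from the paper.

```python
# Sketch of SVD-based feature extraction from a segmented hand image.
# The number of retained singular values (k) is an illustrative choice.
import numpy as np

def svd_features(gray_image: np.ndarray, k: int = 16) -> np.ndarray:
    """Return the k largest singular values, normalized, as a feature vector."""
    s = np.linalg.svd(gray_image.astype(float), compute_uv=False)[:k]
    return s / (np.linalg.norm(s) or 1.0)

# Example: a random stand-in for a 64x64 grayscale hand image.
features = svd_features(np.random.rand(64, 64))
print(features.shape)  # (16,) observation vector for the T2FHMM
```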
1562 Synthesis of Flexible Mn1-x-y(CexLay)O2-δ Ultrathin-Film Device for Highly-Stable Pseudocapacitance from end-of-life Ni-MH batteries
Authors: Samane Maroufi, Rasoul Khayyam Nekouei, Sajjad Sefimofarah, Veena Sahajwalla
Abstract:
The present work details a three-stage strategy based on the selective purification of rare earth oxides (REOs) recovered from end-of-life nickel-metal hydride (Ni-MH) batteries, leading to high-yield fabrication of defect-rich Mn1-x-y(CexLay)O2-δ film. In step one, the major impurities (Fe and Al) were removed from a REE-rich solution. In step two, the resulting solution, containing a trace amount of Mn, was further purified through electrodeposition, which resulted in the synthesis of a non-stoichiometric Mn1-x-y(CexLay)O2-δ ultra-thin film with controllable thickness (5-650 nm) and transmittance (~29-100%), in which Ce4+/3+ and La3+ ions were dissolved in the MnO2-x lattice. Owing to percolation effects on the optoelectronic properties of ultrathin films, a representative Mn1-x-y(CexLay)O2-δ film with 86% transmittance exhibited an outstanding areal capacitance of 3.4 mF·cm-2, attributed mainly to the intercalation/de-intercalation of anionic O2- charge carriers through the atomic tunnels of the stratified Mn1-x-y(CexLay)O2-δ crystallites. Furthermore, the film exhibited excellent capacitance retention of ~90% after 16,000 cycles. This stability was shown to be associated with intervalence charge transfer among interstitial Ce/La cations and Mn oxidation states within the Mn1-x-y(CexLay)O2-δ structure. The energy and power densities of the transparent, flexible Mn1-x-y(CexLay)O2-δ full-cell pseudocapacitor device with a solid-state electrolyte were measured to be 0.088 µWh·cm-2 and 843 µW·cm-2, respectively. These values showed insignificant changes under vigorous twisting and bending to 45-180˚, confirming these materials as intriguing alternatives for size-sensitive energy storage devices. In step three, the remaining solution was purified further, leading to the formation of REO (La, Ce, and Nd) nanospheres ~40-50 nm in diameter.
Keywords: spent Ni-MH batteries, green energy, flexible pseudocapacitor, rare earth elements
1561 An Automatic Bayesian Classification System for File Format Selection
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach to classifying unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. The file format identification method then employs a file format classifier and associated configurations to provide digital preservation experts with an estimate of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format for the expert's institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to support decision making for the preservation of digital content in libraries and archives, using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary comprising the most common terms characteristic of all researched formats; the goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
Keywords: data mining, digital libraries, digital preservation, file format
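A minimal sketch of the naive Bayes recommendation step follows, using scikit-learn in place of the authors' system; the toy format descriptions and labels are invented for illustration.

```python
# Minimal sketch of the naive Bayes recommendation step, using scikit-learn
# in place of the authors' system. The toy descriptions and format labels
# are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

descriptions = [
    "lossless raster image, open specification, wide tool support",
    "lossy compressed raster image, very common on the web",
    "page layout document, embedded fonts, ISO archival standard",
]
labels = ["PNG", "JPEG", "PDF/A"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(descriptions, labels)

query = ["archival document format with embedded fonts"]
print(clf.predict(query)[0])            # recommended format
print(clf.predict_proba(query).max())   # its estimated probability
```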
1560 Social Networks And Social Complexity: The Southern Italian Drive For Trade Exchange During The Late Bronze Age
Authors: Sara Fioretti
Abstract:
During the Middle Bronze Age, southern Italy underwent a reorganisation of social structures in which local cultures, such as the sub-Apennine and Nuragic, flourished and participated in maritime trade. This paper explores the socio-economic relationships, in both cross-cultural and potentially inter-regional settings, present within the archaeological repertoire of the southern Italian Late Bronze Age (LBA, 1600-1050 BCE). The emergence of economic relations within the connectivity of the regional settlements is explored through ceramic contexts found at the case-study sites Punta di Zambrone, Broglio di Trebisacce, and Nuraghe Antigori. The paper discusses the findings of a statistical and theoretical approach from an ongoing study that questions the characterisation of this period of the Mediterranean as one dominated by Mycenaean influence. The study engages with a theoretical bricolage of social network theory, entanglement, and assertive objects theory to address the selective and assertive dynamics evident in cross-cultural trade exchanges, as well as to consider inter-regional dynamics. Through this intersection of theory and statistical analysis, the case studies establish that only a small percentage of the pottery was imported, while assertive productions occur in relatively higher quantities; the majority still adheres to regional Italian traditions. We can therefore dissect the rhizomatic relationships cultivated by the Italian coasts and the Mycenaeans, and their roles within their networks. This research offers a new perspective on the complex nature of Late Bronze Age relational structures.
Keywords: late bronze age, mediterranean archaeology, exchanges and trade, frequency distribution of ceramic assemblages, social network theory, rhizomatic exchanges
1559 Spatial-Temporal Awareness Approach for Extensive Re-Identification
Authors: Tyng-Rong Roan, Fuji Foo, Wenwey Hseush
Abstract:
Recent developments in AI and edge computing play a critical role in capturing meaningful events, such as the detection of an unattended bag. One of the core problems is re-identification across multiple CCTVs: immediately following the detection of a meaningful event, the objects related to the event must be tracked and traced. In an extensive environment, the challenge becomes severe as the number of CCTVs increases substantially, imposing difficulties in achieving high accuracy while maintaining real-time performance. The algorithm that re-identifies cross-boundary objects for extensive tracking is referred to as Extensive Re-Identification, which emphasizes the issues arising from the complexity of a great number of CCTVs. The Spatial-Temporal Awareness approach challenges the conventional concept of operations, which is labor-intensive and time-consuming. The ability to perform Extensive Re-Identification through a multi-sensory network provides next-level insights, creating value beyond traditional risk management.
Keywords: long-short-term memory, re-identification, security critical application, spatial-temporal awareness
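The abstract does not publish its algorithm (the keywords point to an LSTM-based model), but a spatial-temporal gate of the kind the approach implies can be sketched as follows: before any appearance matching, discard candidate cameras that the object could not have reached in the elapsed time. The camera positions and speed bound below are assumptions.

```python
# Sketch of a spatial-temporal gate for cross-camera re-identification:
# before appearance matching, drop cameras the target could not have
# reached in the elapsed time. Camera positions (metres) and the walking
# speed bound are assumptions; the paper's own model is LSTM-based.
import math

def reachable_cameras(last_pos, elapsed_s, cameras, max_speed_mps=2.0):
    """Keep only cameras within the distance the target could have covered."""
    budget = max_speed_mps * elapsed_s
    return [cam for cam, (x, y) in cameras.items()
            if math.hypot(x - last_pos[0], y - last_pos[1]) <= budget]

cams = {"cam1": (0, 0), "cam2": (30, 40), "cam3": (500, 10)}
print(reachable_cameras((0, 0), elapsed_s=60, cameras=cams))  # ['cam1', 'cam2']
```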
1558 Mastering Test Automation: Bridging Gaps for Seamless QA
Authors: Rohit Khankhoje
Abstract:
The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between the manual QA and automation teams, and ensures that TestRail reflects every newly added automated test case as soon as it becomes part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
Keywords: automation framework, API integration, test automation, test management tools
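As a sketch of the kind of Jira hook the abstract describes, the following files a bug through Jira's standard REST create-issue endpoint when an automated test fails; the URL, project key, and credentials are placeholders.

```python
# Sketch of the kind of Jira hook described above: file a bug through Jira's
# standard REST create-issue endpoint when an automated test fails. The URL,
# project key, and credentials are placeholders.
import requests

def report_failure(test_name: str, error: str) -> str:
    resp = requests.post(
        "https://yourcompany.atlassian.net/rest/api/2/issue",
        json={
            "fields": {
                "project": {"key": "QA"},
                "summary": f"Automation failure: {test_name}",
                "description": error,
                "issuetype": {"name": "Bug"},
            }
        },
        auth=("automation-bot@yourcompany.com", "API_TOKEN"),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["key"]   # e.g. "QA-123", linked back to the test run
```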
1557 Perspectives of Computational Modeling in Sanskrit Lexicons
Authors: Baldev Ram Khandoliyan, Ram Kishor
Abstract:
India has a classical tradition of Sanskrit lexicons, and research has been done on the study of Indian lexicography. India has seen amazing strides in Information and Communication Technology (ICT) applications for Indian languages in general and for Sanskrit in particular. Since machine translation from Sanskrit to other Indian languages is often the desired goal, traditional Sanskrit lexicography has attracted a lot of attention from the ICT and computational linguistics community. From Nighaŋţu and Nirukta to Amarakośa and Medinīkośa, Sanskrit owns a rich history of lexicography. As these kośas do not follow the same typology or standard in the selection and arrangement of the words and the information related to them, several types of kośa styles have emerged in this tradition. The model of grammar given by the Aṣṭādhyāyī is well appreciated by Indian and Western linguists and grammarians, but the different models provided by the lexicographic tradition also have importance. The general usefulness of traditional Sanskrit kośas, mostly in the material they make available in the text, has been discussed by some scholars, and some have also discussed the arrangement of the lexica. This paper aims to discuss further uses of the different models of Sanskrit lexicography, focusing especially on their computational modeling and its use in different computational operations.
Keywords: computational lexicography, Sanskrit Lexicons, nighanṭu, kośa, Amarkosa
1556 Industry 4.0 and Supply Chain Integration: Case of Tunisian Industrial Companies
Authors: Rym Ghariani, Ghada Soltane, Younes Boujelbene
Abstract:
Industry 4.0, a set of emerging smart and digital technologies, has been the main focus of operations management researchers and practitioners in recent years. The objective of this research paper is to study the impact of Industry 4.0 on supply chain integration (SCI) in Tunisian industrial companies. A conceptual model was designed to study the relationship between Industry 4.0 technologies and supply chain integration; it contains three explanatory variables (big data, Internet of Things, and robotics) and one variable to be explained (supply chain integration). In order to answer our research questions and investigate the research hypotheses, principal component analysis and discriminant analysis were performed using SPSS 26 software. The results reveal a statistically significant positive impact of Industry 4.0 (big data, Internet of Things, and robotics) on supply chain integration. Interestingly, big data has a greater positive impact on supply chain integration than the Internet of Things and robotics.
Keywords: industry 4.0 (I4.0), big data, internet of things, robotics, supply chain integration
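A hedged sketch of an analogous analysis pipeline follows (the paper used SPSS 26; this scikit-learn version is a stand-in). The survey responses and the integration label are randomly generated placeholders.

```python
# Hedged sketch of an analogous analysis pipeline (the paper used SPSS 26;
# this scikit-learn version is a stand-in). Survey responses and the
# integration label are randomly generated placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 9))                 # 9 survey items, 3 per construct
y = (X[:, :3].mean(axis=1) > 0).astype(int)   # placeholder SCI grouping

components = PCA(n_components=3).fit_transform(X)  # one component per construct
lda = LinearDiscriminantAnalysis().fit(components, y)
print("discriminant accuracy:", lda.score(components, y))
```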
1555 Secure Cryptographic Operations on SIM Card for Mobile Financial Services
Authors: Kerem Ok, Serafettin Senturk, Serdar Aktas, Cem Cevikbas
Abstract:
Mobile technology is very popular nowadays, and it provides a digital world where users can experience many value-added services. Service providers are eager to offer diverse value-added services to users, such as digital identity and mobile financial services. In this context, the security of data storage on smartphones and the security of communication between the smartphone and the service provider are critical for the success of these services. In order to provide the required security functions, the SIM card is one acceptable alternative: since SIM cards include a Secure Element, they are able to store sensitive data, create cryptographically secure keys, and encrypt and decrypt data. In this paper, we design and implement a SIM and smartphone framework that uses the SIM card for secure key generation, key storage, data encryption, data decryption, and digital signing for mobile financial services. Our framework shows that the SIM card can be used as a controlled Secure Element to provide the required security functions for popular e-services such as mobile financial services.
Keywords: SIM card, mobile financial services, cryptography, secure data storage
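The following sketch mimics the security functions the framework delegates to the SIM's Secure Element (key generation, digital signing, and encryption), using the Python cryptography package as a desktop stand-in; on a real SIM these operations run inside a card applet and the private key never leaves the card.

```python
# Desktop stand-in (Python `cryptography` package) for the security functions
# the framework delegates to the SIM's Secure Element. On a real SIM these
# run inside a card applet and the private key never leaves the card.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Key generation (performed on-card in the real design).
signing_key = ec.generate_private_key(ec.SECP256R1())

# Digital signing of a payment payload.
payload = b'{"amount": "10.00", "currency": "EUR"}'
signature = signing_key.sign(payload, ec.ECDSA(hashes.SHA256()))

# Symmetric encryption of data at rest.
storage_key = Fernet.generate_key()
token = Fernet(storage_key).encrypt(payload)
assert Fernet(storage_key).decrypt(token) == payload
```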
1554 Integration GIS–SCADA Power Systems to Enclosure Air Dispersion Model
Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey
Abstract:
This paper explores an integration model between GIS-SCADA systems and an enclosure quantification model to assess the impact of a fail-safe event. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding time to environmental incident variations. Open processing, as object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC/JDBC during the main stages of the GIS-SCADA connection. The function of the Geographic Information System is to manage power distribution in response to developing issues. In other words, GIS-SCADA system integration requires numerical process objects to enable system model calibration and estimation, determination of past events for analysis, and prediction of emergency situations for response training.
Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system
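A minimal sketch of the ODBC bridge the paper mentions follows, pulling live SCADA tag values into a GIS asset table with pyodbc; the DSNs, table names, and column names are all assumptions for illustration.

```python
# Minimal sketch of the ODBC bridge: pull live SCADA tag values into a GIS
# asset table with pyodbc. DSNs, table names, and columns are assumptions.
import pyodbc

scada = pyodbc.connect("DSN=SCADA_HIST;UID=reader;PWD=secret")
gis = pyodbc.connect("DSN=GIS_DB;UID=editor;PWD=secret")

rows = scada.execute(
    "SELECT tag_id, value, ts FROM live_values WHERE quality = 'GOOD'"
).fetchall()

with gis:   # pyodbc commits the transaction when the block exits cleanly
    for tag_id, value, ts in rows:
        gis.execute(
            "UPDATE feeder_assets SET last_value = ?, last_seen = ? "
            "WHERE tag_id = ?",
            value, ts, tag_id,
        )
```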
1553 Survey of the Literacy by Radio Project as an Innovation in Literacy Promotion in Nigeria
Authors: Stella Chioma Nwizu
Abstract:
The National Commission for Adult and Non-Formal Education (NMEC) in Nigeria is charged with reducing the illiteracy rate through the development, monitoring, and supervision of literacy programmes in Nigeria. In spite of various efforts by NMEC, the literature shows that the illiteracy rate remains high: according to NMEC/UNICEF, about 60 million Nigerians are non-literate, and nearly two-thirds of them are women. This situation forced the government to search for innovative and better approaches to literacy promotion and delivery. The literacy by radio project was adopted as an innovative intervention in literacy delivery in Nigeria because radio is the cheapest and most easily affordable medium for non-literates. The project aimed at widening access to literacy programmes for non-literate, marginalized, and disadvantaged groups in Nigeria by taking literacy programmes to their doorsteps. Literacy by radio has worked well in reducing illiteracy in Cuba. This innovative intervention is anchored on Rogers' diffusion of innovation theory. The literacy by radio project has been going on for fifteen years, and its efficacy and contributions need to be investigated; thus, the purpose of this research is to review the contributions of literacy by radio in Nigeria. The researcher adopted the survey research design for the study. The population for the study consisted of 2,706 participants and 47 facilitators of the literacy by radio programme in the 10 pilot states in Nigeria; a sample of four states, made up of 302 participants and eight facilitators, was used. Information was collected through Focus Group Discussions (FGD), interviews, and content analysis of official documents. The data were analysed qualitatively to review the contributions of the literacy by radio project and determine the efficacy of this innovative approach in facilitating literacy in Nigeria. Results from the field experience showed, among others, that more non-literates have better access to literacy programmes through this innovative approach. The pilot project was 88% successful; not less than 2,110 adults were made literate through the literacy by radio project in 2017. However, lack of enthusiasm and commitment on the part of the technical committee and facilitators due to non-payment of honoraria, poor signals from radio stations, interruption of lectures with adverts, and low community involvement in project decision making are challenges to the success rate of the project. The researcher acknowledges the need to customize all materials and broadcasts in the dialects of the participants and to include more civil rights, environmental protection, and agricultural skills in the project. The study recommends, among others, improved and timely funding of the project by the Federal Government to enable NMEC to fulfill its obligations, the setting up of independent radio stations for airing the programmes, and proper monitoring and evaluation of the project by NMEC and State Agencies for greater effectiveness. In an era of the knowledge-driven economy, no one should be left saddled with the weight of illiteracy.
Keywords: innovative approach, literacy, project, radio, survey
1552 The Role of Healthcare Informatics in Combating the COVID-19 Pandemic
Authors: Philip Eappen, Narasimha Rao Vajjhala
Abstract:
This chapter examines how healthcare organizations harnessed innovative healthcare informatics to navigate the challenges posed by the COVID-19 pandemic, addressing critical needs and improving care delivery. The pandemic's unprecedented demands necessitated the adoption of new and advanced tools to manage healthcare operations more effectively. Informatics solutions played a crucial role in facilitating the smooth functioning of healthcare systems during this crisis and are anticipated to remain central to future healthcare management. Technologies such as telemedicine helped healthcare professionals minimize exposure to COVID-19 patients, thereby reducing infection risks within healthcare facilities. This chapter explores a range of informatics applications utilized worldwide, including telemedicine, AI-driven solutions, big data analytics, drones, robots, and digital platforms for drug delivery, all of which enabled remote patient care and enhanced healthcare accessibility and safety during the pandemic.
Keywords: healthcare informatics, COVID-19 pandemic, telemedicine, AI-driven healthcare, big data analytics, remote patient care, digital health platforms
1551 Irradion: Portable Small Animal Imaging and Irradiation Unit
Authors: Josef Uher, Jana Boháčová, Richard Kadeřábek
Abstract:
In this paper, we present a multi-robot imaging and irradiation research platform referred to as Irradion, with full capabilities of portable arbitrary-path computed tomography (CT). Irradion is an imaging and irradiation unit based entirely on robotic arms for research on cancer treatment with ion beams on small animals (mice or rats). The platform comprises two subsystems that combine several imaging modalities, such as 2D X-ray imaging, CT, and particle tracking, with precise positioning of a small animal for imaging and irradiation. Computed Tomography: The CT subsystem of the Irradion platform is equipped with two 6-joint robotic arms that position a photon counting detector and an X-ray tube independently and freely around the scanned specimen and allow image acquisition by computed tomography. Irradion covers nearly all conventional 2D and 3D X-ray imaging trajectories with precisely calibrated and repeatable geometric accuracy, leading to a spatial resolution of up to 50 µm. In addition, the photon counting detectors allow X-ray photon energy discrimination, which can suppress scattered radiation, thus improving image contrast; they can also measure absorption spectra and recognize different material (tissue) types. X-ray video recording and real-time imaging options can be applied for studies of dynamic processes, including in vivo specimens. Moreover, Irradion opens the door to exploring new 2D and 3D X-ray imaging approaches, and we demonstrate various novel scan trajectories and their benefits in this publication. Proton Imaging and Particle Tracking: The Irradion platform allows combining several imaging modules with any required number of robots. The proton tracking module comprises another two robots, each holding particle tracking detectors with position-, energy-, and time-sensitive Timepix3 sensors. Timepix3 detectors can track particles entering and exiting the specimen and allow accurate guiding of photon/ion beams for irradiation. In addition, quantifying the energy losses before and after the specimen provides essential information for precise irradiation planning and verification. Work on the Irradion platform involved advanced software and hardware development that will offer researchers a novel way to investigate new approaches in (i) radiotherapy, (ii) spectral CT, (iii) arbitrary-path CT, and (iv) particle tracking. The robotic platform developed for the project is an entirely new product on the market: preclinical research systems combining precision robotic irradiation with photon/ion beams and multimodality high-resolution imaging do not currently exist, and the researched technology has the potential to be a significant leap forward compared to current, first-generation devices.
Keywords: arbitrary path CT, robotic CT, modular, multi-robot, small animal imaging
1550 Architectural Engineering and Executive Design: Modelling Procedures, Scientific Tools, Simulation Processing
Authors: Massimiliano Nastri
Abstract:
The study is part of the scientific literature on executive design in engineering and architecture, understood as an interdisciplinary field aimed at anticipating and simulating, planning and managing, guiding and instructing construction operations on site. On this basis, the study provides an analysis of a theoretical, methodological, and guiding character aimed at constituting the disciplinary sphere of executive design, which often lacks supporting methodological and procedural guidelines in engineering and architecture. The basic methodology of the study is to investigate the theories and references that can contribute to constituting the scenario of executive design as a practice of modelling, visualization, and simulation of the construction phases, through the projection of the pragmatic issues of the building. It does so by proposing a series of references, interrelations, and openings intended to support (for intellectual, procedural, and applicative purposes) the executive definition of the project, aimed at activating the practices of cognitive acquisition and realization intervention within reality.
Keywords: modelling and simulation technology, executive design, discretization of the construction, engineering design for building
1549 A Prediction Model Using the Price Cyclicality Function Optimized for Algorithmic Trading in Financial Market
Authors: Cristian Păuna
Abstract:
Since the widespread adoption of electronic trading, automated trading systems have become a significant part of the business intelligence of any modern financial investment company. An important share of trades is now made completely automatically by computers using mathematical algorithms: trading decisions are taken almost instantly by logical models, and orders are sent by low-latency automatic systems. This paper presents a real-time price prediction methodology designed especially for algorithmic trading. Based on the price cyclicality function, the methodology generates price cyclicality bands that predict the optimal levels for entries and exits, and, to automate the trading decisions, the cyclicality bands generate automated trading signals. We have found that the model can be used with good results to predict changes in market behavior; using these predictions, the model can automatically adapt the trading signals in real time to maximize the trading results. The paper presents the methodology for optimizing and implementing this model in automated trading systems. Tests prove that the methodology can be applied with good efficiency in different timeframes. Real trading results are also displayed and analyzed in order to qualify the methodology and to compare it with other models. In conclusion, the price prediction model using the price cyclicality function is a reliable trading methodology for algorithmic trading in the financial market.
Keywords: algorithmic trading, automated trading systems, financial markets, high-frequency trading, price prediction
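The author's price cyclicality function is his own construction and is not reproduced here; the sketch below substitutes a simple normalized moving-average oscillator to illustrate the band-and-signal mechanism. The band thresholds are illustrative.

```python
# Sketch of the band-and-signal mechanism. The author's price cyclicality
# function is not reproduced here; a normalized fast/slow moving-average
# oscillator stands in for it, and the band thresholds are illustrative.
import numpy as np

def oscillator(prices: np.ndarray, fast: int = 10, slow: int = 50) -> np.ndarray:
    """Stand-in cyclicality proxy: normalized gap between two moving averages."""
    sma = lambda x, n: np.convolve(x, np.ones(n) / n, mode="valid")
    f, s = sma(prices, fast), sma(prices, slow)
    gap = f[-len(s):] - s               # align the trailing windows
    return (gap - gap.mean()) / (gap.std() or 1.0)

def band_signals(osc: np.ndarray, low: float = -1.0, high: float = 1.0):
    """Buy when the oscillator sits below the lower band, sell above the upper."""
    return ["BUY" if v < low else "SELL" if v > high else "HOLD" for v in osc]
```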
1548 Discussing Embedded versus Central Machine Learning in Wireless Sensor Networks
Authors: Anne-Lena Kampen, Øivind Kure
Abstract:
Machine learning (ML) can be implemented in Wireless Sensor Networks (WSNs) either as a central solution or as a distributed solution in which the ML is embedded in the nodes. Embedding improves privacy, may reduce prediction delay, and reduces the number of transmissions. However, quality factors such as prediction accuracy, fault detection efficiency, and coordinated control of the overall system suffer. Here, we discuss and highlight the trade-offs that should be considered when choosing between embedded and centralized ML, especially for multihop networks. In addition, we present estimations that demonstrate the energy trade-offs between embedded and centralized ML. Although the total network energy consumption is lower with central prediction, central prediction makes the network more prone to partitioning due to the high forwarding load on the one-hop nodes. Moreover, the continuous improvements in the number of operations per joule for embedded devices will move the energy balance toward embedded prediction.
Keywords: central machine learning, embedded machine learning, energy consumption, local machine learning, wireless sensor networks, WSN
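A back-of-the-envelope sketch of the energy comparison follows; the per-bit radio cost and per-inference CPU cost are placeholder figures, and real values depend on the radio, the MCU, and the model size.

```python
# Back-of-the-envelope sketch of the energy comparison discussed above.
# The per-bit radio cost and the per-inference CPU cost are placeholder
# figures; real values depend on the radio, the MCU, and the model size.
TX_NJ_PER_BIT = 200      # assumed radio cost, nanojoules per bit per hop
INFER_NJ = 400_000       # assumed cost of one on-node inference, nanojoules

def central_nj(raw_bits: int, hops: int) -> int:
    """Ship raw samples to the sink; every hop retransmits them."""
    return raw_bits * TX_NJ_PER_BIT * hops

def embedded_nj(result_bits: int, hops: int) -> int:
    """Infer locally and transmit only the small prediction."""
    return INFER_NJ + result_bits * TX_NJ_PER_BIT * hops

# With these figures the total energy favors central prediction, as the
# paper observes: but all of the central traffic funnels through the
# one-hop nodes, which is what makes the network prone to partitioning.
print(central_nj(raw_bits=512, hops=3))     # 307,200 nJ in total
print(embedded_nj(result_bits=32, hops=3))  # 419,200 nJ, but spread per node
```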
1547 Evaluation of a Chitin Synthesis Inhibitor Novaluron in the Shrimp Palaemon Adspersus: Impact on Ecdysteroids and Chitin Contents
Authors: Hinda Berghiche, Hamida Benradia, Noureddine Soltani
Abstract:
Pesticides are widely used in crop production and are known to cause major contamination of ecosystems, especially aquatic environments. The leaching of large amounts of pollutants derived from agricultural activities (fertilizers, pesticides) can contaminate rivers that discharge into lakes and estuarine and coastal environments, affecting several organisms, including crustacean species. In this context, there is a search for new selective insecticides with minimal toxic effects on the environment and human health, such as insect growth regulators (IGRs). The current study aimed to examine the impact of novaluron (CE 20%), a potent benzoylphenylurea-derivative insecticide used against mosquito larvae, on the non-target shrimp Palaemon adspersus (Decapoda, Palaemonidae). The compound was tested at two concentrations (0.91 mg/L and 4.30 mg/L), corresponding respectively to the LC50 and LC90 determined against fourth-instar larvae of Culiseta longiareolata (Diptera, Culicidae). The molting hormone titer was determined in the haemolymph by enzyme immunoassay, while chitin was measured in the peripheral integument at different stages of the molting cycle. Under normal conditions, the haemolymph ecdysteroid concentration increased during the molting cycle to reach a peak at stage D. In the treated series, we note the absence of the peak at stage D and an increase at stages B, C, and D as compared to the controls. Concerning the chitin amounts, we observe an increase from stage A to stage C followed by a decrease at stage D. Exposure of shrimps to novaluron resulted in a significant decrease of values at all molting stages, with a dose-response effect. Thus, the insecticide can have secondary effects on this non-target arthropod species.
Keywords: toxicology, novaluron, crustacean, palaemon adspersus, ecdysteroids, cuticle, chitin
1546 Practices of Lean Manufacturing in the Autoparts: Brazilian Industry Overview
Authors: Guilherme Gorgulho, Carlos Roberto Camello Lima
Abstract:
Over the five years between 2011 and 2015, registrations of cars, light commercial vehicles, trucks, and buses declined. This sector's decline can be explained by the economic and political situation in which the Brazilian industry operates. In parallel with the reduction in vehicle sales and registrations, suppliers are also affected, influencing their results; among these suppliers is the auto parts sector. Competition from international companies, strongly established in Asia and Mexico due to low production costs, encourages companies to constantly seek continuous improvement and operational efficiency. Under this argument, decision making based on lean manufacturing tools is essential for the management of operations. The purpose of this article is to analyze lean practices in Brazilian auto parts industries through a questionnaire applied to employees who practice lean thinking in their organizations. The aim is to compare the data extracted from the questionnaires and discuss which lean tools help organizations gain a competitive advantage.
Keywords: autoparts, brazilian industry, lean practices, survey
1545 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and prone to errors, so the automated conversion of a sequential program to a DISC program would significantly improve productivity. However, synthesizing a user's intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique that identifies sequential components and translates them to equivalent distributed operations, using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.
Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
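To illustrate the translation target (not the reinforcement learning policy that discovers it), here is a toy sequential fragment and a distributed PySpark equivalent of the kind such a synthesizer would be expected to emit.

```python
# Toy illustration of the translation target, not of the RL policy that
# discovers it: a sequential aggregation loop and the distributed PySpark
# fragment a synthesizer of this kind would be expected to emit.
from pyspark.sql import SparkSession

# Sequential fragment (the input).
def total_sequential(values):
    acc = 0
    for v in values:
        acc += v * v
    return acc

# Distributed fragment (the desired output).
def total_distributed(values):
    spark = SparkSession.builder.appName("roop-demo").getOrCreate()
    return spark.sparkContext.parallelize(values).map(lambda v: v * v).sum()
```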
1544 Pawn or Potentates: Corporate Governance Structure in Indian Central Public Sector Enterprises
Authors: Ritika Jain, Rajnish Kumar
Abstract:
The Department of Public Enterprises has made the submission of Self Evaluation Reports on corporate governance mandatory for all central-government-owned enterprises. Despite this, an alarming 40% of the enterprises did not do so. This study examines the impact of external policy tools and internal firm-specific factors on the corporate governance of central public sector enterprises (CPSEs). We use a dataset of all manufacturing and non-financial service enterprises owned by the central government of India for the year 2010-11. Using probit, ordered logit, and Heckman sample selection models, the study finds that the probability and quality of corporate governance are positively influenced by the CPSE signing a Memorandum of Understanding (MoU) with the central government of India and hence enjoying more autonomy in day-to-day operations. Besides this, internal factors, including larger size and a lower debt level, contribute significantly to better corporate governance.
Keywords: corporate governance, central public sector enterprises (CPSEs), sample selection, Memorandum of Understanding (MoU), ordered logit, disinvestment
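A hedged sketch of the first-stage estimation follows, a probit of whether a CPSE files its self-evaluation report, using statsmodels; the covariates and the simulated data are stand-ins for the paper's dataset.

```python
# Hedged sketch of the first-stage estimation: a probit of whether a CPSE
# files its self-evaluation report, via statsmodels. The covariates and the
# simulated data are stand-ins for the paper's dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
has_mou = rng.integers(0, 2, n)          # MoU with central government
log_size = rng.normal(7.0, 1.5, n)       # firm size proxy
debt_ratio = rng.uniform(0.0, 1.0, n)    # leverage proxy
latent = 0.8 * has_mou + 0.4 * log_size - 1.0 * debt_ratio - 2.5
filed = (latent + rng.normal(0, 1, n) > 0).astype(int)

X = sm.add_constant(np.column_stack([has_mou, log_size, debt_ratio]))
probit = sm.Probit(filed, X).fit(disp=False)
print(probit.params)   # const, MoU, size, debt coefficients
```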
1543 Mitigation of High Voltage Equipment Design Deficiencies for Improved Operation and Maintenance
Authors: Riyad Awad, Abdulmohsen Alghadeer, Meshari Otaibi
Abstract:
Proper operation and maintenance (O&M) of high voltage equipment can extend the asset lifecycle and maintain equipment integrity and reliability. Such a vital process should be considered proactively during the equipment design and manufacturing phases by removing and eliminating any features of the equipment that adversely affect O&M activities. This paper presents a gap analysis of the difficulties in performing operation and maintenance of high voltage electrical equipment, including power transformers, switchgear, motor control centers, disconnect switches, and circuit breakers. The difficulties were gathered from field personnel, equipment design review comments, the quality management system, and a lessons-learned database. The purpose of the gap analysis is to mitigate and prevent O&M difficulties as early as possible in the design stage of the equipment lifecycle. The paper concludes with several recommendations and corrective actions for all identified gaps in order to reduce the cost of O&M difficulties and improve the equipment lifecycle.
Keywords: operation and maintenance, high voltage equipment, equipment lifecycle, reduce the cost of maintenance
1542 Geoelectrical Investigation Around Bomo Area, Kaduna State, Nigeria
Authors: B. S. Jatau, Baba Adama, S. I. Fadele
Abstract:
An electrical resistivity investigation was carried out around the Bomo area, Zaria, Kaduna State, in order to study the subsurface geologic layers with a view to determining the depth to bedrock and the thickness of the geologic layers. Vertical Electrical Sounding (VES) using the Schlumberger array was carried out at fifteen (15) VES stations, and an ABEM terrameter (SAS 300) was used for the data acquisition. The field data obtained were analyzed using computer software (IPI2win), which gives an automatic interpretation of the apparent resistivity. The VES results revealed the heterogeneous nature of the subsurface geological sequence, which is composed of hard-pan topsoil (clayey and sandy-lateritic), a weathered layer, partly weathered or fractured basement, and fresh basement. The resistivity of the topsoil layer varies from 40 Ωm to 450 Ωm with thickness ranging from 1.25 to 7.5 m. The weathered basement has resistivity values ranging from 50 Ωm to 593 Ωm and thickness between 1.37 and 20.1 m. The fractured basement has resistivity values ranging from 218 Ωm to 520 Ωm and thickness between 12.9 and 26.3 m. The fresh basement (bedrock) has resistivity values ranging from 1215 Ωm to 2150 Ωm with infinite depth. Overall, the depth from the earth's surface to the bedrock surface varies between 2.63 and 34.99 m. The study further stressed the importance of these findings for civil engineering structures and groundwater prospecting.
Keywords: electrical resistivity, CERT (CT), vertical electrical sounding (VES), top soil (TP), weathered basement (WB), partly weathered basement (PWB), fresh basement (FB)
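Each VES reading reduces to an apparent resistivity via the standard Schlumberger geometric factor; the sketch below shows that computation. The electrode spacings and instrument readings are illustrative, not field values from this survey.

```python
# Sketch of the apparent-resistivity computation behind each VES reading
# for a Schlumberger array: rho_a = K * dV / I with the standard geometric
# factor K. Spacings and readings here are illustrative, not field values.
import math

def schlumberger_rho_a(ab_half_m, mn_half_m, dv_volts, i_amps):
    """Apparent resistivity (ohm-m) from half-spacings AB/2 and MN/2."""
    k = math.pi * (ab_half_m**2 - mn_half_m**2) / (2 * mn_half_m)
    return k * dv_volts / i_amps

print(schlumberger_rho_a(ab_half_m=10.0, mn_half_m=1.0,
                         dv_volts=0.05, i_amps=0.001))  # ~7775 ohm-m
```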
1541 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection
Authors: S. Delgado, C. Cerrada, R. S. Gómez
Abstract:
This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models at high resolutions. One of the major challenges of voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times: these repeated voxels incur costly memory operations that carry no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing a triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, the method's computational efficiency is complemented by its simplicity and portability: written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, which quantifies the alignment of each triangle with its primary axis. This metric provides insight into the impact of triangle orientation on scan-line-based voxelization methods and aids in understanding how the Gap Detection technique improves results by targeting the specific areas where simple scan-line-based methods fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods; by addressing these gaps, the algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization, combining computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces.
Keywords: voxelization, GPU acceleration, computer graphics, compute shaders
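A simplified NumPy sketch of the equidistant scan-line traversal for a single triangle appears below; the published method runs as a GLSL compute shader and adds gap detection, both of which this illustration omits.

```python
# Simplified NumPy sketch of the equidistant scan-line traversal for one
# triangle; the published method runs in a GLSL compute shader and adds gap
# detection, both omitted here. Line spacing is chosen so no row is skipped.
import numpy as np

def voxelize_triangle(a, b, c, voxel_size=0.1):
    a, b, c = (np.asarray(p, dtype=float) for p in (a, b, c))
    voxels = set()
    n_lines = max(2, 2 * int(np.linalg.norm(c - (a + b) / 2) / voxel_size))
    for t in np.linspace(0.0, 1.0, n_lines):       # sweep edge AB toward C
        start, end = a + t * (c - a), b + t * (c - b)
        n_pts = max(2, 2 * int(np.linalg.norm(end - start) / voxel_size))
        for s in np.linspace(0.0, 1.0, n_pts):     # march along the line
            p = start + s * (end - start)
            voxels.add(tuple(np.floor(p / voxel_size).astype(int)))
    return voxels

print(len(voxelize_triangle((0, 0, 0), (1, 0, 0), (0, 1, 0))))
```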
1540 A Review of Recent Studies on Advanced Technologies for Water Treatment
Authors: Deniz Sahin
Abstract:
Concern over the presence of heavy metals in our water supplies has steadily grown over the last few years. A number of specialized technologies, including precipitation, coagulation/flocculation, ion exchange, cementation, and electrochemical operations, have been developed for the removal of heavy metals from wastewater. However, these technologies have many limitations in application, such as high cost and low separation efficiency. Recently, numerous approaches have been investigated to overcome these difficulties, and membrane filtration, advanced oxidation processes (AOPs), UV irradiation, and others are sufficiently developed to be considered as alternative treatments. Many factors come into play when selecting a wastewater treatment technology, such as the type of wastewater, operating conditions, and economics. This study describes the various treatment technologies employed for heavy metal removal, and their advantages and disadvantages are compared to highlight current limitations and future research needs. For example, we investigated the applicability of ultrafiltration technology for removing heavy metal ions (e.g., Cu(II), Pb(II), Cd(II), Zn(II)) from synthetic wastewater solutions; the results showed that complete removal of the metal ions could be achieved.
Keywords: heavy metal, treatment methodologies, water, water treatment
1539 Lessons Learned from Ransomware-as-a-Service (RaaS) Organized Campaigns
Authors: Vitali Kremez
Abstract:
The researcher monitored an organized ransomware campaign in order to gain significant visibility into the tactics, techniques, and procedures employed by a campaign boss operating a ransomware scheme out of Russia. As the Russian hacking community has lowered the access requirements for unsophisticated Russian cybercriminals to engage in ransomware campaigns, corporations and individuals face a commensurately greater challenge of effectively protecting their data and operations from being held to ransom. This report discusses two notorious ransomware campaigns. Though the loss of data can be devastating, the findings demonstrate that sending ransom payments does not always help recover data. Key learnings: 1. From the ransomware affiliate perspective, such campaigns have significantly lowered the barriers to entry for low-tier cybercriminals. 2. Ransomware revenues are not as glamorous and fruitful as often publicly reported; the average ransomware crime boss makes only about $90K per year. 3. The data gathered indicate that sending ransom payments does not always help recover data. 4. The talk provides the complete payout structure and Bitcoin laundering operation related to the ransomware-as-a-service campaign.
Keywords: bitcoin, cybercrime, ransomware, Russia
1538 Managing the Cosmos: Problems, Solutions, and Future Insights into Space Debris
Authors: Irfan Nazir Wani, Pushpendra Kumar Shukla, Manoj Kumar
Abstract:
Debris present in Earth orbit, also called space waste, space junk, or orbital debris, poses a substantial challenge to space exploration, satellite operations, and other space-based activities. This research paper delves into the causes and effects of space debris accumulation, explores current mitigation techniques, and presents a hopeful outlook on the potential for future sustainable space activities. The paper emphasizes the necessity of addressing orbital fragments to ensure durable sustainability in space exploration and utilization. It examines various strategies for mitigating space debris, including debris removal technologies, spacecraft design improvements, and international collaboration efforts. Additionally, the paper highlights the importance of space debris monitoring and tracking systems in preventing collisions and minimizing the growth of orbital debris. By comprehending the complexities of space debris and implementing effective mitigation measures, the space industry can work towards a future where sustainable space activities are achievable.
Keywords: space shuttle, debris, space junk, satellite, fragments, orbit
1537 Phytoplankton Assemblage and Physicochemical Parameters of a Perturbed Tropical Manmade Lake, Southwestern Nigeria
Authors: Adedolapo Ayoade, John the Beloved Dada
Abstract:
This study identified the phytoplankton assemblage of Dandaru Lake (which receives effluents from a zoological garden and a hospital) as a bioindicator of water quality. Physicochemical parameters including dissolved oxygen (DO), biochemical oxygen demand, nitrate, phosphate, and heavy metals were also determined. Samples of water and plankton were collected once monthly from April to September 2015 at five stations (I-V). The mean physicochemical parameters were within the limits of the National Environmental Standards and Regulations Enforcement Agency (NESREA) and USEPA, except lead (0.02 ± 0.08 mg/L), manganese (0.46 ± 1.00 mg/L), and zinc (0.05 ± 0.17 mg/L). Mean DO, alkalinity, and phosphate differed significantly between stations at p < 0.05. While the highest mean DO (6.88 ± 1.34 mg/L) was recorded at station I, with less anthropogenic activity, the highest phosphate concentration (0.28 ± 0.28 mg/L) occurred at station II, the entry point of wastewater from the hospital and zoological garden. The 147 phytoplankton species found in the lake belonged to six classes: Chlorophyceae (50), Euglenophyceae (40), Bacillariophyceae (37), Cyanophyceae (17), and Xanthophyceae and Chrysophyceae (3). The order of abundance was Euglenophyceae (49.77%) > Bacillariophyceae (18.00%) > Cyanophyceae (17.39%) > Chlorophyceae (13.7%) > Xanthophyceae (1.06%) > Chrysophyceae (0.02%). The stations impacted by effluents were dominated by members of the Euglenophyceae (station III, 77.09%; IV, 50.55%) and Cyanophyceae (station II, 27.7%; V, 32.57%), while station I was dominated by diatoms (57.98%). The species richness recorded was 0.32-4.49, and the evenness index was highest at station I and least at station III. Generally, pollution-tolerant species (Microcystis, Oscillatoria, Scenedesmus, Anabaena, and Euglena) showed greater density in areas impacted by human activities. The phytoplankton assemblage and the comparatively low biotic diversity of Dandaru Lake could be attributed to perturbations in the water column that exerted selective effects on the biological assemblage.
Keywords: manmade lake, Nigeria, phytoplankton, water quality
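As a note on the diversity metrics reported above, here is a short sketch computing the Shannon-Wiener index and Pielou's evenness (assuming that is the evenness index used) from per-station species counts; the counts below are invented for illustration.

```python
# Sketch of the diversity metrics reported above: Shannon-Wiener H' and
# Pielou's evenness J' (assuming that is the evenness index used), from
# per-station species counts. The counts below are invented.
import numpy as np

def shannon_evenness(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    h = -(p * np.log(p)).sum()      # Shannon-Wiener H'
    j = h / np.log(len(p))          # Pielou's J', in [0, 1]
    return h, j

print(shannon_evenness([120, 45, 30, 8, 2]))
```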