Search results for: smart software tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9204

7554 In silico Analysis of Isoniazid Resistance in Mycobacterium tuberculosis

Authors: A. Nusrath Unissa, Sameer Hassan, Luke Elizabeth Hanna

Abstract:

Altered drug binding, rather than major changes in the enzyme’s catalase or peroxidase (KatG) activity, may be an important factor in isoniazid (INH) resistance. The identification of structural or functional defects in the mutant KatGs responsible for INH resistance remains an area to be explored. In this connection, the differences in binding affinity between wild-type (WT) and mutant KatG were investigated through the generation of three mutant models of KatG (Ser315Thr [S315T], Ser315Asn [S315N], and Ser315Arg [S315R]) and a WT model [S315] with the MODELLER software. The models were docked with INH using the GOLD software. The binding affinity was lower for the WT than for the mutants, suggesting tighter binding of INH to the mutant proteins than to the WT. These models provide in silico evidence for the binding interaction of KatG with INH and implicate the basis for rationalization of INH resistance in naturally occurring KatG mutant strains of Mycobacterium tuberculosis.
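The WT-versus-mutant affinity comparison can be illustrated with a small script. All score values below are hypothetical placeholders (the abstract does not report numeric GOLD fitness scores); the sketch only shows the comparison logic, where a higher fitness score denotes tighter INH binding.

```python
# Hypothetical sketch: comparing GOLD-style docking fitness scores for the
# WT and mutant KatG models. Score values are illustrative placeholders,
# not the study's actual results; higher fitness = tighter INH binding.
docking_scores = {
    "S315 (WT)": 42.1,   # placeholder fitness score
    "S315T": 55.3,       # placeholder
    "S315N": 51.7,       # placeholder
    "S315R": 49.8,       # placeholder
}

wt_score = docking_scores["S315 (WT)"]
for variant, score in sorted(docking_scores.items(), key=lambda kv: -kv[1]):
    relation = "tighter" if score > wt_score else "baseline"
    print(f"{variant:12s} fitness={score:5.1f} ({relation} binding vs WT)")
```

With these placeholder numbers, every mutant scores above the WT, mirroring the qualitative finding reported in the abstract.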

Keywords: Mycobacterium tuberculosis, KatG, INH resistance, mutants, modelling, docking

Procedia PDF Downloads 304
7553 Risk Analysis in Off-Site Construction Manufacturing in Small to Medium-Sized Projects

Authors: Atousa Khodadadyan, Ali Rostami

Abstract:

The objective of off-site construction manufacturing is to utilise the workforce and machinery in a controlled environment, without external interference, for higher productivity and quality. The use of prefabricated components can save up to 14% of total energy consumption in comparison with an equivalent number of cast-in-place components. Despite the benefits of prefabrication, current project practices encompass technical and managerial issues. Building design, precast component production, logistics, and prefabrication installation processes are still mostly discontinuous and fragmented. Furthermore, collaboration among prefabrication manufacturers, transportation parties, and on-site assemblers relies on real-time information such as the status of precast components, delivery progress, and the location of components. From the technical point of view, geometric variability is still prevalent in this industry and can arise during the transportation or production of components. These issues indicate that there are still many aspects of prefabricated construction that can be improved using disruptive technologies. Practical real-time risk analysis can be used to address these issues, as well as the management of safety, quality, and construction environment concerns. On the other hand, the lack of research on risk assessment and the absence of standards and tools hinder risk management modeling in prefabricated construction. It is essential to note that no risk management standard has been established explicitly for prefabricated construction projects, and most software packages do not provide tailor-made functions for this type of project.

Keywords: project risk management, risk analysis, risk modelling, prefabricated construction projects

Procedia PDF Downloads 155
7552 Protection of Cultural Heritage against the Effects of Climate Change Using Autonomous Aerial Systems Combined with Automated Decision Support

Authors: Artur Krukowski, Emmanouela Vogiatzaki

Abstract:

The article presents ongoing work in research projects such as SCAN4RECO and ARCH, both funded by the European Commission under the Horizon 2020 program. The former concerns multimodal and multispectral scanning of Cultural Heritage assets for their digitization and conservation via spatiotemporal reconstruction and 3D printing, while the latter aims to better preserve areas of cultural heritage from hazards and risks. It co-creates tools that help pilot cities save cultural heritage from the effects of climate change, and it develops a disaster risk management framework for assessing and improving the resilience of historic areas to climate change and natural hazards. Tools and methodologies are designed for local authorities and practitioners, the urban population, and national and international expert communities, aiding authorities in knowledge-aware decision making. In this article, we focus on 3D modelling of object geometry, using primarily photogrammetric methods to achieve very high model accuracy with consumer-grade devices, attractive to professionals and hobbyists alike.

Keywords: 3D modelling, UAS, cultural heritage, preservation

Procedia PDF Downloads 107
7551 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal

Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha

Abstract:

Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry. Industry engineers continually improve upon PDC drill bit designs and hydraulic conditions. Optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit based on field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations, and post-processing. The results from the OpenFOAM solver are then compared with those from the ANSYS Fluent software, and the data from both programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, and nozzle radial position and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.
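The parametric-sweep logic described above can be sketched as a simple case driver. The `score_case` function here is a stand-in stub, not a CFD solver: in the actual workflow each configuration would be meshed and solved in OpenFOAM by the console application, and all parameter values below are invented for illustration.

```python
# Illustrative sketch of a nozzle-configuration parametric sweep: enumerate
# candidate configurations and rank them by a placeholder "cuttings-removal"
# score. In the real workflow each case would be meshed and solved in
# OpenFOAM; score_case is a purely illustrative stub.
from itertools import product

def score_case(n_nozzles, velocity, radial_pos):
    # Stub objective: favours more nozzles and higher exit velocity,
    # penalises radial positions far from mid-radius. Not physics.
    return n_nozzles * velocity - 50.0 * abs(radial_pos - 0.5)

cases = product([3, 5, 7],          # number of nozzles (assumed values)
                [80.0, 100.0],      # nozzle exit velocity, m/s (assumed)
                [0.3, 0.5, 0.7])    # normalised radial position (assumed)

best = max(cases, key=lambda c: score_case(*c))
print("Best configuration (nozzles, velocity, radial):", best)
```

Replacing the stub with a call that submits a case to the CFD solver and reads back a washing-pattern metric turns this loop into the automated study the abstract describes.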

Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit

Procedia PDF Downloads 401
7550 Design and Implementation of Smart Watch Textile Antenna for Wi-Fi Bio-Medical Applications in Millimetric Wave Band

Authors: M. G. Ghanem, A. M. M. A. Allam, Diaa E. Fawzy, Mehmet Faruk Cengiz

Abstract:

This paper is devoted to the design and implementation of a smartwatch textile antenna for Wi-Fi bio-medical applications in millimetric wave bands. The antenna is implemented on a leather textile-based substrate to be embedded in a smartwatch, enabling the watch to pick up Wi-Fi signals without being connected to a mobile phone through Bluetooth. It operates at 60 GHz, the WiGig (Wireless Gigabit Alliance) band, with a wide bandwidth for higher-rate applications. It could also be placed over stratified layers of body tissue for use in the diagnosis of diseases such as diabetes and cancer. The structure is designed and simulated using the CST Studio Suite program. The wearable patch antenna has an octagon shape and is implemented on leather, which acts as a flexible substrate with a size of 5.632 x 6.4 x 2 mm3, a relative permittivity of 2.95, and a loss tangent of 0.006. Feeding is carried out using a differential feed (discrete port in CST). The work provides five antenna implementations: an antenna without a ground plane; an antenna with a ground plane added at the back to increase the gain; an antenna with substrate dimensions increased to 15 x 30 mm2 to resemble a real watch size; an antenna with layers of skin and fat added under the ground plane to study the effect of human body tissues on the antenna performance; and, finally, the whole structure bent. It is found that the antenna can achieve a simulated peak realized gain of 5.68, 7.28, 6.15, 3.03, and 4.37 dB for the antenna without ground, with ground, with larger substrate dimensions, with skin and fat, and for the bent structure, respectively. The antenna with ground exhibits high gain, while adding the human tissue layers degrades the gain because of body absorption. The bent structure contributes to a higher gain.

Keywords: bio medical engineering, millimetric wave, smart watch, textile antennas, Wi-Fi

Procedia PDF Downloads 97
7549 Application of Lean Manufacturing Tools in Hot Asphalt Production

Authors: S. Bayona, J. Nunez, D. Paez, C. Diaz

Abstract:

The application of Lean manufacturing tools continues to be an effective solution for increasing productivity, reducing costs, and eliminating waste in the manufacture of goods and services. This article analyzes the production process of a hot asphalt manufacturing company from an administrative and technical perspective. Three main phases were analyzed: in the first, the risk priority numbers of the main operations in the asphalt mix production process were determined by an FMEA (Failure Mode and Effects Analysis); in the second, a Value Stream Mapping (VSM) of the production line was performed; and in the third, a SWOT (Strengths, Weaknesses, Opportunities, Threats) matrix was constructed. Among the highest-rated failure modes were the lack of worker training in occupational safety and health issues, the lack of signaling and classification of granulated material, and the overloading of vehicles. The analysis of the results in the three phases agrees on the importance of training operational workers, improving communication with external actors in order to minimize delays in material orders, and strengthening supplier control.
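The FMEA step above computes a risk priority number as RPN = severity x occurrence x detection. The sketch below echoes the failure modes named in the abstract, but all ratings are invented for illustration; the study's actual scores are not given.

```python
# Minimal FMEA sketch: RPN = severity * occurrence * detection (each rated
# 1-10). Failure modes echo those named in the abstract; the ratings are
# illustrative assumptions, not the study's data.
failure_modes = {
    "lack of OSH training for workers": (8, 6, 5),   # (S, O, D) assumed
    "unclassified granulated material": (6, 7, 4),
    "overloaded vehicles":              (7, 5, 6),
}

rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"RPN {value:3d}  {mode}")
```

Ranking by RPN is what lets the team prioritise which failure modes to address first.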

Keywords: asphalt, lean manufacturing, productivity, process

Procedia PDF Downloads 101
7548 Profile of Cross-Reactivity Allergens Highlighted by Multiplex Technology “Alex Microchip Technique” in the Diagnosis of Type I Hypersensitivity

Authors: Gadiri Sabiha

Abstract:

Introduction: Current allergy diagnostic tools using multiplex technology have made it possible to increase the efficiency of the search for specific IgE. This opportunity is provided by the newly developed “Alex Biochip”, consisting of a panel of 282 allergens in native and molecular form, a CCD inhibitor, and the potential for detecting cross-reactive allergens. We evaluated the performance of this technology in detecting cross-reactivity in previously explored patients. Material/Method: The sera of 39 patients presenting sensitization and polysensitization profiles were explored. The search for specific IgE was carried out with the Alex® IgE biochip, and the results were analyzed by nature and by molecular family of allergens using specific software. Results/Discussion: The analysis gave a particular profile of cross-reactive allergens: 33% for the Ole e 1 family, 31% for NPC2, 26% for storage proteins, 20% for tropomyosin, 10% for LTPs, 10% for arginine kinase, and 10% for uteroglobin. CCDs were absent in all patients. The “Ole e 1” allergen is responsible for pollen-pollen cross-allergy. The storage proteins and LTPs found are not species-specific, causing cross pollen-food allergy. The nDer p 2 of the NPC2 family is responsible for cross-reactivity between mite species. Conclusion: The cross-reactivities responsible for mixed syndromes at diagnosis in our patients were dominated by pollen-pollen and pollen-food syndromes. They allow the identification of severity factors linked to prognosis and to the best-adapted immunotherapy.

Keywords: specific IgE, allergy, cross reactivity, molecular allergens

Procedia PDF Downloads 52
7547 Comparison of the Thermal Behavior of Different Crystal Forms of Manganese(II) Oxalate

Authors: B. Donkova, M. Nedyalkova, D. Mehandjiev

Abstract:

Sparingly soluble manganese oxalate is an appropriate precursor for the preparation of nanosized manganese oxides, which have a wide range of technological applications. During the precipitation of manganese oxalate, three crystal forms can be obtained: α-MnC₂O₄.2H₂O (SG C2/c), γ-MnC₂O₄.2H₂O (SG P212121), and orthorhombic MnC₂O₄.3H₂O (SG Pcca). The thermolysis of α-MnC₂O₄.2H₂O has been extensively studied over the years, while literature data for the other two forms are quite scarce. The aim of the present communication is to highlight the influence of the initial crystal structure on the decomposition mechanism of these three forms, their magnetic properties, the structure of the anhydrous oxalates, and the nature of the obtained oxides. For the characterization of the samples, XRD, SEM, DTA, TG, DSC, nitrogen adsorption, and in situ magnetic measurements were used. The dehydration proceeds in one step for α-MnC₂O₄.2H₂O and γ-MnC₂O₄.2H₂O, and in three steps for MnC₂O₄.3H₂O. The dehydration enthalpies are 97, 149, and 132 kJ/mol, respectively, the last two reported for the first time, to the best of our knowledge. The magnetic measurements show that at room temperature all samples are antiferromagnetic; however, during the dehydration of α-MnC₂O₄.2H₂O the exchange interaction is preserved, for MnC₂O₄.3H₂O it changes to ferromagnetic above 35°C, and for γ-MnC₂O₄.2H₂O it changes twice from antiferromagnetic to ferromagnetic above 70°C. The experimental results for the magnetic properties are in accordance with the computational results obtained with the Wien2k code. The difference in the initial crystal structure of the forms determines different changes in the specific surface area during dehydration and a different extent of Mn(II) oxidation during decomposition in air, both being highest for α-MnC₂O₄.2H₂O.
The isothermal decomposition of the different oxalate forms shows that the type and physicochemical properties of the oxides obtained at the same annealing temperature depend on the precursor used. Based on the results from the non-isothermal and isothermal experiments, and from the different methods used for characterization of the samples, a comparison of the nature, mechanism, and peculiarities of the thermolysis of the different crystal forms of manganese oxalate was made, which clearly reveals the influence of the initial crystal structure. Acknowledgment: 'Science and Education for Smart Growth', project BG05M2OP001-2.009-0028, COST Action MP1306 'Modern Tools for Spectroscopy on Advanced Materials', and project DCOST-01/18 (Bulgarian Science Fund).

Keywords: crystal structure, magnetic properties, manganese oxalate, thermal behavior

Procedia PDF Downloads 155
7546 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and to meet the organizations’ objectives, the software should be in line with their business processes. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram based on a role activity diagram (RAD) model. The approach includes four steps, namely: create a business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the process of student registration at the University of Petra as a case study. Further research is required to validate the new approach using different domains.
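The four-step mapping can be sketched in code. The data structures below are a hypothetical simplification of a RAD model (the paper's formal notation is not given), and the registration activities are invented examples in the spirit of the case study.

```python
# Hedged sketch of the four steps above: from a RAD-style process model,
# keep only computerized activities (step 2), derive entities/lifelines
# (step 3), and derive sequence-diagram messages (step 4). The tuple
# format and activity names are illustrative assumptions.
rad_activities = [
    # (role, activity, computerized?, target role)
    ("Student",    "submit registration form", True,  "Registrar"),
    ("Registrar",  "verify prerequisites",     True,  "AcademicDB"),
    ("Registrar",  "mail paper receipt",       False, "Student"),
    ("AcademicDB", "confirm enrolment",        True,  "Student"),
]

computerized = [a for a in rad_activities if a[2]]                   # step 2
entities = sorted({r for a in computerized for r in (a[0], a[3])})   # step 3
messages = [(src, act, dst) for src, act, _, dst in computerized]    # step 4

print("lifelines:", entities)
for src, act, dst in messages:
    print(f"{src} -> {dst}: {act}")
```

Only computerized activities become messages, which is why the manual "mail paper receipt" activity is absent from the resulting sequence diagram.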

Keywords: business process modelling, system models, role activity diagrams, sequence diagrams

Procedia PDF Downloads 364
7545 Gender and Older People: Reframing Gender Analysis through Lifecycle Lens

Authors: Supriya Akerkar

Abstract:

The UN Decade on Healthy Ageing (2021-2030) provides a new opportunity to address ageing and gender issues in different societies. The concept of gender has been used to unpack and analyse the power dynamics and constructions of gender relations in different societies. Such analysis has been employed to inform the policies and practices of governments and non-governmental organisations in furthering gender equality in their work. Yet, the experiences of older women and men are often left out of such mainstream gender analysis, marginalising their existence and issues. This paper argues that new critical analytical tools are needed to capture the realities and issues of interest to older women and men. In particular, it argues that gender analysis needs to integrate analytical concepts of ageing and a lifecycle approach in its framework. The paper develops such a framework through critical interrogation of the gender analysis tools currently applied to frame gender issues in international development and humanitarian work. Informed by the realities and experiences of older women and men, developed through a synthesis of the available literature, the paper will develop a new framework for gender analysis that can be used by governments and non-government organisations in their work to further gender justice across the life cycle.

Keywords: ageing, gender, older people, social inclusion

Procedia PDF Downloads 221
7544 Personal Knowledge Management: Systematic Review and Future Direction

Authors: Kuribachew Gizaw Tohiye, Monica Garfield

Abstract:

Personal knowledge management is the aspect of knowledge management that relates to the way in which individuals organize and manage their own set of knowledge. Although there has been research in this area for the past 25 years, it is now timely to take stock of what research has been done and what we have discovered about this arena of knowledge management. In contrast to organizational knowledge management, which focuses on a firm’s profitability and competitiveness, personal knowledge management (PKM) is concerned with a person’s self-effectiveness, competence, and success. People are interested in managing their knowledge in order to become more efficient in a variety of personal and organizational pursuits. This study presents a systematic review of PKM studies. Articles with PKM concepts are reviewed with the objectives of clearly defining PKM, identifying the benefits of PKM, classifying the tools that enable PKM, and finding the research gaps that indicate future research directions in the area. Consequently, we have developed a definition of PKM and identified its benefits, including an understanding of who seeks PKM and for what. Tools enabling PKM are identified and classified under three categories: Web 1.0, 2.0, and 3.0. Finally, the research gap and future directions are suggested. Research that facilitates collaboration by using semantic technologies is suggested for further study to improve PKM effectiveness.

Keywords: personal knowledge management, knowledge management, organizational knowledge management, systematic review

Procedia PDF Downloads 307
7543 Effectiveness of Climate Smart Agriculture in Managing Field Stresses in Robusta Coffee

Authors: Andrew Kirabira

Abstract:

This study investigates the effectiveness of climate-smart agriculture (CSA) technologies in improving productivity by managing biotic and abiotic stresses in the coffee agroecological zones of Uganda, with the motive of enhancing farmer livelihoods. The study was initiated as a result of the decreasing productivity of the crop in Uganda, caused by the increasing prevalence of pests, diseases, and abiotic stresses. Despite 9 years of farmers' application of CSA, productivity has stagnated between 700 and 800 kg/ha/yr, which is only 26% of the 3-5 t/ha/yr that CSA is capable of delivering if properly applied. This has negatively affected the incomes of the 10.6 million people along the crop's value chain and, in turn, the country's national income. In the 2019/20 FY, for example, Uganda suffered a deficit of $40m solely from the increasing incidence of one pest, BCTB. The amalgamation of such trends cripples the realization of SDG #1 and SDG #13, the eradication of poverty and the mitigation of climate change, respectively. In probing CSA's effectiveness in curbing such a trend, this study is guided by the objectives of: determining coffee farmers' existing knowledge and perceptions of CSA in the diverse coffee agroecological zones of Uganda; examining the relationship between the use of CSA and the prevalence of selected coffee pests, diseases, and abiotic stresses; ascertaining the differences in market organization and pricing between conventionally and CSA-produced coffee; and analyzing the prevailing policy environment concerning the use of CSA in coffee production. The research design for data collection is descriptive, collecting data from farmers and agricultural extension workers in the districts of Ntungamo, Iganga, and Luweero, each representing a distinct coffee agroecological zone. Policy custodian officers at the district and cooperative levels and at the crop's overseeing national authority were also interviewed.

Keywords: climate change, food security, field stresses, productivity

Procedia PDF Downloads 41
7542 The Significant Effect of Wudu’ and Zikr in the Controlling of Emotional Pressure Using Biofeedback Emwave Technique

Authors: Mohd Anuar Awang Idris, Muhammad Nubli Abdul Wahab, Nora Yusma Mohamed Yusoff

Abstract:

Wudu’ (ablution) and Zikr are among the spiritual tools that may help an individual control his mind, emotion, and attitude, and they are deemed able to deliver a positive impact on an individual’s psychophysiology. The main objective of this research is to determine the effects of Wudu’ and Zikr therapy using the biofeedback emWave application and technology. For this research, 13 students were selected as samples from the students’ representative body at Universiti Tenaga Nasional, Malaysia. The DASS (Depression Anxiety Stress Scale) questionnaire was used to assess and measure each student’s ability to control his or her emotions before and after the therapies, and the biofeedback emWave technology was utilized to monitor each student’s psychophysiological level. In addition, the data obtained from the heart rate variability (HRV) test were used to affirm that Wudu’ and Zikr had significant impacts on the students’ success in controlling emotional pressure.
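One common HRV statistic that devices of the emWave kind derive from heart-rate data is RMSSD, the root mean square of successive RR-interval differences. The sketch below computes it from made-up RR samples; the abstract does not specify which HRV metrics were used, so this is an illustrative assumption.

```python
# Illustrative HRV sketch: RMSSD computed from RR intervals (ms).
# The RR values below are invented sample data, not study measurements.
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr_before = [810, 790, 825, 800, 815]   # hypothetical pre-therapy RRs
rr_after  = [820, 845, 815, 850, 830]   # hypothetical post-therapy RRs

print(f"RMSSD before: {rmssd(rr_before):.1f} ms")
print(f"RMSSD after:  {rmssd(rr_after):.1f} ms")
```

A higher RMSSD is generally read as greater parasympathetic activity, which is the kind of shift a relaxation therapy would aim to produce.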

Keywords: biofeedback EmWave, emotion, psychophysiology, wudu’, zikr

Procedia PDF Downloads 185
7541 Analysis of a Faience Enema Found in the Assasif Tomb No. -28- of the Vizier Amenhotep Huy: Contributions to the Study of the Mummification Ritual Practiced in the Theban Necropolis

Authors: Alberto Abello Moreno-Cid

Abstract:

Mummification was the process through which immortality was granted to the deceased, so it was of extreme importance to the Egyptians. The techniques of embalming evolved over the centuries, and specialists created increasingly sophisticated tools. However, due to its eminently religious nature, knowledge about everything related to this practice was jealously guarded, and the testimonies that have survived to our time are scarce. For this reason, embalming instruments found in archaeological excavations are uncommon. The tomb of the Vizier Amenhotep Huy (AT No. -28-), located in the el-Assasif necropolis and excavated since 2009 by the team of the Institute of Ancient Egyptian Studies, has been the scene of discoveries of this type, evidencing the existence of mummification practices in this place after the New Kingdom. Clysters, or enemas, are the fundamental tools in the second type of mummification described by the historian Herodotus, used to introduce caustic solutions into the body of the deceased. Nevertheless, such objects have been found in only three locations: the tomb of Ankh-Hor in Luxor, where a copper enema belonging to the prophet of Ammon Uah-ib-Ra came to light; the excavation of the tomb of Menekh-ib-Nekau in Abusir, where another made of copper was found; and the excavations in the Bucheum, where two more artifacts were discovered, also made of copper but in different shapes and sizes. The latter two were used for the mummification of sacred animals, which is why they vary significantly. Therefore, the object found in tomb No. -28- is the first of these peculiar tools known to be made of faience and the oldest known to date, dated to the Third Intermediate Period (circa 1070-650 B.C.). This paper bases its investigation on the study of those parallels, the material, the current archaeological context, and a full analysis and reconstruction of the object in question. 
The key point is the use of faience in the production of this item: creating a device intended for constant use in faience seems at first illogical compared to the other examples made of copper. Around the area of Deir el-Bahari, faience had a strong religious component, associated with solar myths and principles of resurrection, connected to the Osirian character of the mummification procedure. The study makes it possible to refute some premises held as unalterable in Egyptology, verifying the use of these sorts of pieces, understanding how they were used, and showing that this type of mummification was also applied to the highest social stratum, for which the tools were conceived with exceptional quality and religious symbolism.

Keywords: clyster, el-Assasif, embalming, faience enema, mummification, Theban necropolis

Procedia PDF Downloads 93
7540 AI-Assisted Business Chinese Writing: Comparing the Textual Performances Between Independent Writing and Collaborative Writing

Authors: Stephanie Liu Lu

Abstract:

With the proliferation of artificial intelligence tools in the field of education, it is crucial to explore their impact on language learning outcomes. This paper examines the use of AI tools, such as ChatGPT, in practical writing within business Chinese teaching to investigate how AI can enhance practical writing skills and teaching effectiveness. The study involved third and fourth-year university students majoring in accounting and finance from a university in Hong Kong within the context of a business correspondence writing class. Students were randomly assigned to a control group, who completed business letter writing independently, and an experimental group, who completed the writing with the assistance of AI. In the latter, the AI-assisted business letters were initially drafted by the students issuing commands and interacting with the AI tool, followed by the students' revisions of the draft. The paper assesses the performance of both groups in terms of grammatical expression, communicative effect, and situational awareness. Additionally, the study collected dialogue texts from interactions between students and the AI tool to explore factors that affect text generation and the potential impact of AI on enhancing students' communicative and identity awareness. By collecting and comparing textual performances, it was found that students assisted by AI showed better situational awareness, as well as more skilled organization and grammar. However, the research also revealed that AI-generated articles frequently lacked a proper balance of identity and writing purpose due to limitations in students' communicative awareness and expression during the instruction and interaction process. Furthermore, the revision of drafts also tested the students' linguistic foundation, logical thinking abilities, and practical workplace experience. Therefore, integrating AI tools and related teaching into the curriculum is key to the future of business Chinese teaching.

Keywords: AI-assistance, business Chinese, textual analysis, language education

Procedia PDF Downloads 41
7539 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of the enterprise is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under the careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, including process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of the production scheduling software system using process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing, and debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems.
Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally the goodness of the production schedule is evaluated. With the proposed process mining approach for evaluating the performance of generated production schedules, the quality of production schedules in manufacturing enterprises can be improved.
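One of the evaluation criteria named above, workstation utilization, can be measured directly from event logs. The log format below (start/end timestamps per workstation within one shift) is a hypothetical simplification; real scheduling-system logs carry richer case and activity attributes.

```python
# Sketch: workstation utilization from a scheduling-system event log.
# The tuple log format and the 90% bottleneck threshold are illustrative
# assumptions, not the study's definitions.
from collections import defaultdict

# (workstation, start_minute, end_minute) within one 480-minute shift
event_log = [
    ("WS1", 0, 120), ("WS1", 150, 400),
    ("WS2", 0, 460),
    ("WS3", 200, 260),
]
SHIFT_MINUTES = 480

busy = defaultdict(int)
for ws, start, end in event_log:
    busy[ws] += end - start          # accumulate busy time per workstation

for ws in sorted(busy):
    util = busy[ws] / SHIFT_MINUTES
    flag = "  <- possible bottleneck" if util > 0.9 else ""
    print(f"{ws}: utilization {util:.0%}{flag}")
```

A workstation whose utilization approaches 100% while others sit idle is exactly the kind of imbalance the proposed evaluation is meant to surface.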

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 263
7538 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential to building smarter cities, and machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions, so bus delays often occur. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built from real-time bus data gathered through public data portals and a real-time Application Program Interface (API) provided by the government. These data are fundamental resources for organizing interval pattern models of bus operations with traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase the prediction accuracy of bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
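The delay-prediction idea can be illustrated with a toy model that learns the average delay per (traffic, weather) bucket from historical records. All data rows are invented, and a real model (such as the RapidMiner-built one in the study) would use far richer features than this two-feature lookup.

```python
# Toy sketch of bus-delay prediction: average observed delay per
# (traffic, weather) condition bucket, with a global-mean fallback for
# unseen conditions. Data rows are invented examples.
from collections import defaultdict

# (traffic_level, weather, observed_delay_minutes)
history = [
    ("heavy", "rain", 12), ("heavy", "rain", 9),
    ("heavy", "clear", 6),
    ("light", "rain", 4),
    ("light", "clear", 1), ("light", "clear", 2),
]

sums, counts = defaultdict(float), defaultdict(int)
for traffic, weather, delay in history:
    sums[(traffic, weather)] += delay
    counts[(traffic, weather)] += 1

def predict_delay(traffic, weather):
    key = (traffic, weather)
    if key not in counts:                       # unseen bucket: global mean
        return sum(d for *_, d in history) / len(history)
    return sums[key] / counts[key]

print("heavy/rain ->", predict_delay("heavy", "rain"), "min")
```

Swapping the bucket averages for a trained regressor over the traffic, weather, and bus-status features is the step the prototype model takes.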

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 238
7537 Design and Implementation of a Wireless System Using Microcontrollers: Application to a Data Acquisition System with Multiple Sensors

Authors: H. Fekhar

Abstract:

The design and implementation of an acquisition system using a radio frequency (RF) ASK module and PIC microcontrollers is proposed in this work. The paper covers both hardware and software design, divided into two units: the sender MCU and the receiver. The system was designed to measure the temperatures of two furnaces and the pressure of a pneumatic process. The wireless transmitter unit uses the 433.95 MHz band and is directly interfaced to a PIC18F4620 microcontroller. The sender unit consists of temperature and pressure sensors, conditioning circuits, a keypad, a GLCD display, and the RF module. The signal conditioner converts the output of the sensors into an electrical quantity suitable for the display and recording system. The measurement circuits are connected directly to a 10-bit multiplexed A/D converter, and a graphic liquid crystal display (GLCD) is used. The receiver RF module, connected to a second microcontroller, receives the signal via the RF receiver, decodes the address and data, and reproduces the original data. The strategy adopted for establishing communication between the sender MCU and the receiver uses a specific "Header, Address, Data" protocol. The communication protocol handling transmission and reception has been successfully implemented, and experimental results are provided to demonstrate the effectiveness of the proposed wireless system. This embedded system tracks the temperature and pressure signals reasonably well, with a small error.
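
The "Header, Address, Data" framing described above can be sketched as a simple encode/decode pair. The concrete byte values (0xAA header, single address byte, XOR checksum) are assumptions for illustration, not the paper's actual firmware format.

```python
# Illustrative frame layout: [HEADER][address][length][payload...][checksum]
# where the checksum is the XOR of address and payload bytes (an assumption).
HEADER = 0xAA

def encode_frame(address: int, payload: bytes) -> bytes:
    checksum = address
    for b in payload:
        checksum ^= b
    return bytes([HEADER, address, len(payload)]) + payload + bytes([checksum & 0xFF])

def decode_frame(frame: bytes):
    """Return (address, payload) or None if the frame is malformed."""
    if len(frame) < 4 or frame[0] != HEADER:
        return None
    address, length = frame[1], frame[2]
    payload = frame[3:3 + length]
    if len(payload) != length:
        return None
    checksum = address
    for b in payload:
        checksum ^= b
    if frame[3 + length] != checksum:
        return None
    return address, payload

frame = encode_frame(0x01, bytes([120, 45]))   # e.g. a temperature and pressure reading
decoded = decode_frame(frame)
```

The checksum lets the receiver silently drop corrupted RF frames instead of reproducing bad sensor data.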

Keywords: microcontrollers, sensors, graphic liquid crystal display, protocol, temperature, pressure

Procedia PDF Downloads 444
7536 Improving Fingerprinting-Based Localization System Using Generative AI

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities, including traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight propagation, multipath, and weather conditions, GNSS does not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization, together with a reliable signal fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the site-surveying workload required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
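
For context, the classical fingerprinting baseline that such radio maps feed into can be sketched as weighted k-NN matching of a measured signal vector against surveyed reference points; the access points, coordinates, and RSSI values below are invented for illustration.

```python
# Weighted k-NN fingerprint localization: find the k surveyed points whose
# RSSI fingerprints are closest to the measurement, then average their
# coordinates weighted by inverse signal distance.
import math

radio_map = {  # (x, y) in metres -> RSSI fingerprint [AP1, AP2, AP3] in dBm
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-55, -60, -75],
    (0.0, 5.0): [-60, -75, -55],
    (5.0, 5.0): [-70, -65, -50],
}

def locate(measured, k=2):
    ranked = sorted(radio_map.items(),
                    key=lambda kv: math.dist(kv[1], measured))[:k]
    weights = [1.0 / (1e-6 + math.dist(fp, measured)) for _, fp in ranked]
    total = sum(weights)
    x = sum(w * pos[0] for (pos, _), w in zip(ranked, weights)) / total
    y = sum(w * pos[1] for (pos, _), w in zip(ranked, weights)) / total
    return x, y

est = locate([-42, -69, -79])  # a measurement taken close to the (0, 0) survey point
```

The GAN-based approach in the paper addresses the cost of surveying this radio map densely enough for good accuracy.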

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 34
7535 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs, where each vertex represents measured data and each edge represents a relationship (connectivity, affinity, or interaction) between two vertices. Processing and analyzing these types of data is a major challenge for both the image processing and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools that were initially developed on usual Euclidean spaces and have proven efficient for many inverse problems and applications on usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years, there has been increasing interest in Partial Differential Equations (PDEs) and variational methods on graphs, which are among the major mathematical tools for signal and image analysis. The normalized p-Laplacian operator was recently introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operators, which have been used extensively to model and solve problems in image processing. The purpose of this paper is to introduce and study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as discrete approximations of both the infinity-Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
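
A hedged sketch of the p-harmonious iteration on a graph: each interior vertex is repeatedly replaced by a convex combination of the max/min and the mean of its neighbours, which interpolates between the infinity-Laplacian (alpha = 1) and the graph Laplacian (alpha = 0). The exact weighting in the paper may differ; the graph and boundary values below are illustrative.

```python
# Assumed update rule:
#   u(v) <- (alpha/2) * (max_u + min_u over neighbours) + (1 - alpha) * mean_u
# with Dirichlet boundary vertices held fixed.
def p_harmonious(adj, boundary, alpha=0.5, iters=500):
    u = {v: boundary.get(v, 0.0) for v in adj}
    for _ in range(iters):
        for v in adj:
            if v in boundary:        # boundary condition: keep fixed
                continue
            nbrs = [u[w] for w in adj[v]]
            u[v] = (alpha / 2) * (max(nbrs) + min(nbrs)) \
                 + (1 - alpha) * sum(nbrs) / len(nbrs)
    return u

# Path graph 0-1-2-3 with u(0)=0 and u(3)=1: the interior values interpolate.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
u = p_harmonious(adj, {0: 0.0, 3: 1.0})
```

On this path graph every vertex has two neighbours, so the update reduces to simple averaging and the solution converges to the linear interpolant (1/3 and 2/3) for any alpha; the three operators only differ on graphs with richer neighbourhoods.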

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 493
7534 Development of a Sustainable Municipal Solid Waste Management for an Urban Area: Case Study from a Developing Country

Authors: Anil Kumar Gupta, Dronadula Venkata Sai Praneeth, Brajesh Dubey, Arundhuti Devi, Suravi Kalita, Khanindra Sharma

Abstract:

Increases in urbanization and industrialization have improved the standard of living, but at the same time the challenges caused by improper solid waste management are also increasing. Municipal solid waste management is considered a vital step in the development of urban infrastructure. The present study focuses on developing a solid waste management plan for an urban area in a developing country. The current solid waste management practices of various urban bodies in India are summarized. Guwahati city, in the northeastern part of the country and one of the targeted smart cities (under the government's Smart Cities program), was chosen as the case study for developing and implementing the solid waste management plan. The city was divided into divisions, waste samples were collected according to American Society for Testing and Materials (ASTM) D5231-92 (2016) for each division, and a composite sample was prepared to represent the waste of the entire city. The solid waste was characterized physically and chemically, mainly through proximate and ultimate analyses. Existing primary and secondary collection systems were studied, and possibilities for enhancing them were discussed. The composition of the city's solid waste was found to be: organic matter 38%, plastic 27%, paper and cardboard 15%, textile 9%, inert material 7%, and others 4%. During the conference presentation, further characterization results in terms of thermogravimetric analysis (TGA), pH, and water holding capacity will be discussed. Waste management options optimizing recycling, recovery, reuse, and reduction will be presented and discussed.
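
The composition percentages above come from the standard sort-and-weigh calculation, in which each ASTM D5231 fraction is reported as a percent of total sample weight; the sample masses below are invented merely to reproduce the reported shares.

```python
# Sorted-fraction masses in kg (illustrative values, not the study's raw data).
sorted_masses = {
    "organic": 38.0, "plastic": 27.0, "paper_cardboard": 15.0,
    "textile": 9.0, "inert": 7.0, "others": 4.0,
}
total = sum(sorted_masses.values())
composition = {k: round(100.0 * m / total, 1) for k, m in sorted_masses.items()}
```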

Keywords: proximate, recycling, thermal gravimetric analysis (TGA), solid waste management

Procedia PDF Downloads 167
7533 Methods Used to Perform Requirements Elicitation for FinTech Application Development

Authors: Zhao Pengcheng, Yin Siyuan

Abstract:

Fintech is a hot topic of the 21st century: a discipline that combines financial theory with computer modelling. It can provide digital analysis methods for investment banks as well as investment decisions for users. Given the variety of services available, a superior requirements elicitation method is needed to ensure that users' needs are addressed in the software development process. Because the accuracy of traditional software requirements elicitation methods is not sufficient, this study applies a multi-perspective requirements elicitation framework. Methods such as combined interviews and questionnaires, card sorting, and model-driven elicitation are proposed. The results, analyzed with PCA, show that the new methods better support requirements elicitation, although the approach has some limitations and efficiency issues. Nevertheless, the research in this paper provides a useful theoretical extension that can offer researchers new methods and perspectives.
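
A minimal sketch of the PCA step used to analyse the collected scores, implemented here with plain power iteration; the survey data and method labels are invented for illustration.

```python
# First principal component of mean-centred survey scores via power iteration.
def pca_first_component(rows):
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    X = [[r[j] - means[j] for j in range(d)] for r in rows]
    # Sample covariance matrix of the centred data.
    cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(100):                       # power iteration
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Likert scores per respondent for the three proposed elicitation methods:
# (interview + questionnaire, card sorting, model-driven) -- invented data.
scores = [[5, 3, 4], [4, 2, 4], [5, 3, 5], [3, 2, 3], [4, 3, 4]]
pc1 = pca_first_component(scores)
```

The loadings of the first component indicate which elicitation methods vary together across respondents, which is the kind of evidence the study draws from its PCA.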

Keywords: requirement elicitation, FinTech, mobile application, survey, interview, model-driven

Procedia PDF Downloads 90
7532 Optimization of Lean Methodologies in the Textile Industry Using Design of Experiments

Authors: Ahmad Yame, Ahad Ali, Badih Jawad, Daw Al-Werfalli Mohamed Nasser, Sabah Abro

Abstract:

Industries in general generate a lot of waste. The wool textile company in Baniwalid, Libya has many complex problems that have led to enormous waste due to a lack of lean strategies, expertise, technical support, and commitment. To successfully address this waste, this study attempts to develop a methodical approach that integrates lean manufacturing tools to optimize performance characteristics such as lead time and delivery. The methodology uses Value Stream Mapping (VSM) techniques to identify the process variables that affect production. Once these variables are identified, Design of Experiments (DOE) methodology is used to determine the significantly influential process variables; these variables are then controlled and set at their optimal values to achieve optimal levels of productivity, quality, agility, efficiency, and delivery, and the outputs of the simulation model are analyzed for different lean configurations. The goal of this research is to investigate how the tools of lean manufacturing can be adapted from the discrete to the continuous manufacturing environment and to evaluate their benefits at a specific industrial site.
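
The DOE screening step can be sketched as a two-level full-factorial design with main-effect estimation; the factor names and simulated lead-time responses below are invented, not the company's data.

```python
# Two-level full factorial: every combination of low (-1) / high (+1) settings
# for each factor, one response per run; main effect = avg(high) - avg(low).
from itertools import product

factors = ["batch_size", "changeover", "inspection"]
runs = list(product([-1, 1], repeat=len(factors)))
lead_time = [40, 55, 38, 52, 33, 47, 31, 45]   # hours, one response per run

def main_effect(factor_idx):
    hi = [y for run, y in zip(runs, lead_time) if run[factor_idx] == 1]
    lo = [y for run, y in zip(runs, lead_time) if run[factor_idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {f: main_effect(i) for i, f in enumerate(factors)}
```

Factors with large-magnitude effects are the "significantly influential" variables the study would then fix at their best levels.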

Keywords: lean manufacturing, DOE, value stream mapping, textiles

Procedia PDF Downloads 436
7531 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contain enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights the strengths and weaknesses of the presented technique and offers a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. It further describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta), and users feed data expansion parameters to fill out a large feature set for the model, which can contain 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed so that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities of data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a road map for future development in FreqAI.
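
The outlier-removal idea mentioned above can be sketched as a simple per-feature z-score test against the training data's parameter space; FreqAI's actual checks may differ, and the data below are invented.

```python
# Flag prediction points that fall outside the feature space spanned by the
# training data (here: any feature more than z standard deviations from its
# training mean). An assumed, simplified stand-in for the real checks.
import statistics

def fit_bounds(train_rows, z=3.0):
    cols = list(zip(*train_rows))
    return [(statistics.mean(c), statistics.stdev(c), z) for c in cols]

def is_outlier(point, bounds):
    return any(abs(x - mu) > z * sd for x, (mu, sd, z) in zip(point, bounds))

train = [[1.0, 10.0], [1.2, 11.0], [0.9, 9.5], [1.1, 10.5], [1.0, 10.2]]
bounds = fit_bounds(train)
```

Predictions on points flagged this way are discarded rather than traded on, since the model has never seen that region of the parameter space.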

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 72
7530 Electric Field Analysis of XLPE (Cross-Linked Polyethylene) Covered Aerial Lines and Insulator Lashing

Authors: Jyh-Cherng Gu, Ming-Ta Yang, Dai-Ling Tsai

Abstract:

Both sparse lashing and dense lashing are applied to secure overhead XLPE (cross-linked polyethylene) covered power lines on ceramic or HDPE polymer insulators. This paper studies the distribution of the electric field in and among the lashing wires, the XLPE power lines, and the insulators, both in the normal clean condition and when conducting materials such as salt, metal particles, dust, smoke, or acidic smog are present. The ANSYS Maxwell commercial software is used for the electric field analysis. Although the simulation assumes ideal conditions due to the constraints of the software, so that the results may differ from the real situation, they are still of sufficient practical value.

Keywords: electric field intensity, insulator, XLPE covered aerial line, empty

Procedia PDF Downloads 252
7529 Model-Based Software Regression Test Suite Reduction

Authors: Shiwei Deng, Yang Bao

Abstract:

In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and the probability-driven greedy algorithm then selects, from the reduced suite, the minimum set of test cases that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
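
The probability-driven greedy selection can be sketched as weighted set cover over interaction patterns: repeatedly pick the test case whose probability-weighted coverage of the still-uncovered patterns is largest. The test cases and pattern probabilities below are invented.

```python
def reduce_suite(tests, pattern_prob):
    """tests: {name: set(patterns covered)}; pattern_prob: {pattern: probability}."""
    uncovered = set(pattern_prob)
    selected = []
    while uncovered:
        # Greedy step: maximize probability-weighted marginal coverage.
        best = max(tests, key=lambda t: sum(pattern_prob[p]
                                            for p in tests[t] & uncovered))
        gain = tests[best] & uncovered
        if not gain:
            break                      # remaining patterns are uncoverable
        selected.append(best)
        uncovered -= gain
    return selected

tests = {"t1": {"a", "b"}, "t2": {"b", "c"}, "t3": {"c"}, "t4": {"a", "b", "c"}}
probs = {"a": 0.5, "b": 0.3, "c": 0.2}
suite = reduce_suite(tests, probs)
```

Here a single test covers every pattern, so the reduced suite collapses to one case; on real suites the weighting steers the greedy choice toward tests exercising the most probable interaction patterns first.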

Keywords: dependence analysis, EFSM model, greedy algorithm, regression test

Procedia PDF Downloads 409
7528 Morphology and Permeability of Biomimetic Cellulose Triacetate-Impregnated Membranes: in situ Synchrotron Imaging and Experimental Studies

Authors: Amira Abdelrasoul

Abstract:

This study aimed to ascertain the controlled permeability of biomimetic cellulose triacetate (CTA) membranes by investigating the electrical oscillatory behavior across impregnated membranes (IMs). The biomimetic CTA membranes were infused with a fatty acid to induce electrical oscillatory behavior and hence ensure controlled permeability. In situ synchrotron radiation micro-computed tomography (SR-μCT) at the BioMedical Imaging and Therapy (BMIT) beamline at the Canadian Light Source (CLS) was used to compare the morphology of the IMs with that of neat CTA membranes and to confirm fatty acid impregnation inside the pores of the membrane matrices. A monochromatic beam at 20 keV was used to visualize the membrane morphology. The X-ray radiographs were recorded with a beam monitor AA-40 (500 μm LuAG scintillator, Hamamatsu, Japan) coupled to a high-resolution camera, providing a pixel size of 5.5 μm and a field of view (FOV) of 4.4 mm × 2.2 mm. Changes were evident in the phase transition temperatures of the impregnated CTA membrane at the melting temperature of the fatty acid. The pulsations of the measured voltages were related to changes in the KCl salt concentration in the vicinity of the electrode, and the amplitudes and frequencies of the voltage pulsations depended on the temperature and concentration of the KCl solution, which controlled the permeability of the biomimetic membranes. The presented smart biomimetic membrane, which successfully combines a porous polymer support with an impregnating liquid, not only imitates the main barrier properties of biological membranes but can also be easily modified to achieve new properties such as facilitated and active transport and regulation by chemical, physical, and pharmaceutical factors. These results open new frontiers for the facilitation and regulation of active transport and permeability through smart biomimetic membranes for a variety of biomedical and drug delivery applications.

Keywords: biomimetic, membrane, synchrotron, permeability, morphology

Procedia PDF Downloads 84
7527 The Effects of Information Technology in Urban Health

Authors: Safdari Reza, Zahmatkeshan Maryam, Goli Arji

Abstract:

Background and Aim: Urban health is one of the challenges of the 21st century. Rapid growth and expanding urbanization have implications for health. In this regard, information technology can solve a large number of modern cities' problems. Therefore, the present article aims to study modern information technologies in the development of urban health. Materials and Methods: This is a review article based on library research and Internet searches on valid websites such as Science Direct, Magiran, and Springer, together with advanced searches in Google. Some 164 domestic and foreign texts were studied on topics such as the application of ICT tools, including cell phones and wireless tools, GIS, and RFID, in the field of urban health in 2011; finally, 30 sources were used. Conclusion: Information and communication technologies play an important role in improving people's health and enhancing the quality of their lives. Effective utilization of information and communication technologies requires identifying opportunities and constraints and formulating appropriate planning principles with regard to social and economic factors, together with preparing the technological, communication and telecommunication, legal, and administrative infrastructures.

Keywords: urban health, information technology, information and communication technology

Procedia PDF Downloads 441
7526 Usage of Military Continuity Management System for Flooding Solution

Authors: Jiri Palecek, Radmila Hajkova, Alena Oulehlova, Hana Malachova

Abstract:

The increase in emergency incidents and crisis situations requires proactive crisis management by the responsible authorities. Business continuity management systems help crisis management authorities react to events quickly and responsibly and plan their forces and resources more effectively and efficiently. The main goal of this article is to describe a Military Continuity Management System (MCMS), based on the principles of a Business Continuity Management System (BCMS), for dealing with floods in the territory of selected municipalities. The steps of loading, running, and evaluating activities in the MCMS software application are explained. The MCMS software provides complete control over the tasks and contributes a comprehensive and responsible approach to solving floods in a municipality.
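
One kind of rule such a flood plan automates is the declaration of flood activity levels from river gauge readings, which then trigger the planned activities for that level; the thresholds below are invented for illustration.

```python
# Map a gauge stage (cm) to a flood activity level; thresholds are assumed,
# real plans define them per gauge station. Checked from highest to lowest.
THRESHOLDS = [(150, 3), (120, 2), (90, 1)]   # stage in cm -> flood activity level

def flood_activity_level(stage_cm):
    for limit, level in THRESHOLDS:
        if stage_cm >= limit:
            return level
    return 0   # normal state: no flood activity declared

level = flood_activity_level(130)
```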

Keywords: business continuity management, floods plan, flood activity, level of flood activity

Procedia PDF Downloads 261
7525 An Efficient Hardware/Software Workflow for Multi-Cores Simulink Applications

Authors: Asma Rebaya, Kaouther Gasmi, Imen Amari, Salem Hasnaoui

Abstract:

Over the last years, applications such as telecommunications, signal processing, and digital communication with advanced features (multi-antenna, equalization, etc.) have witnessed a rapid evolution accompanied by increasing user requirements in terms of latency, computational power, and so on. To satisfy these requirements, hardware/software systems are a common solution, where the hardware is composed of multiple cores and the software is represented by models of computation such as the synchronous data flow (SDF) graph. Moreover, most embedded system designers use Simulink for modeling. The issue is how to simplify the generation of C code, for a multi-core platform, from an application modeled in Simulink. To overcome this problem, we propose a workflow that automatically transforms the Simulink model into an SDF graph and provides an efficient schedule that optimizes the number of cores and minimizes latency. This workflow starts from a Simulink application and a hardware architecture described in the IP-XACT language. Based on the synchronous and hierarchical behavior of both models, the Simulink block diagram is automatically transformed into an SDF graph. Once this process is successfully achieved, the scheduler calculates the optimal number of cores by minimizing the maximum density of the whole application. A core is then chosen to execute each graph task in a specific order, and a compatible C code is subsequently generated. To realize this proposal, we extend Preesm, a rapid prototyping tool, to take the Simulink model as input and to support the optimal schedule. Afterward, we compared our results with those of Preesm using a simple illustrative application. The comparison shows that our results strictly dominate the Preesm results in terms of number of cores and latency: where Preesm needs m processors and latency L, our workflow needs fewer processors and a latency L' < L.
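
The scheduler's core-count estimate can be sketched via the classical density lower bound: the minimum number of cores is at least the total work divided by the deadline (and each individual task must also fit within the deadline). The task execution times below are invented, not from the paper's benchmark.

```python
# Lower bound on the number of cores needed to complete a set of SDF tasks
# within one iteration deadline; an assumed simplification of the scheduler's
# "maximum density" criterion.
import math

def min_cores(task_times_ms, deadline_ms):
    longest = max(task_times_ms)
    if longest > deadline_ms:
        raise ValueError("deadline shorter than the longest task")
    return max(1, math.ceil(sum(task_times_ms) / deadline_ms))

cores = min_cores([4, 6, 2, 8, 5], deadline_ms=10)
```

A list scheduler then assigns tasks to this many cores in dependency order; the bound is tight when the tasks pack well, otherwise more cores or a longer latency may be needed.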

Keywords: hardware/software system, latency, modeling, multi-cores platform, scheduler, SDF graph, Simulink model, workflow

Procedia PDF Downloads 250