Search results for: multivariate time series data
35746 Using Deep Learning Real-Time Object Detection Convolution Neural Networks for Fast Fruit Recognition in the Tree
Authors: K. Bresilla, L. Manfrini, B. Morandi, A. Boini, G. Perulli, L. C. Grappadelli
Abstract:
Image/video processing for fruit detection in the tree using hard-coded feature extraction algorithms has shown high accuracy in recent years. While accurate, these approaches are computationally intensive and too slow for real-time systems, even with high-end hardware. This paper details the use of deep convolutional neural networks (CNNs), specifically the YOLO (You Only Look Once) algorithm with 24+2 convolution layers. Using deep-learning techniques eliminated the need to hard-code specific features for specific fruit shapes, colors and/or other attributes. The CNN was trained on more than 5000 images of apple and pear fruits on a 960-core GPU (graphical processing unit). The test set showed an accuracy of 90%. The trained model was then transferred to an embedded device (Raspberry Pi gen. 3) with a camera for greater portability. Based on the correlation between the number of fruits visible or detected in one frame and the real number of fruits on one tree, a model was created to accommodate this error rate. The processing and detection speed of the whole platform was higher than 40 frames per second, which is fast enough for any grasping/harvesting robotic arm or other real-time application.
Keywords: artificial intelligence, computer vision, deep learning, fruit recognition, harvesting robot, precision agriculture
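A minimal sketch (not from the paper; the counts below are hypothetical placeholders) of the kind of linear calibration the abstract describes, mapping the number of fruits detected in one frame to an estimate of the real number on the tree:

```python
# Minimal sketch: calibrate per-frame detection counts against true per-tree counts
# with a simple linear model. Count values are hypothetical, not the study's data.
import numpy as np

detected = np.array([34, 41, 52, 60, 75, 88])      # fruits detected by the CNN in one frame
actual = np.array([70, 85, 102, 118, 150, 171])    # fruits counted manually on the tree

# Least-squares fit: actual ~= a * detected + b
a, b = np.polyfit(detected, actual, deg=1)

def estimate_tree_count(frame_detections: int) -> float:
    """Correct a per-frame detection count to an estimated per-tree count."""
    return a * frame_detections + b

print(f"slope={a:.2f}, intercept={b:.2f}")
print("estimated fruits on tree:", round(estimate_tree_count(58)))
```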
Procedia PDF Downloads 420
35745 HIV and AIDS in Kosovo, Stigma Persist!
Authors: Luljeta Gashi, Naser Ramadani, Zana Deva, Dafina Gexha-Bunjaku
Abstract:
The official HIV/AIDS data in Kosovo are based on HIV case reporting from health-care services, the blood transfusion system and Voluntary Counselling and Testing centres. Between 1986 and 2014, 95 HIV and AIDS cases were reported, of which 49 were AIDS, 46 HIV, and 40 resulted in death. The majority (69%) of cases were men, the most affected age group was 25 to 34 (37%), and the routes of transmission were: heterosexual (90%), MSM (7%), vertical transmission (2%) and IDU (1%). Based on existing data and the UNAIDS classification system, Kosovo is currently still categorised as having a low-level HIV epidemic. Even with a low HIV prevalence, Kosovo faces a number of threatening factors, including an increased number of drug users, a stigmatized and discriminated MSM community, and a high percentage of youth in the general population (57% of the population under the age of 25), with changing social norms, especially sexual ones. Methods: Data collection was done using self-administered structured questionnaires amongst 249 high school students. Data were analysed using the Statistical Package for the Social Sciences (SPSS). Results: The findings revealed that 68% of students know that HIV transmission can be reduced by having sex with only one uninfected partner who has no other partners, 94% know that the risk of getting HIV can be reduced by using a condom every time they have sex, 68% know that a person cannot get HIV from mosquito bites, 81% know that they cannot get HIV by sharing food with someone who is infected, and 46% know that a healthy-looking person can have HIV. Conclusions: Seventy-one percent of high school students correctly identify ways of preventing the sexual transmission of HIV and reject the major misconceptions about HIV transmission. The findings of the study indicate a need for more health education and promotion.
Keywords: Kosovo, KPAR, HIV, high school
Procedia PDF Downloads 477
35744 Assessment of Association Between Microalbuminuria and Lung Function Test Among the Community of Jimma Town
Authors: Diriba Dereje
Abstract:
Background: Cardiac and renal diseases are among the most prevalent chronic non-communicable diseases (CNCDs) affecting the community in a significant manner. The best and recommended way of halting CNCDs is to work on prevention as early as possible, which is only possible if early surrogate markers are identified. As part of the stated solution, this study identifies an association between microalbuminuria (an early surrogate marker of renal and cardiac disease) and lung function test results among adults in the community. Objective: The main aim of this study was to assess the association between microalbuminuria (an early surrogate marker of renal and cardiac disease) and lung function test results among adults in the community. Methodology: A community-based cross-sectional study was conducted among 384 adults in Jimma town. A systematic sampling technique was used in selecting participants for the study. In searching for the possible association, binary and multivariate logistic regression and t-tests were conducted. Finally, the association between microalbuminuria and lung function test results is presented in the form of figures and written description. Result and Conclusion: A significant association was found between microalbuminuria and different lung function test parameters.
Keywords: microalbuminuria, lung function, association, test
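A minimal sketch, on synthetic data, of the kind of analysis the abstract describes: a t-test and a binary logistic regression linking microalbuminuria status to a lung function parameter (the FEV1 variable and effect sizes below are assumptions, not study data):

```python
# Minimal sketch (hypothetical data): association between microalbuminuria status and
# a lung function parameter via a t-test and binary logistic regression.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 384
microalbuminuria = rng.integers(0, 2, n)                      # 0 = absent, 1 = present
fev1 = 3.2 - 0.4 * microalbuminuria + rng.normal(0, 0.5, n)   # litres (synthetic)

# Group comparison (t-test)
t, p = stats.ttest_ind(fev1[microalbuminuria == 1], fev1[microalbuminuria == 0])
print(f"t = {t:.2f}, p = {p:.4f}")

# Binary logistic regression: microalbuminuria ~ FEV1
X = sm.add_constant(fev1)
model = sm.Logit(microalbuminuria, X).fit(disp=False)
print(model.summary())
```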
Procedia PDF Downloads 190
35743 On-The-Fly Cross Sections Generation in Neutron Transport with Wide Energy Region
Authors: Rui Chen, Shu-min Zhou, Xiong-jie Zhang, Ren-bo Wang, Fan Huang, Bin Tang
Abstract:
During temperature changes in the reactor core, the nuclide cross sections in the reactor vary with temperature, which eventually changes the reactivity. To simulate the interaction between incident neutrons and various materials at different temperatures accurately, it is necessary to generate all the relevant temperature-dependent reaction cross sections. Traditionally, real-time cross section generation is used to avoid storing huge amounts of data, but it suffers from low efficiency and poor adaptability outside narrow energy regions. Focusing on the real-time generation of multi-temperature cross sections during neutron transport, this paper investigated on-the-fly cross section generation for the resolved resonance region, the thermal region and the unresolved resonance region, and proposed a real-time multi-temperature cross section generation method based on the double-exponential formula for the resolved resonance region, as well as Neville interpolation for the thermal and unresolved resonance regions. To prove the correctness and validity of multi-temperature cross section generation over a wide energy region of incident neutrons, the proposed method was applied to criticality safety benchmark tests, which showed its capability for application in reactor multi-physics coupling simulation.
Keywords: cross section, neutron transport, numerical simulation, on-the-fly
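A minimal sketch (placeholder values, not the authors' implementation) of Neville interpolation applied to a temperature-dependent cross section, the scheme the abstract proposes for the thermal and unresolved resonance regions:

```python
# Minimal sketch: Neville's algorithm interpolating a cross section at an arbitrary
# temperature from values tabulated at a few temperatures. Numbers are hypothetical.

def neville(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x (Neville's scheme)."""
    p = list(ys)
    n = len(xs)
    for k in range(1, n):
        for i in range(n - k):
            p[i] = ((x - xs[i + k]) * p[i] + (xs[i] - x) * p[i + 1]) / (xs[i] - xs[i + k])
    return p[0]

# Cross section (barns) tabulated at a few temperatures (K) -- placeholder values
temps = [300.0, 600.0, 900.0, 1200.0]
sigma = [12.4, 11.1, 10.3, 9.8]

print(neville(temps, sigma, 750.0))  # on-the-fly value at 750 K
```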
Procedia PDF Downloads 196
35742 Nowcasting Indonesian Economy
Authors: Ferry Kurniawan
Abstract:
In this paper, we nowcast quarterly output growth in Indonesia by exploiting higher-frequency data (monthly indicators) using a mixed-frequency factor model that exploits both quarterly and monthly data. Nowcasting quarterly GDP is particularly relevant for the central bank of Indonesia, which sets the policy rate at the monthly Board of Governors Meeting; one of the important steps there is the assessment of the current state of the economy. Thus, having an accurate and up-to-date quarterly GDP nowcast every time new monthly information becomes available would clearly be of interest for the central bank of Indonesia, for example because the initial assessment of the current state of the economy, including the nowcast, is used as input for longer-term forecasts. We consider a small-scale mixed-frequency factor model to produce nowcasts. In particular, we specify variables as year-on-year growth rates, so the relation between quarterly and monthly data is expressed in year-on-year growth rates. To assess the performance of the model, we compare the nowcasts with two other approaches: an autoregressive model, which is often difficult to beat when forecasting output growth, and Mixed Data Sampling (MIDAS) regression. In particular, both the mixed-frequency factor model and the MIDAS nowcasts are produced by exploiting the same set of monthly indicators, so we compare the nowcast performance of the two approaches directly. To preview the results, we find that by exploiting monthly indicators using the mixed-frequency factor model and MIDAS regression we improve the nowcast accuracy over a benchmark simple autoregressive model that uses only quarterly-frequency data. However, it is not clear whether the MIDAS or the mixed-frequency factor model is better. Neither set of nowcasts encompasses the other, suggesting that both nowcasts are valuable in nowcasting GDP but neither is sufficient. By combining the two individual nowcasts, we find that the nowcast combination not only increases the accuracy relative to the individual nowcasts but also lowers the risk of the worst performance of the individual nowcasts.
Keywords: nowcasting, mixed-frequency data, factor model, nowcasts combination
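A minimal sketch, with hypothetical numbers, of the kind of evaluation described above: comparing factor-model, MIDAS and AR nowcasts by RMSE and combining the first two with inverse-MSE weights:

```python
# Minimal sketch (hypothetical numbers, not the study's data): nowcast comparison and
# inverse-MSE nowcast combination against an AR benchmark.
import numpy as np

actual = np.array([5.1, 5.0, 4.9, 5.2, 5.3, 5.0])   # y/y GDP growth, %
factor = np.array([5.0, 5.2, 4.7, 5.1, 5.4, 5.1])   # factor-model nowcasts
midas  = np.array([5.3, 4.9, 5.0, 5.3, 5.1, 4.9])   # MIDAS nowcasts
ar     = np.array([4.8, 5.2, 5.1, 4.8, 5.5, 5.2])   # AR benchmark forecasts

rmse = lambda f: np.sqrt(np.mean((actual - f) ** 2))

# Inverse-MSE combination weights for the two monthly-indicator nowcasts
w_f = 1 / rmse(factor) ** 2
w_m = 1 / rmse(midas) ** 2
combo = (w_f * factor + w_m * midas) / (w_f + w_m)

for name, f in [("AR", ar), ("factor", factor), ("MIDAS", midas), ("combination", combo)]:
    print(f"{name:12s} RMSE = {rmse(f):.3f}")
```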
Procedia PDF Downloads 331
35741 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning
Authors: Federico Pittino, Thomas Arnold
Abstract:
The shredding of waste materials is a key step in the recycling process towards the circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is also particularly important to increase the machine’s efficiency, thereby reducing the operational costs. In this work, a monitoring system has been developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine has been monitored for one year, and methods for predictive maintenance have been developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, thereby not requiring very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components of the machine is investigated. This work proves the benefits of a real-time monitoring system for the efficient operation of industrial shredders.
Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning
Procedia PDF Downloads 122
35740 Cross-Cultural Competence Development through 'Learning by Reflection': A Case Study of Chinese International Students Learning through Taking Part-Time Jobs in the UK
Authors: Xin Zhao
Abstract:
The project aims to expand the notion of narrative learning and address the importance of learning by reflection in our learning and teaching context at a British university. Drawing on key concepts such as the zone of proximal development (ZPD), transition, and reflection-in- and on-action, this project analyses the learning experiences of a small sample of Chinese postgraduate students at a British university who use part-time job experience to develop cross-cultural communication skills. The project adopts a mixed-methods approach. Questionnaires and focus group interviews are used to examine the way in which students adapt (or do not adapt) to the culture of learning in a British university and develop a renewed sense of self in transitions from one culture to the other. The project also looks at how the students appropriate opportunities for learning not just from classrooms but also from everyday encounters outside classrooms. The project aims to address the implications of learning by reflection as development in transition. Time in and for learning, or duration, is taken for granted in theorising narrative learning; the project explores this very issue of time in relation to learning by reflection by considering time in/of/for learning as duration.
Keywords: cross-cultural competence, learning by reflection, international student transition, part-time work experience
Procedia PDF Downloads 182
35739 Steps towards the Development of National Health Data Standards in Developing Countries
Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray
Abstract:
The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has missed out on exploring the different steps required to be undertaken by the government towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the different issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards and medical informatics in Saudi Arabia and the UK, a list of steps required towards the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.
Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia
Procedia PDF Downloads 338
35738 Evaluation of Hand Grip Strength and EMG Signal on Visual Reaction
Authors: Sung-Wook Shin, Sung-Taek Chung
Abstract:
Hand grip strength has been utilized as an indicator to evaluate the motor ability of the hands, which are responsible for performing multiple body functions. It is, however, difficult to evaluate factors other than hand muscular strength using hand grip strength alone. In this study, we analyzed the motor ability of the hands using EMG and hand grip strength simultaneously, in order to evaluate concentration, muscular strength reaction time, instantaneous muscular strength change, and agility in response to a visual stimulus. The average reaction times (and their standard deviations) of the EMG signal and of the hand grip strength were found to be 209.6 ± 56.2 ms and 354.3 ± 54.6 ms, respectively. In addition, the onset time, which represents the acceleration time to reach 90% of maximum hand grip strength, was 382.9 ± 129.9 ms.
Keywords: hand grip strength, EMG, visual reaction, endurance
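A minimal sketch on a synthetic grip-force trace showing how the two reported quantities, the reaction time and the onset time to 90% of maximum grip strength, can be computed; the signal shape and thresholds are assumptions, not the study's protocol:

```python
# Minimal sketch (synthetic signal): reaction time after a visual stimulus and onset
# time to 90% of maximum grip strength, from a sampled force trace.
import numpy as np

fs = 1000                       # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)   # 2 s record, stimulus at t = 0
# Synthetic grip-force trace: noise, then a ramp starting ~350 ms after the stimulus
force = np.where(t < 0.35, 0.0, np.minimum((t - 0.35) / 0.4, 1.0)) * 300  # newtons
force += np.random.default_rng(1).normal(0, 2, t.size)

threshold = 10.0                                   # N, assumed start-of-response threshold
react_idx = np.argmax(force > threshold)           # first sample above threshold
reaction_time_ms = 1000 * t[react_idx]

peak = force.max()
onset_idx = np.argmax(force > 0.9 * peak)          # first sample above 90% of max force
onset_time_ms = 1000 * (t[onset_idx] - t[react_idx])

print(f"reaction time ~ {reaction_time_ms:.0f} ms, onset time ~ {onset_time_ms:.0f} ms")
```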
Procedia PDF Downloads 462
35737 Number of Parametrization of Discrete-Time Systems without Unit-Delay Element: Single-Input Single-Output Case
Authors: Kazuyoshi Mori
Abstract:
In this paper, we consider the parametrization of discrete-time systems without the unit-delay element within the framework of the factorization approach. In the parametrization, we investigate the number of required parameters, restricting attention to single-input single-output systems. By this investigation, we find that, for discrete-time systems without the unit-delay element, there are three cases: (1) there exist plants which require only one parameter, (2) there exist plants which require two parameters, and (3) the number of parameters is at most three.
Keywords: factorization approach, discrete-time system, parameterization of stabilizing controllers, system without unit-delay
Procedia PDF Downloads 240
35736 Low Cost Inertial Sensors Modeling Using Allan Variance
Authors: A. A. Hussen, I. N. Jleta
Abstract:
Micro-electromechanical system (MEMS) accelerometers and gyroscopes are suitable for the inertial navigation systems (INS) of many applications due to their low price, small dimensions and light weight. The main disadvantage in comparison with classic sensors is worse long-term stability. The estimation accuracy is mostly affected by the time-dependent growth of inertial sensor errors, especially the stochastic errors. In order to eliminate the negative effect of these random errors, they must be accurately modeled; the key to a successful implementation is how well the noise statistics of the inertial sensors are characterized. In this paper, the Allan variance technique is used to model the stochastic errors of the inertial sensors. By performing a simple operation on the entire length of data, a characteristic curve is obtained whose inspection provides a systematic characterization of the various random errors contained in the inertial-sensor output data.
Keywords: Allan variance, accelerometer, gyroscope, stochastic errors
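A minimal sketch of the Allan variance (reported here as Allan deviation) computed with non-overlapping clusters on a simulated gyroscope signal; real sensor logs would replace the synthetic data:

```python
# Minimal sketch: non-overlapping Allan deviation of a simulated gyro rate signal.
import numpy as np

def allan_deviation(rate, fs, m_list):
    """Non-overlapping Allan deviation for cluster sizes m_list (in samples)."""
    taus, adevs = [], []
    for m in m_list:
        n_clusters = rate.size // m
        if n_clusters < 2:
            continue
        means = rate[: n_clusters * m].reshape(n_clusters, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(means) ** 2)   # Allan variance at tau = m / fs
        taus.append(m / fs)
        adevs.append(np.sqrt(avar))
    return np.array(taus), np.array(adevs)

fs = 100.0                                        # Hz
rng = np.random.default_rng(0)
white = rng.normal(0, 0.05, 200_000)              # angle random walk (white noise)
bias = np.cumsum(rng.normal(0, 1e-5, 200_000))    # slowly varying bias (random walk)
rate = white + bias                               # synthetic gyro output, deg/s

m_list = np.unique(np.logspace(0, 4, 40).astype(int))
taus, adevs = allan_deviation(rate, fs, m_list)
for tau, adev in zip(taus[:5], adevs[:5]):
    print(f"tau = {tau:7.3f} s  Allan deviation = {adev:.5f}")
```

Inspecting the resulting curve on a log-log plot is what allows the different random error terms (angle random walk, bias instability, rate random walk) to be read off from its slopes.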
Procedia PDF Downloads 442
35735 FLIME - Fast Low Light Image Enhancement for Real-Time Video
Authors: Vinay P., Srinivas K. S.
Abstract:
Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. An algorithm should take considerably less than 41 milliseconds to process a real-time video feed with 24 frames per second, and even less for a video with 30 or 60 frames per second. The paper presents a fast and efficient solution which has two main advantages: it has the potential to be used for a real-time video feed, and it can be used in low compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. A custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model and the processing time are discussed in detail, and the quality of the enhanced images using different methods is shown.
Keywords: low light image enhancement, real-time video, computer vision, machine learning
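A minimal sketch of a three-step pipeline in the spirit of the one described above (RGB mapping, color balance, contrast adjustment); the specific functions are assumptions, not the authors' exact formulation:

```python
# Minimal sketch: (1) per-pixel RGB gamma mapping, (2) gray-world color balance,
# (3) percentile contrast stretching. Pure NumPy, cheap enough for per-frame use.
import numpy as np

def enhance(frame: np.ndarray, gamma: float = 0.45) -> np.ndarray:
    img = frame.astype(np.float32) / 255.0

    # Step 1: map input RGB to output RGB with a simple gamma curve
    img = img ** gamma

    # Step 2: gray-world color balance (equalize the mean of each channel)
    means = img.reshape(-1, 3).mean(axis=0)
    img *= means.mean() / (means + 1e-6)

    # Step 3: contrast adjustment by stretching the 1st-99th percentile range
    lo, hi = np.percentile(img, (1, 99))
    img = np.clip((img - lo) / (hi - lo + 1e-6), 0.0, 1.0)

    return (img * 255).astype(np.uint8)

# Example: enhance a random "dark" frame
dark = (np.random.default_rng(0).random((480, 640, 3)) * 40).astype(np.uint8)
bright = enhance(dark)
print(dark.mean(), bright.mean())
```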
Procedia PDF Downloads 204
35734 Machine Learning Predictive Models for Hydroponic Systems: A Case Study Nutrient Film Technique and Deep Flow Technique
Authors: Kritiyaporn Kunsook
Abstract:
Machine learning algorithms (MLAs) such as artificial neural networks (ANNs), decision trees, support vector machines (SVMs), Naïve Bayes, and ensemble classifiers by voting are powerful data-driven methods that are relatively less widely used in modeling hydroponic systems, and thus have not been thoroughly compared in this field. The performances of a series of MLAs (ANNs, decision tree, SVMs, Naïve Bayes, and an ensemble classifier by voting) in prospectively modeling hydroponic systems are compared based on the accuracy of each model. The classification of hydroponic systems covers only test samples from vegetables grown with the nutrient film technique (NFT) and the deep flow technique (DFT). The features, which are characteristics of the vegetables, comprise harvesting height and width, temperature, required light, and color. The results indicate that the classification performance of the ANNs is 98%, the decision tree 98%, the SVMs 97.33%, Naïve Bayes 96.67%, and the ensemble classifier by voting 98.96%.
Keywords: artificial neural networks, decision tree, support vector machines, naïve Bayes, ensemble classifier by voting
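A minimal sketch, on synthetic stand-in data, of comparing the listed classifiers and a hard-voting ensemble with scikit-learn; the real inputs would be harvesting height and width, temperature, required light, and color of NFT/DFT-grown vegetables:

```python
# Minimal sketch: accuracy comparison of individual classifiers and a voting ensemble.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier

# Synthetic stand-in for the 5 vegetable features and the NFT/DFT label
X, y = make_classification(n_samples=300, n_features=5, n_informative=4,
                           n_redundant=0, n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

estimators = [
    ("ann", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)),
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
    ("nb", GaussianNB()),
]
voting = VotingClassifier(estimators=estimators, voting="hard")

for name, model in estimators + [("voting ensemble", voting)]:
    model.fit(X_tr, y_tr)
    print(f"{name:16s} accuracy = {model.score(X_te, y_te):.3f}")
```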
Procedia PDF Downloads 372
35733 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations have structured and unstructured information in different formats, sources, and systems. Part of it comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as the absence of operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools, which are of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. The following paper presents a structured and simple methodology inspired by agile development models such as Scrum, XP and AUP, together with object-relational models, spatial data models, and the baseline of data modeling under UML and big data. In this way, it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for the generation of patterns and models derived from the structured fact objects.
Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 409
35732 An Improved Atmospheric Correction Method with Diurnal Temperature Cycle Model for MSG-SEVIRI TIR Data under Clear Sky Condition
Authors: Caixia Gao, Chuanrong Li, Lingli Tang, Lingling Ma, Yonggang Qian, Ning Wang
Abstract:
Knowledge of land surface temperature (LST) is of crucial importance in energy balance studies and environment modeling. Satellite thermal infrared (TIR) imagery is the primary source for retrieving LST at the regional and global scales. Because the radiance received by TIR sensors combines contributions from the atmosphere and the land surface, atmospheric effect correction has to be performed to remove the atmospheric transmittance and upwelling radiance. The Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard Meteosat Second Generation (MSG) provides measurements every 15 minutes in 12 spectral channels covering the visible to infrared spectrum at fixed view angles with 3 km pixel size at nadir, offering new and unique capabilities for LST and LSE measurements. However, due to its high temporal resolution, the atmospheric correction cannot be performed with radiosonde profiles or reanalysis data, since these profiles are not available at all SEVIRI TIR image acquisition times. To solve this problem, a two-part six-parameter semi-empirical diurnal temperature cycle (DTC) model has been applied to the temporal interpolation of ECMWF reanalysis data. Because the DTC model is underdetermined with ECMWF data at only four synoptic times (UTC times: 00:00, 06:00, 12:00, 18:00) per day for each location, several approaches are adopted in this study. It is well known that the atmospheric transmittance and upwelling radiance are related to the water vapour content (WVC). With the aid of simulated data, this relationship can be determined under each viewing zenith angle for each SEVIRI TIR channel. Thus, the atmospheric transmittance and upwelling radiance are preliminarily removed with the aid of the instantaneous WVC, which is retrieved from the brightness temperatures in SEVIRI channels 5, 9 and 10, and a group of brightness temperatures for the surface-leaving radiance (Tg) is acquired. Subsequently, a group of the six parameters of the DTC model is fitted to these Tg by a Levenberg-Marquardt least squares algorithm (denoted as DTC model 1). Although the retrieval error of the WVC and the approximate relationships between WVC and the atmospheric parameters induce some uncertainties, these do not significantly affect the determination of the three parameters td, ts and β in the DTC model (β is the angular frequency, td is the time at which Tg reaches its maximum, ts is the starting time of attenuation). Furthermore, due to the large fluctuation in temperature and the inaccuracy of the DTC model around sunrise, SEVIRI measurements from two hours before sunrise to two hours after sunrise are excluded. With the knowledge of td, ts and β, a new DTC model (denoted as DTC model 2) is accurately fitted again to the Tg at UTC times 05:57, 11:57, 17:57 and 23:57, which are atmospherically corrected with ECMWF data. A new group of the six parameters of the DTC model is thereby generated and, subsequently, the Tg at any given time are acquired. Finally, this method is applied to SEVIRI data in channel 9 successfully. The results show that the proposed method performs reasonably and that the Tg derived with the improved method is much more consistent with that from radiosonde measurements.
Keywords: atmosphere correction, diurnal temperature cycle model, land surface temperature, SEVIRI
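A minimal sketch, on synthetic data and with an assumed simplified functional form, of fitting a two-part six-parameter DTC model to surface-leaving brightness temperatures Tg with a Levenberg-Marquardt least-squares routine; this is not the authors' exact model:

```python
# Minimal sketch: cosine daytime part, exponential attenuation after ts, fitted with
# scipy's curve_fit (Levenberg-Marquardt for unconstrained problems). Synthetic Tg.
import numpy as np
from scipy.optimize import curve_fit

def dtc(t, T0, Ta, beta, td, ts, k):
    """Simplified two-part DTC: t in hours; beta angular frequency, td time of maximum,
    ts start of attenuation, k decay constant (assumed form, continuous at ts)."""
    day = T0 + Ta * np.cos(beta * (t - td))
    ts_val = T0 + Ta * np.cos(beta * (ts - td))
    night = T0 + (ts_val - T0) * np.exp(-(t - ts) / k)
    return np.where(t < ts, day, night)

# Synthetic Tg samples every 15 minutes (placeholder for corrected SEVIRI data)
t_obs = np.arange(6, 30, 0.25)
true = (290.0, 12.0, np.pi / 12, 13.0, 17.5, 4.0)
tg = dtc(t_obs, *true) + np.random.default_rng(0).normal(0, 0.3, t_obs.size)

p0 = (288.0, 10.0, 0.25, 12.0, 18.0, 5.0)              # initial guess
params, _ = curve_fit(dtc, t_obs, tg, p0=p0, maxfev=10000)
print("fitted T0, Ta, beta, td, ts, k:", np.round(params, 3))
```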
Procedia PDF Downloads 268
35731 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. To address this issue, this paper examines how various statistical models presented previously behave when the data are mixed with noise and whether they can be applied in South Korea. Three major types of models are studied; where data are presented in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in the papers and analyse the effect of noise. From this, we can assess the robustness of each model and its applicability in Korea.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
Procedia PDF Downloads 743
35730 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach
Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti
Abstract:
Transliteration of Javanese manuscripts is one of the methods to preserve and pass on the wealth of past literature to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time. Automatic transliteration is expected to shorten this time and thus support the work of philologists. The preprocessing and segmentation stage, performed first, is used to manage the document images, obtaining script-image units from the input document images that are free from noise and similar in properties such as thickness, size, and slope. The next stage, characteristic extraction, is used to find unique characteristics that distinguish each Javanese script image. One of the characteristics used in this research is the number of black pixels in each image unit. Each Javanese script image contained in the training data undergoes the same process as the input characters. The system testing was performed with data from the book Hamong Tani, which was selected due to its content, age and number of pages; these were considered sufficient as an experimental model input. Based on the results of testing the automatic transliteration process on random pages, the maximum percentage correctness obtained was 81.53%. This success rate was obtained with a 32x32 pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the automatic transliteration model offered is relatively good.
Keywords: Javanese script, character recognition, statistical, automatic transliteration
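A minimal sketch (synthetic glyph; the 32x32 size and 5x5 window grid follow the abstract, the resizing scheme is an assumption) of the black-pixel-count feature extraction:

```python
# Minimal sketch: resize a binary script-image unit to 32x32 and count black pixels
# inside each cell of a 5x5 window grid.
import numpy as np

def black_pixel_features(unit: np.ndarray, size: int = 32, grid: int = 5) -> np.ndarray:
    """unit: 2-D binary array (1 = black ink, 0 = background)."""
    # Nearest-neighbour resize to size x size
    rows = (np.arange(size) * unit.shape[0] / size).astype(int)
    cols = (np.arange(size) * unit.shape[1] / size).astype(int)
    resized = unit[np.ix_(rows, cols)]

    # Count black pixels in each grid cell
    edges = np.linspace(0, size, grid + 1).astype(int)
    feats = [resized[edges[i]:edges[i + 1], edges[j]:edges[j + 1]].sum()
             for i in range(grid) for j in range(grid)]
    return np.array(feats)

glyph = (np.random.default_rng(0).random((48, 40)) > 0.7).astype(int)  # fake binary glyph
print(black_pixel_features(glyph))
```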
Procedia PDF Downloads 339
35729 Big Data: Appearance and Disappearance
Authors: James Moir
Abstract:
The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanations to predictive modelling and simulation. 19th-century science sought to capture phenomena and to show their appearance through causal mechanisms, while 20th-century science attempted to save the appearances and relinquish causal explanations. Now 21st-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed reality model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.
Keywords: big data, appearance, disappearance, surface, epistemology
Procedia PDF Downloads 420
35728 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images
Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann
Abstract:
FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data which barely reduce the confidence interval of the estimated parameters and thus can be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem stating the lower precision of the integrated data approach compared to the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval compared to the spatio-temporal (full) data.
Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design
Procedia PDF Downloads 278
35727 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management
Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang
Abstract:
Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle this data’s magnitude and complexity. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM and other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.
Keywords: construction supply chain management, BIM, data exchange, artificial intelligence
Procedia PDF Downloads 26
35726 Woodcast is Ecologically Sound and Tolerated by a Majority of Patients
Authors: R. Hassan, J. Duncombe, E. Darke, A. Dias, K. Anderson, R. G. Middleton
Abstract:
NHS England has set itself the task of delivering a "Net Zero" National Health Service by 2040, and it is incumbent upon all health care practitioners to work towards this goal; orthopaedic surgeons are no exception. Distal radial fractures are the most common fractures sustained by the adult population; however, studies on individual patient experience are lacking. The aim of this study was to assess patient satisfaction and outcomes with woodcast used in the conservative management of distal radius fractures. For all patients managed with woodcast in our unit, we undertook a structured questionnaire that included the Patient Rated Wrist Evaluation (PRWE) score, the EQ-5D-5L score and the pain numerical score at the time of injury and six weeks after. Thirty patients were initially managed with woodcast; 80% of patients tolerated woodcast for the full duration of their treatment, while 20% did not tolerate it and had their casts removed within 48 hours. Of those who completed treatment in woodcast, 79.1% were satisfied with its comfort, 66% were very satisfied with its weight, 70% were satisfied with temperature and sweatiness, 62.5% were very satisfied with the smell/odour, and 75% were satisfied with the level of support woodcast provided. During their treatment, 83.3% of patients rated their pain as five or less. None of those who completed their treatment in woodcast required any further intervention or utilised the open appointment because of ongoing wrist problems. In conclusion, when woodcast is tolerated, patient satisfaction and outcome levels are good. However, we acknowledge that 20% of patients in our series were not able to tolerate woodcast; we therefore suggest that a comparison between woodcast and the widely used synthetic plaster of Paris casting is in order.
Keywords: distal radius fractures, ecological cast, sustainability, woodcast
Procedia PDF Downloads 100
35725 Green-Synthesized β-Cyclodextrin Membranes for Humidity Sensors
Authors: Zeineb Baatout, Safa Teka, Nejmeddine Jaballah, Nawfel Sakly, Xiaonan Sun, Mustapha Majdoub
Abstract:
Currently, the economic interests linked to the development of bio-based materials make biomass one of the most interesting areas for science development. We are interested in β-cyclodextrin (β-CD), one of the popular bio-sourced macromolecules, produced from starch via enzymatic conversion. It is a cyclic oligosaccharide formed by the association of seven glucose units. It presents a rigid, conical and amphiphilic structure with a hydrophilic exterior, allowing it to be water-soluble, and a hydrophobic interior enabling the formation of inclusion complexes, which supports its application in the elaboration of electrochemical and optical sensors. Nevertheless, the solubility of β-CD in water makes its use as a sensitive layer limited and difficult due to its instability in aqueous media. To overcome this limitation, we chose to proceed by modification of the hydroxyl groups to obtain hydrophobic derivatives, which lead to water-stable sensing layers. Hence, a series of benzylated β-CDs was synthesized in basic aqueous media in one pot. This work reports the synthesis of a new family of substituted amphiphilic β-CDs using a green methodology. The obtained β-CDs showed different degrees of substitution (DS) between 0.85 and 2.03. These organic macromolecular materials were soluble in common volatile organic solvents, and their structures were investigated by NMR, FT-IR and MALDI-TOF spectroscopies. Thermal analysis showed a correlation between the thermal properties of these derivatives and the benzylation degree. The surface properties of thin films based on the benzylated β-CDs were characterized by contact angle measurements and atomic force microscopy (AFM). These organic materials were investigated as sensitive layers, deposited on a quartz crystal microbalance (QCM) gravimetric transducer, for humidity sensing at room temperature. The results show that the performance of the prepared sensors is greatly influenced by the benzylation degree of the β-CD. The partially modified β-CD (DS=1) shows a linear response with the best sensitivity, good reproducibility, low hysteresis, and fast response (15 s) and recovery (17 s) times over relative humidity (RH) levels between 11% and 98% at room temperature.
Keywords: β-cyclodextrin, green synthesis, humidity sensor, quartz crystal microbalance
Procedia PDF Downloads 271
35724 Investigating the Characteristics of Correlated Parking-Charging Behaviors for Electric Vehicles: A Data-Driven Approach
Authors: Xizhen Zhou, Yanjie Ji
Abstract:
To advance the management of integrated electric vehicle (EV) parking-charging behaviors, this study uses Changshu City in Suzhou as a case study to establish a data association mechanism for parking-charging platforms and to develop a database of EV parking-charging behaviors. Key indicators, such as charging start time, initial state of charge, final state of charge, and parking-charging time difference, are considered. Utilizing the K-S test method, the paper examines the heterogeneity of parking-charging behavior preferences between pure-EV and non-pure-EV users. The K-means clustering method is employed to analyze the characteristics of parking-charging behaviors for both user groups, thereby enhancing the overall understanding of these behaviors. The findings of this study reveal that, using a classification model, the parking-charging behaviors of pure-EV users can be classified into five distinct groups, while those of non-pure-EV users can be separated into four groups. Both types of EV users exhibit groups with low range anxiety, characterized by complete charging on special journeys, complete charging at the destination, and partial charging. Additionally, both types have a group with high range anxiety; pure-EV users in this group display a preference for complete charging on specific journeys, while non-pure-EV users prefer complete charging. Notably, pure-EV users also display a significant group engaging in nocturnal complete charging. The findings of this study can provide technical support for the scientific and rational layout and management of integrated parking and charging facilities for EVs.
Keywords: traffic engineering, potential preferences, cluster analysis, EV, parking-charging behavior
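A minimal sketch, on synthetic records, of K-means clustering over the four indicators named above; the cluster count of 5 follows the pure-EV result reported in the abstract:

```python
# Minimal sketch: standardize the four parking-charging indicators and cluster with K-means.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 500
records = np.column_stack([
    rng.uniform(0, 24, n),      # charging start time (h)
    rng.uniform(0.1, 0.8, n),   # initial state of charge
    rng.uniform(0.5, 1.0, n),   # final state of charge
    rng.uniform(0, 10, n),      # parking-charging time difference (h)
])

X = StandardScaler().fit_transform(records)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

for c in range(5):
    members = records[km.labels_ == c]
    print(f"cluster {c}: {len(members):3d} users, "
          f"mean start {members[:, 0].mean():.1f} h, "
          f"mean initial SOC {members[:, 1].mean():.2f}")
```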
Procedia PDF Downloads 77
35723 Data Mining As A Tool For Knowledge Management: A Review
Authors: Maram Saleh
Abstract:
Knowledge has become an essential resource in today’s economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through the multiple knowledge management stages: knowledge creation, knowledge storage, knowledge sharing and knowledge use. Research on data mining has continued to grow over recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management, and then focus on introducing some applications of data mining techniques in knowledge management for some real-life domains.
Keywords: data mining, knowledge management, knowledge discovery, knowledge creation
Procedia PDF Downloads 208
35722 New Results on Exponential Stability of Hybrid Systems
Authors: Grienggrai Rajchakit
Abstract:
This paper is concerned with the exponential stability of switched linear systems with interval time-varying delays. The time delay is any continuous function belonging to a given interval, in which the lower bound of the delay is not restricted to zero. By constructing a suitable augmented Lyapunov-Krasovskii functional combined with the Leibniz-Newton formula, a switching rule for the exponential stability of switched linear systems with interval time-varying delays and new delay-dependent sufficient conditions for the exponential stability of the systems are first established in terms of LMIs. Finally, some examples are given to illustrate the effectiveness of the proposed schemes.
Keywords: exponential stability, hybrid systems, time-varying delays, Lyapunov-Krasovskii functional, Leibniz-Newton formula
Procedia PDF Downloads 543
35721 An Approaching Index to Evaluate a Forward Collision Probability
Authors: Yuan-Lin Chen
Abstract:
This paper presents an approaching forward collision probability index (AFCPI) for alerting and assisting the driver in keeping a safe distance to avoid forward collision accidents in highway driving. The time to collision (TTC) and time headway (TH) are used to evaluate the TTC forward collision probability index (TFCPI) and the TH forward collision probability index (HFCPI), respectively. The Mamdani fuzzy inference algorithm is presented to combine TFCPI and HFCPI and calculate the approaching collision probability index of the vehicle. The AFCPI is easy to understand even for drivers without any professional knowledge of the vehicle field. At the same time, the driver’s behavior is taken into account so that the index suits each driver. For the approaching index, the value 0 indicates a 0% probability of forward collision, and the values 0.5 and 1 indicate 50% and 100% probabilities of forward collision, respectively. The AFCPI is useful and easy to understand for alerting the driver to avoid forward collision accidents when driving on the highway.
Keywords: approaching index, forward collision probability, time to collision, time headway
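A minimal sketch of computing TTC- and TH-based indices and blending them into an approaching index in [0, 1]; the thresholds are assumptions, and a crisp average stands in for the paper's Mamdani fuzzy inference:

```python
# Minimal sketch: map TTC and TH to probability-like indices and blend them.
import numpy as np

def forward_collision_index(gap_m, ego_speed_mps, lead_speed_mps,
                            ttc_crit=2.0, ttc_safe=8.0, th_crit=0.8, th_safe=3.0):
    rel_speed = ego_speed_mps - lead_speed_mps          # closing speed (>0 when approaching)
    ttc = gap_m / rel_speed if rel_speed > 0 else np.inf  # time to collision
    th = gap_m / ego_speed_mps                           # time headway

    # 1 below the critical value, 0 above the safe value, linear in between
    tfcpi = np.clip((ttc_safe - ttc) / (ttc_safe - ttc_crit), 0.0, 1.0)
    hfcpi = np.clip((th_safe - th) / (th_safe - th_crit), 0.0, 1.0)

    return 0.5 * (tfcpi + hfcpi)                          # crisp stand-in for the fuzzy blend

print(forward_collision_index(gap_m=25.0, ego_speed_mps=30.0, lead_speed_mps=22.0))
```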
Procedia PDF Downloads 293
35720 An Approach to Analyze Testing of Nano On-Chip Networks
Authors: Farnaz Fotovvatikhah, Javad Akbari
Abstract:
The test time of a test architecture is an important factor which depends on the architecture's delay and test patterns. Here, a new architecture based on a network on chip for storing test results is presented. In addition, a simple analytical model is proposed to calculate the link test time for a built-in self-tester (BIST) and an external tester (Ext) in multiprocessor systems. The results extracted from the model are verified using an FPGA implementation and experimental measurements. Systems consisting of 16, 25, and 36 processors are implemented and simulated, and the test time is calculated. In addition, BIST and Ext are compared in terms of test time under different conditions, such as different numbers of test patterns and nodes. Using the model, the maximum testing frequency can be calculated and the test structure can be optimized for high-speed testing.
Keywords: test, nano on-chip network, JTAG, modelling
Procedia PDF Downloads 488
35719 Implementation of Clinical Monitoring System of Physiological Parameters
Authors: Abdesselam Babouri, Ahcène Lemzadmi, M Rahmane, B. Belhadi, N. Abouchi
Abstract:
Medical monitoring aims at observing and remotely controlling the vital physiological parameters of the patient. The physiological sensors provide repetitive measurements of these parameters in the form of electrical signals that vary continuously over time. Various measurements inform us about the person's physiological state (weight, blood pressure, heart rate or disease-specific data), environmental conditions (temperature, humidity, light, noise level), and displacement and movements (physical efforts and the completion of major daily living activities). The collected data allow monitoring of the patient’s condition and alerting in case of change. They are also used in diagnosis and decision making on medical treatment and the health of the patient. This work presents the implementation of a monitoring system to be used for the control of physiological parameters.
Keywords: clinical monitoring, physiological parameters, biomedical sensors, personal health
Procedia PDF Downloads 473
35718 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara
Abstract:
This paper describes the problem of building secure computational services for encrypted information in the Cloud, computing without decrypting the encrypted data. It therefore meets the aspiration for a computational encryption model that can enhance the security of big data with respect to privacy or confidentiality, availability and integrity of the data, and the user's security. The cryptographic model applied to the computational processing of the encrypted data is a fully homomorphic encryption scheme. We contribute a theoretical presentation of high-level computational processes based on number theory derivable from abstract algebra, which can easily be integrated and leveraged in the cloud computing interface, with detailed theoretical mathematical concepts for the fully homomorphic encryption model. This contribution supports the full implementation of big data analytics based on a cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme
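A minimal sketch illustrating computation on ciphertexts with the additively homomorphic Paillier scheme (toy parameters, not secure); the paper targets a fully homomorphic scheme, which additionally supports multiplication over ciphertexts, so this only conveys the basic idea:

```python
# Minimal sketch: toy Paillier encryption. Multiplying two ciphertexts adds the
# underlying plaintexts, so the cloud can compute a sum without decrypting.
import math
import random

p, q = 101, 113                       # toy primes; real deployments use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
L = lambda x: (x - 1) // n
mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:        # r must be invertible modulo n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(41), encrypt(17)
c_sum = (c1 * c2) % n2                # homomorphic addition on ciphertexts
print(decrypt(c_sum))                 # -> 58, computed without ever decrypting c1 or c2
```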
Procedia PDF Downloads 480
35717 Monitoring Blood Pressure Using Regression Techniques
Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim
Abstract:
Blood pressure gives physicians deep insight into the cardiovascular system. The determination of individual blood pressure is a standard clinical procedure for cardiovascular system problems. The conventional techniques to measure blood pressure (e.g., the cuff method) allow only a limited number of readings over a certain period (e.g., every 5-10 minutes). Additionally, these systems cause turbulence in blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or critically ill persons. In this paper, the most important statistical features of the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features that have the highest correlation with blood pressure. The results show that the mean stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model representing both features for systolic blood pressure (SBP) and diastolic blood pressure (DBP) was obtained using a statistical regression technique. Surface fitting is used to best fit the series of data, and the results show that the error in estimating the SBP is 4.95% and in estimating the DBP is 3.99%.
Keywords: blood pressure, noninvasive optical system, principal component analysis, PCA, continuous monitoring
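A minimal sketch, on a synthetic feature matrix, of the PCA-plus-regression step described above for estimating SBP from PPG-derived features; the feature values, component count, and linear model are assumptions standing in for the study's surface fitting:

```python
# Minimal sketch: reduce 12 PPG features with PCA, then regress SBP on the components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_subjects, n_features = 40, 12
X = rng.normal(size=(n_subjects, n_features))                 # synthetic PPG features
sbp = 120 + 8 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 3, n_subjects)  # synthetic SBP, mmHg

X_tr, X_te, y_tr, y_te = train_test_split(X, sbp, test_size=0.25, random_state=0)

pca = PCA(n_components=3).fit(X_tr)                           # keep the leading components
reg = LinearRegression().fit(pca.transform(X_tr), y_tr)

pred = reg.predict(pca.transform(X_te))
mape = 100 * np.mean(np.abs(pred - y_te) / y_te)
print(f"mean absolute percentage error (SBP): {mape:.2f}%")
```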
Procedia PDF Downloads 161