Search results for: 99.95% IoT data transmission savings
20981 Time Series Analysis of Industrial Design Deposits in Brazil between 1996 and 2013
Authors: Jonas Pedro Fabris, Alberth Almeida Amorim Souza, Maria Emilia Camargo, Suzana Leitão Russo
Abstract:
Through Law No. 9279 of May 14, 1996, the Brazilian government regulates the rights and obligations relating to industrial property, considering the economic development of the country, by granting patents, trademark registrations, industrial design registrations and other forms of copyright protection. In this study, we show the application of the Box and Jenkins methodology to the series of industrial design deposits at the National Institute of Industrial Property for the period from May 1996 to April 2013. First, a graphical analysis of the data was performed by observing the behavior of the data and the autocorrelation function. Based on the analysis of the charts and the statistical tests suggested by the Box and Jenkins methodology, the best model found for the industrial design deposits was a SARIMA (2,1,0)(2,0,0), with a MAPE equal to 9.88%.
Keywords: ARIMA models, autocorrelation, Box and Jenkins Models, industrial design, MAPE, time series
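As an illustration of the error metric reported above, the MAPE of a forecast can be computed in a few lines. The deposit counts below are invented for illustration, not the INPI series; a minimal sketch:

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent: the metric used above
    to score the SARIMA forecasts."""
    return 100.0 / len(actual) * sum(
        abs((a - f) / a) for a, f in zip(actual, forecast)
    )

# Hypothetical monthly deposit counts and model forecasts.
actual = [120, 135, 128, 140, 150]
forecast = [115, 140, 130, 135, 148]
error = mape(actual, forecast)
```

A MAPE below 10%, as reported in the abstract, is conventionally read as a highly accurate forecast.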
Procedia PDF Downloads 544
20980 Performance Analysis of the First-Order Characteristics of Polling System Based on Parallel Limited (K=1) Services Mode
Authors: Liu Yi, Bao Liyong
Abstract:
Aiming at the problem of the low efficiency of pipelined scheduling in periodic limited-service polling, this paper proposes a system service resource scheduling strategy with parallel, optimized limited-service polling control. The paper constructs the polling queueing system and its mathematical model. First, the first-order and second-order characteristic parameter equations are obtained by partial differentiation of the probability generating function of the system state variables, and complete analytical expressions for each system parameter are deduced from the joint solution. The simulation results are consistent with the theoretically calculated values. The system performance analysis shows that the mean queue length and mean cycle time of the system are greatly improved, so the scheme can better meet the service demands of delay-sensitive data in dense data environments.
Keywords: polling, parallel scheduling, mean queue length, average cycle time
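The limited (K=1) service discipline described above can be illustrated with a toy discrete-time simulation: a server cycles through the queues and serves at most one waiting customer per visit. This is only a sketch under invented arrival probabilities, not the paper's analytical model:

```python
import random

def simulate_polling(num_queues=3, arrival_prob=0.1, steps=100000, seed=42):
    """Toy discrete-time simulation of cyclic polling with limited (K=1)
    service: the server visits the queues in turn and serves at most one
    waiting customer per visit.  Returns the time-averaged queue length
    per queue.  All parameters are illustrative assumptions."""
    rng = random.Random(seed)
    queues = [0] * num_queues
    server = 0
    total_length = 0
    for _ in range(steps):
        for i in range(num_queues):          # Bernoulli arrivals
            if rng.random() < arrival_prob:
                queues[i] += 1
        if queues[server] > 0:               # serve at most one customer
            queues[server] -= 1
        server = (server + 1) % num_queues   # move to the next queue
        total_length += sum(queues)
    return total_length / (steps * num_queues)

mean_queue_length = simulate_polling()
```

With the arrival rate (0.1 per queue per slot) well below the per-queue service opportunity (one visit every three slots), the simulated mean queue length stays small, mirroring the stability condition such models assume.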
Procedia PDF Downloads 39
20979 Investigating the Chemical Structure of Drinking Water in Domestic Areas of Kuwait by Applying GIS Technology
Authors: H. Al-Jabli
Abstract:
The research on the presence of heavy metals and bromate in drinking water is of immense scientific significance due to the potential risks these substances pose to public health. These contaminants are subject to regulatory limits outlined by the National Primary Drinking Water Regulations. Through a comprehensive analysis involving the compilation of existing data and the collection of new data via water sampling in residential areas of Kuwait, the aim is to create detailed maps illustrating the spatial distribution of these substances. Furthermore, the investigation will utilize GRAPHER software to explore correlations among different chemical parameters. By implementing rigorous scientific methodologies, the research will provide valuable insights for the Ministry of Electricity and Water and the Ministry of Health. These insights can inform evidence-based decision-making, facilitate the implementation of corrective measures, and support strategic planning for future infrastructure activities.
Keywords: heavy metals, bromate, ozonation, GIS
Procedia PDF Downloads 81
20978 Leveraging Digital Transformation Initiatives and Artificial Intelligence to Optimize Readiness and Simulate Mission Performance across the Fleet
Authors: Justin Woulfe
Abstract:
Siloed logistics and supply chain management systems throughout the Department of Defense (DoD) have led to disparate approaches to modeling and simulation (M&S), a lack of understanding of how one system impacts the whole, and issues with “optimal” solutions that are good for one organization but have dramatic negative impacts on another. Many different systems have evolved to try to understand and account for uncertainty and to reduce the consequences of the unknown. As the DoD undertakes expansive digital transformation initiatives, there is an opportunity to fuse and leverage traditionally disparate data into a centrally hosted source of truth. With a streamlined process incorporating machine learning (ML) and artificial intelligence (AI), advanced M&S will enable informed decisions guiding program success via optimized operational readiness and improved mission success. One of the current challenges is to leverage the terabytes of data generated by monitored systems to provide actionable information for all levels of users. The implementation of a cloud-based application that analyzes data transactions, learns and predicts future states from current and past states in real time, and communicates those anticipated states is an appropriate solution for the purposes of reduced latency and improved confidence in decisions. Decisions made with an ML and AI application combined with advanced optimization algorithms will improve the mission success and performance of systems, which will improve the overall cost and effectiveness of any program. The Systecon team constructs and employs model-based simulations, cutting across traditional silos of data, aggregating maintenance and supply data, incorporating sensor information, and applying optimization and simulation methods to an as-maintained digital twin with the ability to aggregate results across a system’s lifecycle and across logical and operational groupings of systems.
This coupling of data throughout the enterprise enables tactical, operational, and strategic decision support, detachable and deployable logistics services, and configuration-based automated distribution of digital technical and product data to enhance supply and logistics operations. As a complete solution, this approach significantly reduces program risk by allowing flexible configuration of data, data relationships, business process workflows, and early test and evaluation, especially budget trade-off analyses. A true capability to tie resources (dollars) to weapon system readiness in alignment with the real-world scenarios a warfighter may experience has been an objective yet to be realized to date. By developing and solidifying an organic capability to directly relate dollars to readiness and to inform the digital twin, the decision-maker is now empowered through valuable insight and traceability. This type of educated decision-making provides an advantage over the adversaries who struggle with maintaining system readiness at an affordable cost. The M&S capability developed allows program managers to independently evaluate system design and support decisions by quantifying their impact on operational availability and operations and support cost, resulting in the ability to simultaneously optimize readiness and cost. This will allow the stakeholders to make data-driven decisions when trading cost and readiness throughout the life of the program. Finally, sponsors are available to validate product deliverables with efficiency and much higher accuracy than in previous years.
Keywords: artificial intelligence, digital transformation, machine learning, predictive analytics
Procedia PDF Downloads 160
20977 Modeling and Simulation of Fluid Catalytic Cracking Process
Authors: Sungho Kim, Dae Shik Kim, Jong Min Lee
Abstract:
The fluid catalytic cracking (FCC) process, the focus of this paper, is one of the most important processes in the modern refinery industry. Because the FCC process is difficult to model well, owing to its nonlinearities and the various interactions between its process variables, rigorous process modeling of the whole FCC plant is needed for control and plant-wide optimization. In this study, a process design for an FCC plant comprising a riser reactor, a main fractionator, and a gas processing unit was developed. The reactor model was described based on a four-lumped kinetic scheme. The main fractionator, gas processing unit and other process units were designed to reproduce real plant data using a process flowsheet simulator, Aspen PLUS. The custom reactor model was integrated with the process flowsheet simulator to develop an integrated process model.
Keywords: fluid catalytic cracking, simulation, plant data, process design
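A four-lumped kinetic scheme of the kind used for the riser reactor can be sketched with a simple Euler integration. The lump definitions follow the common gas oil/gasoline/light gas/coke convention, and the rate constants below are hypothetical, not fitted plant values:

```python
def four_lump_fcc(y0, k, dt=0.001, t_end=1.0):
    """Euler integration of a simplified four-lump FCC kinetic scheme:
    gas oil (A) cracks to gasoline (G), light gas (L) and coke (C) via
    second-order kinetics, and gasoline overcracks to light gas and coke
    via first-order kinetics.  Rate constants are hypothetical."""
    a, g, l, c = y0
    k1, k2, k3, k4, k5 = k   # A->G, A->L, A->C, G->L, G->C
    steps = int(round(t_end / dt))
    for _ in range(steps):
        a2 = a * a
        da = -(k1 + k2 + k3) * a2 * dt        # gas oil consumption
        dg = (k1 * a2 - (k4 + k5) * g) * dt   # gasoline: formed, overcracked
        dl = (k2 * a2 + k4 * g) * dt          # light gas accumulation
        dc = (k3 * a2 + k5 * g) * dt          # coke accumulation
        a, g, l, c = a + da, g + dg, l + dl, c + dc
    return a, g, l, c

# Start from pure gas oil; mass fractions should stay normalised.
a, g, l, c = four_lump_fcc((1.0, 0.0, 0.0, 0.0), (0.7, 0.15, 0.05, 0.1, 0.02))
```

Because the four rate terms sum to zero by construction, total mass is conserved at every Euler step, which is a quick sanity check on any lumped kinetic implementation.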
Procedia PDF Downloads 529
20976 Efficient Subsurface Mapping: Automatic Integration of Ground Penetrating Radar with Geographic Information Systems
Authors: Rauf R. Hussein, Devon M. Ramey
Abstract:
Integrating Ground Penetrating Radar (GPR) with Geographic Information Systems (GIS) can provide valuable insights for various applications, such as archaeology, transportation, and utility locating. Although there has been progress toward automating the integration of GPR data with GIS, fully automatic integration has not been achieved yet. Additionally, manually integrating GPR data with GIS can be a time-consuming and error-prone process. In this study, actual, real-world GPR applications are presented, and a software named GPR-GIS 10 is created to interactively extract subsurface targets from GPR radargrams and automatically integrate them into GIS. With this software, it is possible to quickly and reliably integrate the two techniques to create informative subsurface maps. The results indicated that automatic integration of GPR with GIS can be an efficient tool to map and view any subsurface targets in their appropriate location in a 3D space with the needed precision. The findings of this study could help GPR-GIS integrators save time and reduce errors in many GPR-GIS applications.
Keywords: GPR, GIS, GPR-GIS 10, drone technology, automation
Procedia PDF Downloads 92
20975 The Relationship among Perceived Risk, Product Knowledge, Brand Image and the Insurance Purchase Intention of Taiwanese Working Holiday Youths
Authors: Wan-Ling Chang, Hsiu-Ju Huang, Jui-Hsiu Chang
Abstract:
In 2004, the Ministry of Foreign Affairs of Taiwan launched ‘An Arrangement on Working Holiday Scheme’ with 15 countries, including New Zealand, Japan, Canada, Germany, South Korea, Britain and Australia. The aim of the scheme is to allow young people to work and to study English or other foreign languages. Each year, 30,000 Taiwanese youths apply to participate in the working holiday schemes. However, frequent accidents can cause huge medical expenses and post-delivery fees, which are usually unaffordable for most families. Therefore, this study explored the relationships among perceived risk toward working holidays, insurance product knowledge, brand image and insurance purchase intention for Taiwanese youths who plan to apply for a working holiday. A survey questionnaire was distributed for data collection, and a total of 316 questionnaires were collected for analysis. Data were analyzed using descriptive statistics, independent-samples t-tests, one-way ANOVA, correlation analysis, regression analysis and hierarchical regression, together with hypothesis testing. The results indicate that perceived risk has a negative influence on insurance purchase intention, whereas product knowledge and brand image have a positive influence on insurance purchase intention. Based on these results, practical implications are addressed for insurance companies developing future marketing plans.
Keywords: insurance product knowledge, insurance purchase intention, perceived risk, working holiday
Procedia PDF Downloads 250
20974 The Evaluation of the Restructuring Process in Nursing Services by Nurses
Authors: Bilgen Özlük, Ülkü Baykal
Abstract:
The study was conducted with the aim of determining nurses' evaluations of the restructuring process carried out in the nursing services of a private hospital and revealing how they were affected by this process. It combined a prospective approach with quantitative and qualitative research methods in a comparative design, comparing the changes over a period of three years. The sample for the study comprised all of the nurses employed at a private hospital; data were collected from 17 nurses (a total of 30 interviews) for the qualitative part and from 377 nurses in 2013 and 429 nurses in 2014 for the quantitative part. As data collection instruments, the study used a form identifying the changes in the organisational and management structure of the hospital, a nurses' interview form, a questionnaire identifying the personal and occupational characteristics of the nurses, the "Minnesota Job Satisfaction Scale", the "Organisational Citizenship Behaviour Scale" and the "Organisational Trust Scale". Qualitative data were analysed by the researchers; quantitative data were analysed using number and percentage tests, t-tests, ANOVA, Tukey post-hoc tests and regression analysis.
In the qualitative part of the study, the nurses stated in the first year of the restructuring that they were satisfied with their relationship with top-level management, the increases in salaries and changes in the working environment such as the increase in the number of staff. In later years, however, they stated that their satisfaction levels had fallen for reasons such as dissatisfaction with the appointment of a director of nursing services who was not a nurse practitioner, the appointment of persons from outside the nursing profession to management positions and the lack of appropriate training and competence of these persons, increases in workload, insufficient salaries, and the lack of a difference between the salaries of senior and more junior staff. In the quantitative part, on the other hand, it was found that there was no difference in the levels of job satisfaction and organisational trust between the two years, that job satisfaction increased as organisational trust increased, and that job satisfaction and organisational trust had a positive impact on organisational citizenship behaviour.
Keywords: services, nursing management, re-structuring, job satisfaction, organisational citizenship behaviour, organisational trust
Procedia PDF Downloads 356
20973 Agriculture Yield Prediction Using Predictive Analytic Techniques
Authors: Nagini Sabbineni, Rajini T. V. Kanth, B. V. Kiranmayee
Abstract:
India’s economy primarily depends on agriculture yield growth and the products of its allied agro-industries. Agriculture yield prediction is a difficult task for agricultural departments across the globe, because the yield depends on many factors. In countries like India in particular, the majority of agricultural growth depends on rain water, which is highly unpredictable. Agricultural growth depends on parameters such as water, nitrogen, weather, soil characteristics, crop rotation, soil moisture, surface temperature and rainfall. In our paper, extensive exploratory data analysis is carried out, and various predictive models are designed. Further, various regression models, such as linear, multiple linear and non-linear models, are tested for the effective prediction or forecast of the agriculture yield of various crops in the Andhra Pradesh and Telangana states.
Keywords: agriculture yield growth, agriculture yield prediction, exploratory data analysis, predictive models, regression models
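The simplest of the regression models mentioned, a linear fit of yield against a single parameter such as rainfall, can be sketched via ordinary least squares. The rainfall and yield figures below are invented for illustration, not from the study's data:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = b0 + b1 * x, the simplest of the
    regression models listed above."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    b0 = my - b1 * mx
    return b0, b1

rain = [600, 700, 800, 900, 1000]    # mm of rainfall (hypothetical)
yield_t = [2.1, 2.6, 3.0, 3.5, 3.9]  # tonnes/ha (hypothetical)
b0, b1 = fit_linear(rain, yield_t)
```

Multiple linear and non-linear variants extend this by adding further parameters (nitrogen, soil moisture, temperature) or transformed terms, which is the comparison the paper carries out.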
Procedia PDF Downloads 314
20972 Enhanced Stability of Piezoelectric Crystalline Phase of Poly(Vinylidene Fluoride) (PVDF) and Its Copolymer upon Epitaxial Relationships
Authors: Devi Eka Septiyani Arifin, Jrjeng Ruan
Abstract:
As an approach to manipulating the performance of polymer thin films, epitaxial crystallization within polymer blends of poly(vinylidene fluoride) (PVDF) and its copolymer poly(vinylidene fluoride-trifluoroethylene) P(VDF-TrFE) was studied in this research, which involves the competition between phase separation and crystal growth of the constituent semicrystalline polymers. The unique piezoelectric feature of the poly(vinylidene fluoride) crystalline phase derives from the packing of molecular chains in the all-trans conformation, which spatially arranges all the substituted fluorine atoms on one side of the molecular chain and the hydrogen atoms on the other side. A net dipole moment is therefore induced across the lateral packing of the molecular chains. Nevertheless, due to the mutual repulsion among the fluorine atoms, this all-trans molecular conformation is not stable and readily changes above the Curie temperature, where thermal energy is sufficient to cause segmental rotation. This research attempts to explore whether the epitaxial interactions between the piezoelectric crystals and the crystal lattice of hexamethylbenzene (HMB) crystalline platelets are able to stabilize this metastable all-trans molecular conformation. As an aromatic crystalline compound, the melt of HMB was surprisingly found to be able to dissolve poly(vinylidene fluoride), resulting in a homogeneous eutectic solution. Thus, after quenching this binary eutectic mixture to room temperature, subsequent heating or annealing processes were designed to explore the involved phase separation and crystallization behavior. The phase transition behaviors were observed in situ by X-ray diffraction and differential scanning calorimetry (DSC). The molecular packing was observed via transmission electron microscopy (TEM), and the principles of electron diffraction were applied to study the internal crystal structure epitaxially developed within the thin films.
The obtained results clearly indicated the occurrence of heteroepitaxy of PVDF/P(VDF-TrFE) on HMB crystalline platelets. Both the concentration of poly(vinylidene fluoride) and the mixing ratio of the two constituent polymers were adopted as the influential factors for studying the competition between the epitaxial crystallization of PVDF and P(VDF-TrFE) on the HMB crystals. Furthermore, the involved epitaxial relationship is to be deciphered and studied as a potential factor capable of guiding the widespread adoption of the piezoelectric crystalline form.
Keywords: epitaxy, crystallization, crystalline platelet, thin film, mixing ratio
Procedia PDF Downloads 223
20971 How to Perform Proper Indexing?
Authors: Watheq Mansour, Waleed Bin Owais, Mohammad Basheer Kotit, Khaled Khan
Abstract:
Efficient query processing is one of the utmost requisites in any business environment to satisfy consumer needs. This paper investigates the various types of indexing models, viz. primary, secondary, and multi-level, under the ambit of the various types of queries for which each indexing model performs efficiently. The study also discusses the inherent advantages and disadvantages of each indexing model and how an indexing model can be chosen for a particular environment. The paper draws parallels between the various indexing models and provides recommendations that would help a database administrator zero in on the indexing model suited to the needs and requirements of the production environment. In addition, to satisfy the industry and consumer needs arising from today's colossal data generation, this study proposes two novel indexing techniques that can be used to index highly unstructured and structured Big Data efficiently. The study also briefly discusses some best practices that the industry should follow in order to choose an indexing model apposite to their prerequisites and requirements.
Keywords: indexing, hashing, latent semantic indexing, B-tree
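The basic trade-off the survey turns on, a full scan versus a secondary index on a non-key field, can be sketched in a few lines. The records and field names below are hypothetical:

```python
# Hypothetical table of records.
records = [
    {"id": 1, "city": "Doha"},
    {"id": 2, "city": "Paris"},
    {"id": 3, "city": "Doha"},
]

def scan(city):
    """Unindexed access path: O(n) per query."""
    return [r["id"] for r in records if r["city"] == city]

# Hash-based secondary index on a non-key field: built once in O(n),
# then O(1) average lookup per query.
index = {}
for r in records:
    index.setdefault(r["city"], []).append(r["id"])

def lookup(city):
    """Indexed access path using the secondary hash index."""
    return index.get(city, [])
```

The index trades extra space and maintenance cost on writes for faster equality queries; range queries would instead favour an ordered structure such as a B-tree, which is the kind of workload-dependent choice the paper's recommendations address.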
Procedia PDF Downloads 156
20970 Bluetooth Piconet System for Child Care Applications
Authors: Ching-Sung Wang, Teng-Wei Wang, Zhen-Ting Zheng
Abstract:
This study concerns a safety device designed for child care. When children are out of sight or their caregivers cannot pay constant attention to the situation, the functions of this device immediately inform the caregivers, making sure that the children do not get lost or hurt and thus ensuring their safety. Starting from this concept, a device was produced based on a relatively low-cost Bluetooth piconet system and a three-axis gyroscope sensor. The device transmits data to a mobile phone app through Bluetooth so that the user can monitor the situation at any time. By simply clipping the device in a pocket or on the waist and switching it on, it sends data to the phone to detect falls and the child's distance. Once the child exceeds the angle or distance set in the app, the device issues a warning to inform the phone owner.
Keywords: child care, piconet system, three-axis gyroscope, distance detection, fall detection
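The device's warning logic, as described, reduces to threshold checks on the gyroscope tilt angle and the Bluetooth-estimated distance. A toy sketch with hypothetical threshold values:

```python
def check_child(angle_deg, distance_m, max_angle=60.0, max_distance=10.0):
    """Toy version of the device's warning logic: flag a possible fall
    when the tilt angle from the three-axis gyroscope exceeds a threshold,
    and flag wandering when the Bluetooth-estimated distance exceeds the
    range set in the app.  Thresholds are hypothetical, not from the
    study's hardware."""
    warnings = []
    if angle_deg > max_angle:
        warnings.append("possible fall")
    if distance_m > max_distance:
        warnings.append("out of range")
    return warnings

alerts = check_child(75.0, 4.0)
```

In the real device, the angle would come from integrating the gyroscope readings and the distance from Bluetooth signal strength, both noisier than this sketch suggests.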
Procedia PDF Downloads 597
20969 Jordan, Towards Eliminating Preventable Maternal Deaths
Authors: Abdelmanie Suleimat, Nagham Abu Shaqra, Sawsan Majali, Issam Adawi, Heba Abo Shindi, Anas Al Mohtaseb
Abstract:
The Government of Jordan recognizes that maternal mortality constitutes a grave public health problem. Over the past two decades, there has been significant progress in improving the quality of maternal health services, resulting in improved maternal and child health outcomes. Despite these efforts, the measurement and analysis of maternal mortality remained a challenge, with significant discrepancies from previous national surveys that inhibited accuracy. In response, with support from USAID, the Jordan Maternal Mortality Surveillance Response (JMMSR) System was established to collect and analyze data and to equip policymakers with data for decision-making, guided by interdisciplinary, multi-levelled advisory groups aiming to eliminate preventable maternal deaths. A 2016 Public Health Bylaw required the notification of deaths among women of reproductive age. The JMMSR system was launched in 2018 and continues annually, analyzing data received from health facilities to guide policy to prevent avoidable deaths. To date, there have been four annual national maternal mortality reports (2018-2021). Data are collected, reviewed by advisory groups, and then consolidated in an annual report to inform and guide the Ministry of Health (MOH). JMMSR collects the information necessary to calculate an accurate maternal mortality ratio and assists in identifying the leading causes of and contributing factors to each maternal death. Based on these data, national response plans are created. A monitoring and evaluation plan was designed to define, track, and improve implementation through indicators. Over the past four years, one of these indicators, ‘percent of facilities notifying respective health directorates of all deaths of women of reproductive age,’ increased annually from 82.16% to 92.95%, 92.50%, and finally 97.02%.
The Government of Jordan demonstrated its commitment to the JMMSR system by designating the MOH as the primary host of the system and the lead in the development and dissemination of policies and procedures to standardize implementation. The data were translated into practical, evidence-based recommendations. The successful impact of the results deepened the understanding of maternal mortality in Jordan and convinced the MOH to amend the Bylaw to mandate the electronic reporting of all births and neonatal deaths from health facilities, further empowering the JMMSR system by developing a stillbirth and neonatal mortality surveillance and response system.
Keywords: maternal health, maternal mortality, preventable maternal deaths, maternal morbidity
Procedia PDF Downloads 38
20968 Performance Evaluation and Comparison between the Empirical Mode Decomposition, Wavelet Analysis, and Singular Spectrum Analysis Applied to the Time Series Analysis in Atmospheric Science
Authors: Olivier Delage, Hassan Bencherif, Alain Bourdier
Abstract:
Signal decomposition approaches represent an important step in time series analysis, providing useful knowledge and insight into the data and the characteristics of the underlying dynamics while also facilitating tasks such as noise removal and feature extraction. As most observational time series are nonlinear and nonstationary, resulting from the interaction of several physical processes at different time scales, experimental time series fluctuate at all time scales and require the development of specific signal decomposition techniques. The most commonly used techniques are data-driven, enabling well-behaved signal components to be obtained without making any prior assumptions about the input data. Among the most popular time series decomposition techniques cited in the literature are the empirical mode decomposition and its variants, the empirical wavelet transform, and singular spectrum analysis. With the increasing popularity and utility of these methods in wide-ranging applications, it is imperative to gain a good understanding of and insight into the operation of these algorithms. In this work, we describe all of the techniques mentioned above, as well as their ability to denoise signals, capture trends, identify components corresponding to the physical processes involved in the evolution of the observed system, and deduce the dimensionality of the underlying dynamics. Results obtained with all of these methods on experimental total ozone column and rainfall time series are discussed and compared.
Keywords: denoising, empirical mode decomposition, singular spectrum analysis, time series, underlying dynamics, wavelet analysis
Procedia PDF Downloads 116
20967 Discriminating Between Energy Drinks and Sports Drinks Based on Their Chemical Properties Using Chemometric Methods
Authors: Robert Cazar, Nathaly Maza
Abstract:
Energy drinks and sports drinks are quite popular among young adults and teenagers worldwide. Some concerns regarding their health effects – particularly those of the energy drinks – have been raised based on scientific findings. Differentiating between these two types of drinks by means of their chemical properties is therefore an instructive task, and chemometrics provides the most appropriate strategy to do so. In this study, a discrimination analysis of energy and sports drinks was carried out applying chemometric methods. A set of eleven samples of commercially available brands of drinks – seven energy drinks and four sports drinks – was collected. Each sample was characterized by eight chemical variables (carbohydrates, energy, sugar, sodium, pH, degrees Brix, density, and citric acid). The data set was standardized and examined by exploratory chemometric techniques such as clustering and principal component analysis. As a preliminary step, variable selection was carried out by inspecting the variable correlation matrix. Some variables were found to be redundant, so they could safely be removed, leaving five variables that are sufficient for this analysis: sugar, sodium, pH, density, and citric acid. Then, hierarchical clustering employing the average-linkage criterion and the Euclidean distance metric was performed. It perfectly separates the two types of drinks, since the resulting dendrogram, cut at the 25% similarity level, sorts the samples into two well-defined groups, one containing the energy drinks and the other the sports drinks. Further assurance of the complete discrimination is provided by the principal component analysis: the projection of the data set onto the first two principal components – which retain 71% of the data information – permits visualization of the distribution of the samples in the two groups identified in the clustering stage.
Since the first principal component is the discriminating one, inspection of its loadings makes it possible to characterize these groups. The energy-drink group possesses medium to high values of density, citric acid, and sugar, whereas the sports-drink group exhibits low values of those variables. In conclusion, the application of chemometric methods to a data set comprising chemical properties of a number of energy and sports drinks provides an accurate, dependable way to discriminate between these two types of beverages.
Keywords: chemometrics, clustering, energy drinks, principal component analysis, sports drinks
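The pre-treatment and distance computation underlying the clustering step above (autoscaling each variable, then Euclidean distances between samples) can be sketched as follows. The drink compositions below are invented, not the study's eleven samples:

```python
import math

def standardize(columns):
    """Autoscale each variable to zero mean and unit variance, the usual
    chemometric pre-treatment before clustering or PCA."""
    out = []
    for col in columns:
        m = sum(col) / len(col)
        s = math.sqrt(sum((v - m) ** 2 for v in col) / len(col))
        out.append([(v - m) / s for v in col])
    return out

def euclidean(a, b):
    """Euclidean distance between two standardized samples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Rows: drinks; columns: sugar, sodium, pH, density, citric acid
# (hypothetical values in the spirit of the five retained variables).
data = [
    [11.0, 40.0, 3.3, 1.05, 0.30],   # energy drink
    [10.5, 45.0, 3.4, 1.04, 0.28],   # energy drink
    [4.0, 110.0, 4.1, 1.01, 0.10],   # sports drink
]
cols = standardize(list(zip(*data)))
samples = list(zip(*cols))
```

On these distances, average-linkage clustering would merge the two energy drinks first, since their within-type distance is far smaller than the distance to the sports drink, which is exactly the separation the dendrogram in the study shows.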
Procedia PDF Downloads 108
20966 The Hallmarks of War Propaganda: The Case of Russia-Ukraine Conflict
Authors: Veronika Solopova, Oana-Iuliana Popescu, Tim Landgraf, Christoph Benzmüller
Abstract:
Beginning in 2014, slowly building geopolitical tensions in Eastern Europe led to a full-blown conflict between the Russian Federation and Ukraine that generated an unprecedented amount of news articles and social media data reflecting the opposing ideologies and narratives that form the background and essence of the ongoing war. These polarized informational campaigns have led to countless mutual accusations of misinformation and fake news, shaping an atmosphere of confusion and mistrust for many readers all over the world. In this study, we analyzed scraped news articles from Ukrainian, Russian, Romanian and English-speaking news outlets on the eve of 24 February 2022 and on day five of the conflict (28 February) to see how the media influenced and mirrored the changes in public opinion. We also contrast the sources opposing and supporting the stance of the Russian government in the Ukrainian, Russian and Romanian media spaces. In a data-driven way, we describe how the narratives spread throughout Eastern and Central Europe, and we present linguistic features predictive of war propaganda. Our results indicate strong similarities in the rhetorical strategies of the pro-Kremlin media in both Ukraine and Russia, which, while relatively neutral in surface structure, use aggressive vocabulary. This suggests that automatic propaganda identification systems have to be tailored to each new case, as they have to rely on situationally specific words. Both Ukrainian and Russian outlets lean towards strongly opinionated news, pointing towards the use of war propaganda to achieve strategic goals.
Keywords: linguistics, news, propaganda, Russia, Ukraine
Procedia PDF Downloads 120
20965 Theoretical Framework for Value Creation in Project Oriented Companies
Authors: Mariusz Hofman
Abstract:
The paper ‘Theoretical Framework for Value Creation in Project-Oriented Companies’ is designed to determine how organisations create value and whether this allows them to achieve market success. An assumption has been made that there are two routes to achieving this value. The first is to create intangible assets (i.e. the resources of human, structural and relational capital), while the other is to create added value (understood as the surplus of revenue over costs). It has also been assumed that the combination of the achieved added value and unique intangible assets translates into the success of a project-oriented company. The purpose of the paper is to present a hypothetico-deductive model describing the modus operandi of such companies and an approach to operationalising the model. All the latent variables included in the model are theoretical constructs with observational indicators (measures). The existence of the latent variables (constructs) and of the submodels will be confirmed based on a covariance matrix, which in turn is based on empirical data in the form of a set of observational indicators (measures). This will be achieved with a confirmatory factor analysis (CFA). Through this statistical procedure, it will be verified whether the matrix arising from the adopted theoretical model differs statistically from the empirical covariance matrix arising from the system of equations. The fit of the model to the empirical data will be evaluated using χ2, RMSEA and the CFI (Comparative Fit Index). If the theoretical conjectures are confirmed, an interesting development path can be defined for project-oriented companies. This will let such organisations perform efficiently in the face of growing competition and pressure for innovation.
Keywords: value creation, project-oriented company, structural equation modelling
Procedia PDF Downloads 297
20964 Fashion Consumption for Fashion Innovators: A Study of Fashion Consumption Behavior of Innovators and Non-Innovators
Authors: Vaishali P. Joshi, Pallav Joshi
Abstract:
The objective of this study is to examine the differences between fashion innovators and non-innovators in their fashion consumption behavior, in terms of their pre-purchase, purchase and post-purchase behavior. A questionnaire containing questions on fashion innovativeness and fashion consumption behavior was distributed to female college students for the first part of the study. The sample comprised 81 college females aged 18 through 30 who were pursuing a Business Management degree. A series of attitude questions was used to categorize respondents on the Innovativeness Scale: 32 respondents with a score of 21 or above were designated as fashion innovators, and the remainder (49) as non-innovators. Data were analyzed through frequency distribution tables. Findings showed that significant differences exist between innovators and non-innovators in their pre-purchase, actual purchase, and post-purchase behavior.
Keywords: fashion, innovativeness, consumption behavior, purchase
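The scale-based split described above amounts to a cutoff on summed attitude scores followed by a frequency count; a sketch with simulated scores (not the survey data):

```python
import random
from collections import Counter

random.seed(7)
# Hypothetical attitude-scale totals for 81 respondents (higher = more innovative).
scores = [random.randint(10, 30) for _ in range(81)]

CUTOFF = 21  # respondents scoring 21 or above are classed as fashion innovators
groups = ["innovator" if s >= CUTOFF else "non-innovator" for s in scores]
freq = Counter(groups)
print(freq["innovator"], freq["non-innovator"])
```

A frequency distribution table per behavior item, split by these two groups, is then a cross-tabulation of `groups` against each questionnaire response.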
Procedia PDF Downloads 560
20963 Parallel Genetic Algorithms Clustering for Handling Recruitment Problem
Authors: Walid Moudani, Ahmad Shahin
Abstract:
This research presents a study of a recruitment services system. It aims to enhance a business intelligence system by embedding data mining in its core engine and to facilitate the link between job searchers and recruiting companies. The purpose of this study is to present an intelligent management system for supporting recruitment services based on data mining methods. It applies segmentation to the job postings offered by the different recruiters. The details of the job postings are associated with a set of relevant features extracted from the web, based on critical criteria, in order to define consistent clusters. Thereafter, we assign job searchers to the best cluster while providing a ranking according to the job postings of the selected cluster. The performance of the proposed model is analyzed, based on a real case study, with the clustered job postings dataset and the classified job searchers dataset, using several metrics.
Keywords: job postings, job searchers, clustering, genetic algorithms, business intelligence
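The segmentation step can be sketched as a small genetic algorithm that evolves sets of cluster centroids over job-posting feature vectors; the two-dimensional features below are synthetic stand-ins for the features extracted from the web, and all GA settings are illustrative:

```python
import random

random.seed(0)

# Toy job-posting feature vectors (e.g., normalized salary, required experience).
postings = ([(random.gauss(0.2, 0.05), random.gauss(0.3, 0.05)) for _ in range(30)] +
            [(random.gauss(0.8, 0.05), random.gauss(0.7, 0.05)) for _ in range(30)])

K = 2  # number of clusters

def sse(centroids):
    """Within-cluster sum of squared errors; lower means fitter individual."""
    total = 0.0
    for x, y in postings:
        total += min((x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centroids)
    return total

def random_individual():
    return [(random.random(), random.random()) for _ in range(K)]

def mutate(ind, sigma=0.05):
    return [(cx + random.gauss(0, sigma), cy + random.gauss(0, sigma)) for cx, cy in ind]

def crossover(a, b):
    # Uniform crossover: each centroid is inherited from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

population = [random_individual() for _ in range(30)]
for _ in range(60):  # generations
    population.sort(key=sse)
    survivors = population[:10]  # elitist selection keeps the best unchanged
    population = survivors + [mutate(crossover(random.choice(survivors),
                                               random.choice(survivors)))
                              for _ in range(20)]

best = min(population, key=sse)
print(round(sse(best), 3))
```

In a parallel setting, the fitness evaluations (the `sse` calls) are the natural unit to distribute across workers, since each individual is scored independently.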
Procedia PDF Downloads 329
20962 Looking for a Connection between Oceanic Regions with Trends in Evaporation with Continental Ones with Trends in Precipitation through a Lagrangian Approach
Authors: Raquel Nieto, Marta Vázquez, Anita Drumond, Luis Gimeno
Abstract:
One of the hot spots of climate change is the increase of ocean evaporation. The best available estimate of evaporation, the OAFlux data, shows strong increasing trends in evaporation from the oceans since 1978, with peaks during the hemispheric winter and the strongest trends along the paths of the global western boundary currents and in inner seas. The transport of moisture from oceanic sources to the continents is the link between evaporation from the ocean and precipitation over the continents. A key question is to relate the evaporative source regions over the oceans where trends have occurred in the last decades with their sinks over the continents, to check whether there have also been trends in precipitation amount or its characteristics. A Lagrangian approach based on FLEXPART and ERA-Interim data is used to establish this connection. The analyzed period is 1980 to 2012. Results show that there is no general pattern, but significant agreement was found in important areas of climate interest.
Keywords: ocean evaporation, Lagrangian approaches, continental precipitation, Europe
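At the core of such FLEXPART-based diagnostics is the per-parcel moisture budget: evaporation minus precipitation over a time step equals the parcel mass times the change in its specific humidity along the trajectory. A minimal sketch with invented humidity values:

```python
# Core of the Lagrangian diagnostic: for one air parcel, E-P over a time step
# is the parcel mass times the change in specific humidity q along the
# trajectory (the numbers below are invented for illustration).
def e_minus_p(q_series, mass):
    """Per-step freshwater flux E-P (kg) for one parcel; positive values mark
    net evaporation into the parcel, negative values net precipitation."""
    return [mass * (q1 - q0) for q0, q1 in zip(q_series, q_series[1:])]

# 6-hourly specific humidity (kg/kg): moistening over the ocean, then rain-out.
q = [0.0060, 0.0068, 0.0075, 0.0071, 0.0064]
flux = e_minus_p(q, mass=1.0e9)
print(["%+.0f" % f for f in flux])  # → ['+800000', '+700000', '-400000', '-700000']
```

Summing these per-parcel contributions over all parcels in a column, source regions (net positive) and sink regions (net negative) can be mapped and their trends compared.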
Procedia PDF Downloads 256
20961 Tools for Analysis and Optimization of Standalone Green Microgrids
Authors: William Anderson, Kyle Kobold, Oleg Yakimenko
Abstract:
Green microgrids, which use mostly renewable energy (RE) for generation, are complex systems with inherent nonlinear dynamics. Among the variety of available optimization tools, only a few adequately account for this complexity. This paper evaluates the applicability of two somewhat similar optimization tools tailored for standalone RE microgrids and also assesses a machine learning tool for performance prediction that can enhance the reliability of any chosen optimization tool. It shows that one of these microgrid optimization tools has certain advantages over the other and presents a detailed routine for preparing input data to simulate RE microgrid behavior. The paper also shows how neural-network-based predictive modeling can be used to validate and forecast solar power generation based on weather time series data, which improves the overall quality of standalone RE microgrid analysis.
Keywords: microgrid, renewable energy, complex systems, optimization, predictive modeling, neural networks
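In place of the neural-network predictor, the weather-to-power relationship can be sketched with a plain batch-gradient-descent regressor on synthetic data; the generation model and every coefficient below are invented for illustration, not taken from the paper:

```python
import math
import random

random.seed(1)
# Synthetic weather time series: (irradiance, cloud fraction) -> PV output (kW).
data = []
for _ in range(200):
    irr = random.uniform(0.0, 1.0)
    cloud = random.uniform(0.0, 1.0)
    power = 5.0 * irr * (1.0 - 0.7 * cloud) + random.gauss(0.0, 0.05)
    data.append(((irr, cloud, irr * cloud), power))  # irr*cloud makes the target linear in features

w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(5000):  # batch gradient descent on mean squared error
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        err = sum(wi * xi for wi, xi in zip(w, x)) + b - y
        for i in range(3):
            gw[i] += err * x[i]
        gb += err
    w = [wi - lr * gi / len(data) for wi, gi in zip(w, gw)]
    b -= lr * gb / len(data)

rmse = math.sqrt(sum((sum(wi * xi for wi, xi in zip(w, x)) + b - y) ** 2
                     for x, y in data) / len(data))
print(round(rmse, 3))  # should approach the 0.05 noise level once converged
```

A neural network replaces the hand-chosen `irr * cloud` interaction feature with learned hidden units, which is what makes it attractive for weather series whose structure is not known in advance.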
Procedia PDF Downloads 282
20960 A Wall Law for Two-Phase Turbulent Boundary Layers
Authors: Dhahri Maher, Aouinet Hana
Abstract:
The presence of bubbles in the boundary layer introduces corrections into the log law, which must be taken into account. In this work, a logarithmic wall law is presented for bubbly two-phase flows. The wall law is based on the postulation of an additional turbulent viscosity associated with bubble wakes in the boundary layer. It contains an empirical constant accounting both for shear-induced turbulence interaction and for bubble non-linearity; this constant was deduced from experimental data. The wall friction predicted with the wall law was compared to experimental data for a turbulent boundary layer developing on a vertical flat plate in the presence of millimetric bubbles. Very good agreement between experimental and numerical wall friction predictions was observed, especially at low void fraction, when bubble-induced turbulence plays a significant role.
Keywords: bubbly flows, log law, boundary layer, CFD
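The kind of correction described can be sketched as a shifted log law; the constants below are standard single-phase values, and the additive form of the bubble term is an illustrative assumption, not the paper's fitted expression:

```python
import math

KAPPA = 0.41  # von Karman constant
B = 5.2       # single-phase log-law intercept

def u_plus(y_plus, c_bubble=0.0):
    """Log-law velocity with an empirical bubble-wake correction.

    c_bubble lumps shear-induced turbulence interaction and bubble
    non-linearity; c_bubble = 0 recovers the single-phase law. The form
    (1/kappa) ln(y+) + B - c_bubble is a sketch, not the paper's equation.
    """
    return math.log(y_plus) / KAPPA + B - c_bubble

print(round(u_plus(100.0), 2))                # → 16.43, single-phase profile
print(round(u_plus(100.0, c_bubble=1.5), 2))  # → 14.93, bubbly boundary layer
```

A downward shift of u+ at fixed y+ corresponds to increased wall friction, which is why the empirical constant can be calibrated directly against measured skin friction.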
Procedia PDF Downloads 278
20959 Secure Intelligent Information Management by Using a Framework of Virtual Phones-On Cloud Computation
Authors: Mohammad Hadi Khorashadi Zadeh
Abstract:
Many new applications and internet services have emerged since the advent of mobile networks and devices. However, these applications suffer from problems of security, management, and performance in business environments. Cloud systems provide information transfer, management facilities, and security for virtual environments. Therefore, an innovative internet service and a business model are proposed in the present study for creating a secure and consolidated environment for managing the mobile information of organizations, based on cloud virtual phone (CVP) infrastructures. Using this method, users can run Android and web applications in the cloud, which enhances performance by connecting to other CVP users and increases privacy. It is possible to combine the CVP with distributed protocols and central control that mimic the behavior of human societies. This mix helps in dealing with sensitive data on mobile devices and facilitates data management with less application overhead.
Keywords: BYOD, mobile cloud computing, mobile security, information management
Procedia PDF Downloads 317
20958 Recurrent Neural Networks for Classifying Outliers in Electronic Health Record Clinical Text
Authors: Duncan Wallace, M-Tahar Kechadi
Abstract:
In recent years, Machine Learning (ML) approaches have been successfully applied to the analysis of patient symptom data in the context of disease diagnosis, at least where such data is well codified. However, much of the data present in Electronic Health Records (EHR) is unlikely to prove suitable for classic ML approaches. Furthermore, as these data are widely spread across both hospitals and individuals, a decentralized, computationally scalable methodology is a priority. The focus of this paper is to develop a method to predict outliers in an out-of-hours healthcare provision center (OOHC). In particular, our research is based upon the early identification of patients who have underlying conditions that will cause them to repeatedly require medical attention. An OOHC provides ad-hoc triage and treatment, where interactions occur without recourse to a full medical history of the patient in question. Medical histories relating to patients contacting an OOHC may reside in several distinct EHR systems in multiple hospitals or surgeries, which are unavailable to the OOHC in question. As such, although a local solution is optimal for this problem, the data under investigation is incomplete, heterogeneous, and comprised mostly of noisy textual notes compiled during routine OOHC activities. Through the use of Deep Learning methodologies, the aim of this paper is to provide the means to identify patient cases, upon initial contact, which are likely to relate to such outliers. To this end, we compare the performance of Long Short-Term Memory, Gated Recurrent Units, and combinations of both with Convolutional Neural Networks. A further aim of this paper is to elucidate the discovery of such outliers by examining the exact terms which provide a strong indication of positive and negative case entries. While free text is the principal data extracted from EHRs for classification, EHRs also contain normalized features.
Although the specific demographic features treated within our corpus are relatively limited in scope, we examine whether it is beneficial to include such features among the inputs to our neural network, or whether these features are more successfully exploited in conjunction with a different form of classifier. To this end, we compare the performance of randomly generated regression trees and support vector machines, and determine the extent to which our classification program can be improved by using either of these machine learning approaches in conjunction with the output of our Recurrent Neural Network application. The output of our neural network is also used to help determine the most significant lexemes present within the corpus for identifying high-risk patients. By combining the confidence of our classification program for lexemes within true positive and true negative cases with an inverse document frequency of the lexemes related to these cases, we can determine which features act as the primary indicators of frequent-attender and non-frequent-attender cases, providing a human-interpretable appreciation of how our program classifies cases.
Keywords: artificial neural networks, data-mining, machine learning, medical informatics
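The confidence-times-IDF ranking of lexemes can be illustrated with a toy corpus; the notes, confidences, and the exact combining rule below are a simplified stand-in for the paper's scheme, not its implementation:

```python
import math
from collections import Counter

# Toy corpus: (note text, classifier confidence that the case is a frequent attender).
notes = [
    ("chronic pain repeat visit medication", 0.92),
    ("chest pain referred ambulance", 0.15),
    ("anxiety repeat caller reassurance", 0.88),
    ("minor laceration dressing applied", 0.05),
]

doc_freq = Counter()
for text, _ in notes:
    doc_freq.update(set(text.split()))

n_docs = len(notes)

def significance(term):
    """Mean classifier confidence of notes containing the term, weighted by
    inverse document frequency -- common words in mixed contexts score low."""
    hits = [conf for text, conf in notes if term in text.split()]
    if not hits:
        return 0.0
    idf = math.log(n_docs / doc_freq[term])
    return (sum(hits) / len(hits)) * idf

print(round(significance("repeat"), 3), round(significance("pain"), 3))  # → 0.624 0.371
```

Here "repeat" outranks "pain": both occur twice, but "pain" appears in high- and low-confidence notes alike, so it is a weaker indicator of frequent-attender cases.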
Procedia PDF Downloads 131
20957 Improving Road Infrastructure Safety Management Through Statistical Analysis of Road Accident Data. Case Study: Streets in Bucharest
Authors: Dimitriu Corneliu-Ioan, Gheorghe Frațilă
Abstract:
Romania has one of the highest road death rates among European Union Member States, and there is concern that the country will not meet its goal of "zero deaths" by 2050. The European Union also aims to halve the number of people seriously injured in road accidents by 2030. Therefore, there is a need to improve road infrastructure safety management in Romania. The aim of this study is to analyze road accident data through statistical methods to assess the current state of road infrastructure safety in Bucharest. The study also aims to identify trends and make forecasts regarding serious road accidents and their consequences. The objective is to provide insights that can help prioritize measures to increase road safety, particularly in urban areas. The research utilizes statistical analysis methods, including exploratory analysis and descriptive statistics. Databases from the Traffic Police and the Romanian Road Authority are analyzed using Excel. Road risks are compared with the main causes of road accidents to identify correlations. The study emphasizes the need for better quality and more diverse collection of road accident data for effective analysis in the field of road infrastructure engineering. The research findings highlight the importance of prioritizing measures to improve road safety in urban areas, where serious accidents and their consequences are more frequent. There is a correlation between the measures ordered by road safety auditors and the main causes of serious accidents in Bucharest. The study also reveals the significant social costs of road accidents, amounting to approximately 3% of GDP, emphasizing the need for collaboration between local and central administrations in allocating resources for road safety. This research contributes to a clearer understanding of the current road infrastructure safety situation in Romania.
The findings provide critical insights that can aid decision-makers in allocating resources efficiently and cooperating institutionally to achieve sustainable road safety. The data used for this study were collected from the Traffic Police and the Romanian Road Authority, and data processing involved exploratory analysis and descriptive statistics in Excel. The analysis allows for a better understanding of the factors contributing to the current road safety situation and helps inform managerial decisions to eliminate or reduce road risks. The study addresses the state of road infrastructure safety in Bucharest, analyzes trends and forecasts regarding serious road accidents and their consequences, and studies the correlation between road safety measures and the main causes of serious accidents. To improve road safety, cooperation between local and central administrations towards joint financial efforts is important. This research highlights the need for statistical data processing methods to substantiate managerial decisions in road infrastructure management, and emphasizes the importance of improving the quality and diversity of road accident data collection. The research findings provide a critical perspective on the current road safety situation in Romania and offer insights for identifying appropriate solutions to reduce the number of serious road accidents in the future.
Keywords: road death rate, strategic objective, serious road accidents, road safety, statistical analysis
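The descriptive-statistics-plus-trend workflow described above can be sketched in a few lines; the accident counts are invented placeholders, not the Traffic Police figures:

```python
import statistics

# Hypothetical yearly counts of serious road accidents on Bucharest streets
# (illustrative numbers only).
years = list(range(2016, 2024))
accidents = [412, 398, 405, 371, 290, 334, 310, 295]

print(statistics.mean(accidents), statistics.median(accidents))  # → 351.875 352.5

# Least-squares slope: the average yearly change, usable for a naive forecast.
xm, ym = statistics.mean(years), statistics.mean(accidents)
slope = (sum((x - xm) * (y - ym) for x, y in zip(years, accidents))
         / sum((x - xm) ** 2 for x in years))
forecast_2025 = ym + slope * (2025 - xm)
print(round(slope, 1), round(forecast_2025))  # → -18.5 250
```

The same mean/median/slope computations map directly onto Excel's AVERAGE, MEDIAN, and SLOPE functions used in the study's processing.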
Procedia PDF Downloads 84
20956 Rotterdam in Transition: A Design Case for a Low-Carbon Transport Node in Lombardijen
Authors: Halina Veloso e Zarate, Manuela Triggianese
Abstract:
The urban challenges posed by rapid population growth, climate adaptation, and sustainable living have compelled Dutch cities to reimagine their built environment and transportation systems. As a pivotal contributor to CO₂ emissions, the transportation sector in the Netherlands demands innovative solutions for transitioning to low-carbon mobility. This study investigates the potential of transit-oriented development (TOD) as a strategy for achieving carbon reduction and sustainable urban transformation. Focusing on the Lombardijen station area in Rotterdam, which is targeted for significant densification, this paper presents a design-oriented exploration of a low-carbon transport node. Employing a research-by-design methodology, the study examines multifaceted factors and scales, aiming to propose future scenarios for Lombardijen. A robust design framework emerges from a synthesis of existing literature, applied research, and practical insights. To inform this framework, governmental data concerning the built environment and material embodied carbon are harnessed. However, the restricted access to crucial datasets, such as property ownership information from the cadastre and embodied carbon data from De Nationale Milieudatabase, underscores the need for improved data accessibility, especially during the concept design phase. The findings of this research contribute fundamental insights not only to the Lombardijen case but also to TOD studies across Rotterdam's 13 nodes and similar global contexts. Spatial data related to property ownership facilitated the identification of potential densification sites, underscoring its importance for informed urban design decisions. Additionally, the paper highlights the disparity between the essential role of embodied carbon data in environmental assessments for building permits and its limited accessibility due to proprietary barriers.
Although this study lays the groundwork for sustainable urbanization through TOD-based design, it acknowledges an area of future research worthy of exploration: the socio-economic dimension. Given the complex socio-economic challenges inherent in the Lombardijen area, extending beyond spatial constraints, a comprehensive approach demands the integration of mobility infrastructure expansion, land-use diversification, programmatic enhancements, and climate adaptation. While the paper adopts a TOD lens, it refrains from an in-depth examination of issues concerning equity and inclusivity, opening doors for subsequent research to address these aspects crucial for holistic urban development.
Keywords: Rotterdam Zuid, transit-oriented development, carbon emissions, low-carbon design, cross-scale design, data-supported design
Procedia PDF Downloads 84
20955 Information Exchange Process Analysis between Authoring Design Tools and Lighting Simulation Tools
Authors: Rudan Xue, Annika Moscati, Rehel Zeleke Kebede, Peter Johansson
Abstract:
Successful building simulation and analysis inevitably require information exchange between multiple building information modeling (BIM) software tools. BIM information exchange based on IFC is widely used. However, Industry Foundation Classes (IFC) files are not always reliable, and information can get lost when using different software for modeling and simulations. In this research, interviews with lighting simulation experts and a case study provided by a company producing lighting devices were the research methods used to identify the necessary steps and data for successful information exchange between lighting simulation tools and authoring design tools. Model creation, information exchange, and model simulation have been identified as key aspects for the success of information exchange. The paper concludes with recommendations for improved information exchange and more reliable simulations that take all the needed parameters into consideration.
Keywords: BIM, data exchange, interoperability issues, lighting simulations
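One low-tech safeguard against silent information loss is to inspect the exported IFC file directly: IFC's STEP serialization is plain text, so entity types can be counted before a simulation run. A sketch (the fragment and its instances are hand-written for illustration, not taken from the case-study model):

```python
import re

# A trimmed, hand-written STEP fragment standing in for an exported IFC file.
ifc_text = """
#101=IFCLIGHTFIXTURE('2O2Fr$t4X7Zf8NOew3FL9r',$,'Pendant-01',$,$,$,$,$,.NOTDEFINED.);
#102=IFCLIGHTFIXTURE('1x$Gh2T9bAqPzQ3nVu0s5D',$,'Pendant-02',$,$,$,$,$,.NOTDEFINED.);
#103=IFCSPACE('0aF2jK8mN4pQ6rS1tU3vW5',$,'Office',$,$,$,$,$,.ELEMENT.,.INTERNAL.,$);
#104=IFCPROPERTYSINGLEVALUE('LuminousFlux',$,IFCREAL(1800.),$);
"""

# Pre-simulation sanity check: count entity instances so missing fixtures or
# properties are caught before the lighting tool silently drops them.
counts = {}
for name in re.findall(r"=\s*(IFC[A-Z0-9]+)\(", ifc_text):
    counts[name] = counts.get(name, 0) + 1
print(counts)
```

Comparing such counts before export from the authoring tool and after import into the simulation tool gives a quick, tool-independent signal of where data is lost.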
Procedia PDF Downloads 239
20954 Effect of Laser Ablation OTR Films and High Concentration Carbon Dioxide for Maintaining the Freshness of Strawberry ‘Maehyang’ for Export in Modified Atmosphere Condition
Authors: Hyuk Sung Yoon, In-Lee Choi, Min Jae Jeong, Jun Pill Baek, Ho-Min Kang
Abstract:
This study was conducted to improve the storability of strawberry 'Maehyang' for export, using suitable laser ablation oxygen transmission rate (OTR) films and assessing the effectiveness of high carbon dioxide. Strawberries were grown in a hydroponic system in Gyeongsangnam-do province and packed in different laser ablation OTR films (Daeryung Co., Ltd.): 1,300 cc, 20,000 cc, 40,000 cc, 80,000 cc, and 100,000 cc•m-2•day•atm. A CO2 injection (30%) treatment used the 20,000 cc•m-2•day•atm OTR film, and a perforated film served as the control. Temperature conditions simulated shipping and distribution from Korea to Singapore: storage at 3 ℃ (13 days), 10 ℃ (an hour), and 8 ℃ (7 days), for 20 days in total. The fresh weight loss rate stayed under 1%, the maximum permissible weight loss, in all treated OTR films except the perforated control during storage. The carbon dioxide concentration within the packages remained below the maximum tolerated CO2 concentration (15%) in the treated OTR films, and in the high-OTR films (20,000 cc to 100,000 cc) it stayed below 3%. The 1,300 cc film maintained a suitable carbon dioxide range (over 5% and under 15%) from 5 days after storage until the end of the experiment; the CO2 injection treatment dropped quickly to 15% after 1 day of storage but then stayed around 15% throughout. The oxygen concentration was maintained between 10 and 15% in the 1,300 cc and CO2 injection treatments, but stayed at 19 to 21% in the other treatments. The ethylene concentration was much higher in the CO2 injection treatment than in the OTR treatments; among the OTR treatments, 1,300 cc showed the highest concentration and 20,000 cc the lowest. Firmness was maintained best in 1,300 cc, with no significant differences among the other OTR treatments. Visual quality was best in 20,000 cc, which remained marketable through 20 days of storage.
The 20,000 cc and perforated films performed better than the other treatments with respect to off-odor, while the 1,300 cc and CO2 injection treatments developed strong off-odor that persisted even after 10 minutes. Based on the difference between Hunter 'L' and 'a' values measured with a chroma meter, the 1,300 cc and CO2 injection treatments delayed color development, while the other treatments showed no significant differences. The results indicate that freshness was best maintained at 20,000 cc•m-2•day•atm. Although the 1,300 cc and CO2 injection treatments produced an appropriate MA condition, they showed darkening of the strawberry calyx and excessive reduction of coloring due to the high carbon dioxide concentration during storage. While the 1,300 cc and CO2 injection treatments had been considered appropriate for export to Singapore, the results showed otherwise; these results reflect the cultivar characteristics of strawberry 'Maehyang'.
Keywords: carbon dioxide, firmness, shelf-life, visual quality
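The package-atmosphere behavior reported above can be illustrated with a toy gas-balance model: respiration adds CO2 to the headspace while the film vents it in proportion to its OTR class. All parameter values are illustrative assumptions, not the experiment's measurements:

```python
# Minimal gas-balance sketch of CO2 buildup inside a modified-atmosphere
# package; every parameter value below is an invented placeholder.
def co2_series(otr_cc_m2_day_atm, hours, resp_cc_per_h=0.5,
               film_area_m2=0.1, headspace_cc=500.0, c_atm=0.0004):
    """Hourly CO2 fraction: respiration adds CO2, the film vents it at a rate
    proportional to its permeance and the concentration difference."""
    c, series = c_atm, []
    for _ in range(hours):
        vent = otr_cc_m2_day_atm / 24.0 * film_area_m2 * (c - c_atm)  # cc/h out
        c += (resp_cc_per_h - vent) / headspace_cc
        series.append(c)
    return series

tight = co2_series(1300, 24 * 10)        # low-OTR film: CO2 accumulates
permeable = co2_series(100000, 24 * 10)  # high-OTR film: CO2 stays near ambient
print(round(tight[-1] * 100, 1), round(permeable[-1] * 100, 1))  # → 8.6 0.2
```

The qualitative pattern matches the study's observation: a low-permeance film settles into a mid-range CO2 plateau, while high-permeance films hold CO2 near ambient.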
Procedia PDF Downloads 399
20953 Climate Change and Sustainable Development among Agricultural Communities in Tanzania; An Analysis of Southern Highland Rural Communities
Authors: Paschal Arsein Mugabe
Abstract:
This paper examines sustainable development planning in the context of environmental concerns in rural areas of Tanzania. It challenges mainstream approaches to development, focusing instead upon transformative action for environmental justice. The goal is to help shape future sustainable development agendas in local government, international agencies and civil society organisations. Research methods: The approach of the study is geographical, but it also involves various trans-disciplinary elements, particularly from development studies, sociology and anthropology, management, geography, agriculture and environmental science. The research methods included thematic and questionnaire interviews, and participatory tools such as focus group discussions, participatory research appraisal and expert interviews for primary data. Secondary data were gathered through the analysis of land use/cover data and official documents on climate, agriculture, marketing and health. Several earlier studies made in the area also provided an important reference base. Findings: The findings show that agricultural sustainability in Tanzania appears likely to deteriorate as a consequence of climate change. Noteworthy differences in impacts across households are also present, both by district and by income category. Moreover, food security cannot be explained by climate as the only influencing factor; the combined economic, political and socio-cultural context of the community is crucial. In conclusion, people understand the relationship between climate change and their livelihoods.
Keywords: agriculture, climate change, environment, sustainable development
Procedia PDF Downloads 325
20952 Gravity and Magnetic Survey, Modeling and Interpretation in the Blötberget Iron-Oxide Mining Area of Central Sweden
Authors: Ezra Yehuwalashet, Alireza Malehmir
Abstract:
The Blötberget mining area in central Sweden, part of the Bergslagen mineral district, has been well known for various types of mineralization, particularly iron-oxide deposits, since the 1600s. To shed light on the host rock structures and on the depth extent and tonnage of the mineral deposits, and to support deep mineral exploration potential in the study area, new ground gravity data and existing aeromagnetic data (from the Geological Survey of Sweden) were used for interpretation and modelling. A major boundary separating a gravity low from a gravity high in the southern part of the study area is noticeable and likely represents a fault separating two different lithological units. The gravity data and modeling suggest a possible new target area in the southeast of the known mineralization, indicating an excess high-density region down to 800 m depth.
Keywords: gravity, magnetics, ore deposit, geophysics
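A first-pass interpretation of such an anomaly often uses the closed-form gravity response of a buried sphere; a sketch with invented body parameters (not the Blötberget model):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly_mgal(x_m, depth_m, radius_m, delta_rho):
    """Vertical gravity anomaly (mGal) of a buried sphere at horizontal offset
    x from its center -- the standard first-pass model for a compact dense
    body such as an iron-oxide lens."""
    mass = (4.0 / 3.0) * math.pi * radius_m ** 3 * delta_rho     # excess mass, kg
    gz = G * mass * depth_m / (x_m ** 2 + depth_m ** 2) ** 1.5   # m/s^2
    return gz * 1.0e5  # 1 mGal = 1e-5 m/s^2

# Hypothetical sphere: 150 m radius, 800 m depth, +2000 kg/m^3 density contrast.
profile = [round(sphere_anomaly_mgal(x, 800.0, 150.0, 2000.0), 3)
           for x in (-1000.0, 0.0, 1000.0)]
print(profile)  # → [0.072, 0.295, 0.072]: peak above the body, symmetric fall-off
```

The half-width of such a profile constrains the depth of the excess-density body, which is the kind of reasoning behind the 800 m depth estimate for the new target area.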
Procedia PDF Downloads 65