Search results for: panel data method
36661 Knowledge Graph Development to Connect Earth Metadata and Standard English Queries
Authors: Gabriel Montague, Max Vilgalys, Catherine H. Crawford, Jorge Ortiz, Dava Newman
Abstract:
There has never been so much publicly accessible atmospheric and environmental data. The possibilities of these data are exciting, but the sheer volume of available datasets represents a new challenge for researchers. The task of identifying and working with a new dataset has become more difficult with the amount and variety of available data. Datasets are often documented in ways that differ substantially from the common English used to describe the same topics. This presents a barrier not only for new scientists, but for researchers looking to find comparisons across multiple datasets or specialists from other disciplines hoping to collaborate. This paper proposes a method for addressing this obstacle: creating a knowledge graph to bridge the gap between everyday English language and the technical language surrounding these datasets. Knowledge graph generation is already a well-established field, although there are some unique challenges posed by working with Earth data. One is the sheer size of the databases – it would be infeasible to replicate or analyze all the data stored by an organization like the National Aeronautics and Space Administration (NASA) or the European Space Agency. Instead, this approach identifies topics from metadata available for datasets in NASA’s Earthdata database, which can then be used to directly request and access the raw data from NASA. By starting with a single metadata standard, this paper establishes an approach that can be generalized to different databases, but leaves the challenge of metadata harmonization for future work. Topics generated from the metadata are then linked to topics from a collection of English queries through a variety of standard and custom natural language processing (NLP) methods. The results from this method are then compared to a baseline of Elasticsearch applied to the metadata.
This comparison shows the benefits of the proposed knowledge graph system over existing methods, particularly in interpreting natural language queries and in extracting topics from metadata. For the research community, this work introduces an application of NLP to the ecological and environmental sciences, expanding the possibilities of how machine learning can be applied in this discipline. But perhaps more importantly, it establishes the foundation for a platform that can enable common English to access knowledge that previously required considerable effort and experience. By making these public data accessible to the general public, this work has the potential to transform environmental understanding, engagement, and action.
Keywords: earth metadata, knowledge graphs, natural language processing, question-answer systems
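A minimal sketch of the query-to-metadata topic matching described above: TF-IDF weighting with cosine similarity links a plain-English query to the closest metadata description. The three metadata snippets, the whitespace tokeniser, and the scoring are illustrative assumptions, not the paper's actual knowledge graph pipeline.

```python
import math
from collections import Counter

def build_idf(docs):
    """Inverse document frequency over tokenised metadata descriptions."""
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    n = len(docs)
    return {t: math.log(n / df[t]) for t in df}

def vectorise(tokens, idf):
    """TF-IDF vector as a sparse dict; tokens unseen in the corpus are dropped."""
    tf = Counter(tokens)
    return {t: tf[t] * idf[t] for t in tf if t in idf}

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def best_match(query_tokens, docs):
    """Index of the metadata description closest to the query."""
    idf = build_idf(docs)
    qvec = vectorise(query_tokens, idf)
    sims = [cosine(qvec, vectorise(d, idf)) for d in docs]
    return max(range(len(docs)), key=sims.__getitem__)
```

A query such as "ocean surface temperature" then resolves to a sea-surface-temperature metadata record even though the word "ocean" never appears in it, because the shared high-IDF terms dominate the similarity.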
Procedia PDF Downloads 148
36660 State Estimation Method Based on Unscented Kalman Filter for Vehicle Nonlinear Dynamics
Authors: Wataru Nakamura, Tomoaki Hashimoto, Liang-Kuang Chen
Abstract:
This paper provides a state estimation method for automatic control systems of nonlinear vehicle dynamics. A nonlinear tire model is employed to represent the realistic behavior of a vehicle. In general, the state variables of control systems are not precisely known, because those variables are observed through output sensors and only a limited subset of them may be measurable. Hence, automatic control systems must incorporate some type of state estimation, and a state estimation method is needed for nonlinear vehicle dynamics with restricted measurable state variables. For this purpose, the unscented Kalman filter is applied in this study to estimate the state variables of nonlinear vehicle dynamics. The effectiveness of the proposed method is verified by numerical simulations.
Keywords: state estimation, control systems, observer systems, nonlinear systems
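One predict/update cycle of the unscented Kalman filter can be sketched for a scalar state; the toy contraction dynamics, the noise covariances Q and R, and the direct state measurement in the usage below are illustrative assumptions, not the vehicle/tire model used in the paper.

```python
import math

def ukf_step(x, P, z, f, h, Q, R, alpha=0.1, beta=2.0, kappa=0.0):
    """One unscented Kalman filter cycle for a scalar state x with
    variance P, measurement z, dynamics f and measurement function h."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    c = n + lam
    wm = [lam / c, 0.5 / c, 0.5 / c]                        # mean weights
    wc = [lam / c + 1 - alpha ** 2 + beta, 0.5 / c, 0.5 / c]  # covariance weights
    # predict: propagate sigma points through the dynamics
    s = math.sqrt(c * P)
    fs = [f(x), f(x + s), f(x - s)]
    x_pred = sum(w * v for w, v in zip(wm, fs))
    P_pred = Q + sum(w * (v - x_pred) ** 2 for w, v in zip(wc, fs))
    # update: redraw sigma points and fold in the measurement
    s = math.sqrt(c * P_pred)
    sig = [x_pred, x_pred + s, x_pred - s]
    hs = [h(v) for v in sig]
    z_pred = sum(w * v for w, v in zip(wm, hs))
    S = R + sum(w * (v - z_pred) ** 2 for w, v in zip(wc, hs))
    Pxz = sum(w * (a - x_pred) * (b - z_pred) for w, a, b in zip(wc, sig, hs))
    K = Pxz / S                                             # Kalman gain
    return x_pred + K * (z - z_pred), P_pred - K * S * K
```

Fed a stream of measurements, the estimate converges toward the true state even when initialised far away.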
Procedia PDF Downloads 135
36659 Nostalgic Tourism in Macau: The Bidirectional Causal Relationship between Destination Image and Experiential Value
Authors: Aliana Leong, T. C. Huan
Abstract:
Nostalgia-themed tourism products are becoming popular in many countries. This study investigates the role of nostalgia in destination image and experiential value, and their effect on subsequent behavioral intention. The survey used a stratified sampling method to include respondents from all the nearby Asian regions, based on the inbound tourist data provided by the Statistics and Census Service (DSEC) of the government of Macau. The questionnaire consisted of five sections of 5-point Likert scale questions: (1) nostalgia, (2) destination image both before and after the experience, (3) expected value, (4) experiential value, and (5) future visit intention. Data were analysed with structural equation modelling. The result indicates that nostalgia plays an important part in forming destination image and experiential value before the individual has had a chance to experience the destination. Destination image and experiential value share a bidirectional causal relationship that eventually contributes to future visit intention. The study also discovered that while experiential value is more effective in generating destination image, the latter contributes more to future visit intention. The research design measures destination image and experiential value both before and after respondents have experienced the destination. The distinction between destination image and expected/experiential value can be examined because of the longitudinal design of the research method, which also allows this study to observe how nostalgia translates into future visit intention.
Keywords: nostalgia, destination image, experiential value, future visit intention
Procedia PDF Downloads 390
36658 Black-Box-Based Generic Perturbation Generation Method under Salient Graphs
Authors: Dingyang Hu, Dan Liu
Abstract:
DNN (Deep Neural Network) deep learning models are widely used in classification, prediction, and other task scenarios. To address the difficulties of generic adversarial perturbation generation for deep learning models under black-box conditions, a generic adversarial perturbation generation method based on a saliency map (CJsp) is proposed, which obtains salient image regions by counting the factors through which an image's input features influence the output results. The method can be understood as a saliency map attack algorithm that produces misclassification by reducing the weights of salient feature points. Experiments also demonstrate that the method achieves a high transfer attack success rate and is suitable for batch adversarial sample generation.
Keywords: adversarial sample, gradient, probability, black box
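The saliency-driven idea can be illustrated against a black-box scoring function: per-feature saliency is estimated by finite differences (no internal gradients needed), and the most salient features are then perturbed to lower the score. The logistic toy model and the one-step perturbation are invented for illustration; the paper's CJsp method operates on full images.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def saliency(model, x, eps=1e-4):
    """Finite-difference estimate of each input feature's influence on a
    black-box score (no access to internal gradients)."""
    base = model(x)
    grads = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += eps
        grads.append((model(xp) - base) / eps)
    return grads

def saliency_attack(model, x, k=2, step=1.0):
    """Perturb the k most salient features against the score, a toy
    analogue of reducing the weights of salient feature points."""
    s = saliency(model, x)
    order = sorted(range(len(x)), key=lambda i: abs(s[i]), reverse=True)
    adv = list(x)
    for i in order[:k]:
        adv[i] -= step if s[i] > 0 else -step
    return adv
```

On a linear-logistic toy model the attack reliably drives the score down by targeting the two highest-weight features first.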
Procedia PDF Downloads 104
36657 Hardware Implementation and Real-Time Experimental Validation of a Direction of Arrival Estimation Algorithm
Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh
Abstract:
This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple noncoherent RF sources with a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods such as the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. Its effectiveness is validated through both software simulations and practical experimentation on a prototype testbed constructed using a software-defined radio (SDR) platform and GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
Keywords: DOA estimation, real-time validation, software defined radio, computational complexity, Capon's method, GNU Radio
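The Capon (MVDR) spatial spectrum underlying such estimators is P(θ) = 1 / (aᴴ(θ) R⁻¹ a(θ)). A numpy sketch for a ULA follows; it uses a plain matrix inverse with light diagonal loading rather than the LU-based reduction described in the paper, and the 8-sensor, two-source scenario in the usage is simulated for illustration.

```python
import numpy as np

def capon_spectrum(X, grid_deg, d=0.5):
    """Capon (MVDR) spatial spectrum P(theta) = 1 / (a^H R^-1 a) for a
    uniform linear array. X: (sensors x snapshots) complex baseband data,
    d: element spacing in wavelengths."""
    M, N = X.shape
    R = X @ X.conj().T / N                      # sample covariance matrix
    # light diagonal loading keeps the inverse well conditioned; the paper
    # instead factors R (LU decomposition) to cut computational cost
    Rinv = np.linalg.inv(R + 1e-6 * np.eye(M))
    m = np.arange(M)
    spec = []
    for th in np.deg2rad(grid_deg):
        a = np.exp(-2j * np.pi * d * m * np.sin(th))   # steering vector
        spec.append(1.0 / np.real(a.conj() @ Rinv @ a))
    return np.array(spec)
```

Scanning a 1° grid over simulated data with sources at −20° and 10° yields sharp peaks at both true bearings, well above the spectrum floor at off-source angles.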
Procedia PDF Downloads 75
36656 On Constructing a Cubically Convergent Numerical Method for Multiple Roots
Authors: Young Hee Geum
Abstract:
We propose the numerical method defined by x_(n+1) = x_n − λ f(x_n − μ h(x_n)) / f'(x_n), n ∈ N, and determine the control parameters λ and μ so that the iteration converges cubically. In addition, we derive the asymptotic error constant. Applying the proposed scheme to various test functions, the numerical results show good agreement with the theory developed in this paper; the computations were carried out and verified in Mathematica with its high-precision arithmetic.
Keywords: asymptotic error constant, iterative method, multiple root, root-finding
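The parameters λ and μ of the scheme above are derived in the paper itself; as a related, simpler illustration of the multiple-root problem, the classical modified Newton iteration x_(n+1) = x_n − m f(x_n)/f'(x_n) restores fast convergence at a root of known multiplicity m, while plain Newton (m = 1) slows to linear convergence.

```python
def modified_newton(f, fprime, x, m, tol=1e-12, max_iter=100):
    """Newton-type iteration x -> x - m*f(x)/f'(x); with m equal to the
    root multiplicity the method regains fast convergence."""
    for k in range(1, max_iter + 1):
        fx = f(x)
        if fx == 0:
            return x, k
        step = m * fx / fprime(x)
        x -= step
        if abs(step) < tol:
            return x, k
    return x, max_iter
```

For f(x) = (x − 1)²(x + 2), the double root at x = 1 is reached in a handful of iterations with m = 2, versus dozens with m = 1 from the same starting point.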
Procedia PDF Downloads 220
36655 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh
Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila
Abstract:
Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is an organizational and architectural decentralized approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, with data mesh at its core, can serve as tools for increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata, thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable.
By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance through better insights delivered by improved reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, like data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by experience feedback from AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
Keywords: data culture, data-driven organization, data mesh, data quality for business success
Procedia PDF Downloads 135
36654 Secure Data Sharing of Electronic Health Records With Blockchain
Authors: Kenneth Harper
Abstract:
The secure sharing of Electronic Health Records (EHRs) is a critical challenge in modern healthcare, demanding solutions to enhance interoperability, privacy, and data integrity. Traditional standards like Health Information Exchange (HIE) and HL7 have made significant strides in facilitating data exchange between healthcare entities. However, these approaches rely on centralized architectures that are often vulnerable to data breaches, lack sufficient privacy measures, and have scalability issues. This paper proposes a framework for secure, decentralized sharing of EHRs using blockchain technology, cryptographic tokens, and Non-Fungible Tokens (NFTs). The blockchain's immutable ledger, decentralized control, and inherent security mechanisms are leveraged to improve transparency, accountability, and auditability in healthcare data exchanges. Furthermore, we introduce the concept of tokenizing patient data through NFTs, creating unique digital identifiers for each record, which allows for granular data access controls and proof of data ownership. These NFTs can also be employed to grant access to authorized parties, establishing a secure and transparent data sharing model that empowers both healthcare providers and patients. The proposed approach addresses common privacy concerns by employing privacy-preserving techniques such as zero-knowledge proofs (ZKPs) and homomorphic encryption to ensure that sensitive patient information can be shared without exposing the actual content of the data. This ensures compliance with regulations like HIPAA and GDPR. Additionally, the integration of Fast Healthcare Interoperability Resources (FHIR) with blockchain technology allows for enhanced interoperability, enabling healthcare organizations to exchange data seamlessly and securely across various systems while maintaining data governance and regulatory compliance. 
Through real-world case studies and simulations, this paper demonstrates how blockchain-based EHR sharing can reduce operational costs, improve patient outcomes, and enhance the security and privacy of healthcare data. This decentralized framework holds great potential for revolutionizing healthcare information exchange, providing a transparent, scalable, and secure method for managing patient data in a highly regulated environment.
Keywords: blockchain, electronic health records (EHRs), fast healthcare interoperability resources (FHIR), health information exchange (HIE), HL7, interoperability, non-fungible tokens (NFTs), privacy-preserving techniques, tokens, secure data sharing
Procedia PDF Downloads 21
36653 Small Micro and Medium Enterprises Perception-Based Framework to Access Financial Support
Authors: Melvin Mothoa
Abstract:
Small, micro, and medium enterprises are very significant for the development of market economies. They are the main creators of new jobs, and they form a vital core of the market economy in countries across the globe. Access to finance is identified as crucial for small, micro, and medium-sized enterprises for their growth and innovation. This paper proposes a perception-based SMME framework to aid access to financial support, and addresses issues that impede SMMEs in South Africa from obtaining finance from financial institutions. The framework will be tested against data collected from 200 Small Micro & Medium Enterprises in the Gauteng province of South Africa. The study adopts a quantitative method, and the delivery of self-administered questionnaires to SMMEs will be the primary data collection tool. Structural equation modeling will be used to further analyse the data collected.
Keywords: finance, small business, growth, development
Procedia PDF Downloads 111
36652 Comparison of Wet and Microwave Digestion Methods for the Al, Cu, Fe, Mn, Ni, Pb and Zn Determination in Some Honey Samples by ICP-OES in Turkey
Authors: Huseyin Altundag, Emel Bina, Esra Altıntıg
Abstract:
The aim of this study is to determine the amounts of Al, Cu, Fe, Mn, Ni, Pb and Zn in honey samples gathered from the Sakarya and Istanbul regions of Turkey. The sample preparation phase was performed via a wet decomposition method and a microwave digestion system. The accuracy of the methods was verified against the standard reference materials Tea Leaves (INCT-TL-1) and NIST SRM 1515 Apple Leaves. The obtained data have been compared with literature values, and possible sources of contamination of the honey samples are discussed. The results will be presented at ICCIS 2015: XIII International Conference on Chemical Industry and Science.
Keywords: wet decomposition, microwave digestion, trace element, honey, ICP-OES
Procedia PDF Downloads 462
36651 An Investigation on the Sandwich Panels with Flexible and Toughened Adhesives under Flexural Loading
Authors: Emre Kara, Şura Karakuzu, Ahmet Fatih Geylan, Metehan Demir, Kadir Koç, Halil Aykul
Abstract:
Material selection in the design of sandwich structures is a crucial aspect because of the positive or negative influence of the base materials on the mechanical properties of the entire panel. The literature shows that the selection of the skin and core materials plays a very important role in the behavior of the sandwich. Besides this, the use of the correct adhesive can make the whole structure show better mechanical results and behavior. The sandwich structures realized in this study were obtained by combining an aluminum foam core and three different glass fiber reinforced polymer (GFRP) skins using two different commercial adhesives, based on flexible polyurethane and toughened epoxy, respectively. Static and dynamic tests had already been applied to sandwiches with different types of adhesives. In the present work, static three-point bending tests were performed on sandwiches having an aluminum foam core with a thickness of 15 mm, skins with three different types of fabrics ([0°/90°] cross-ply E-Glass Biaxial stitched, [0°/90°] cross-ply E-Glass Woven and [0°/90°] cross-ply S-Glass Woven, all with the same thickness of 1.75 mm) and two different commercial adhesives (flexible polyurethane and toughened epoxy based) at different support span distances (L = 55, 70, 80, 125 mm), with the aim of analysing their flexural performance. The skins used in the study were produced via the Vacuum Assisted Resin Transfer Molding (VARTM) technique and were easily bonded onto the aluminum foam core with the flexible and toughened adhesives under very low pressure in a press machine, using alignment tabs matching the total thickness of the whole panel. The main results of the flexural loading are: force-displacement curves obtained after the bending tests, peak force values, absorbed energy, collapse mechanisms, adhesion quality, and the effect of the support span length and adhesive type.
The experimental results showed that the sandwiches with the epoxy-based toughened adhesive and skins made of S-Glass Woven fabric exhibited the best adhesion quality and mechanical properties. The sandwiches with toughened adhesive exhibited higher peak force and energy absorption values compared to the sandwiches with flexible adhesive. The core shear mode occurred in the sandwiches with flexible polyurethane based adhesive through the thickness of the core, while the same mode took place in the sandwiches with toughened epoxy based adhesive along the length of the core. The use of these sandwich structures can lead to a weight reduction of transport vehicles while providing adequate structural strength under operating conditions.
Keywords: adhesive and adhesion, aluminum foam, bending, collapse mechanisms
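The span-length effect reported above can be rationalised with classical thin-face sandwich beam theory: in three-point bending the face stress grows with the support span while the core shear stress does not, which is why longer spans favour face-dominated failure and shorter spans favour core shear. A sketch follows; the load P = 2000 N and width b = 30 mm are assumed values for illustration, not the paper's data, and the formulas are the standard thin-face approximations.

```python
def sandwich_bending_stresses(P, L, b, c, tf):
    """Thin-face sandwich beam approximations for three-point bending:
    the faces carry the bending moment, the core carries the shear.
    P: load (N), L: support span (mm), b: width (mm),
    c: core thickness (mm), tf: face-sheet thickness (mm)."""
    d = c + tf                                 # distance between face centroids
    sigma_face = P * L / (4.0 * tf * d * b)    # peak face stress (MPa)
    tau_core = P / (2.0 * d * b)               # core shear stress (MPa)
    return sigma_face, tau_core
```

Evaluating the same assumed load at L = 55 mm and L = 125 mm shows the face stress rising with span while the core shear stress stays constant, consistent with the span-dependent failure behaviour discussed above.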
Procedia PDF Downloads 328
36650 A Delphi Study to Build Consensus for a Tuberculosis Control Guideline to Achieve the WHO End TB 2035 Strategy
Authors: Pui Hong Chung, Cyrus Leung, Jun Li, Kin On Kwok, Ek Yeoh
Abstract:
Introduction: Studies of TB control in intermediate tuberculosis burden countries (IBCs) comprise a relatively small proportion of the TB control literature compared with the effort devoted to their high- and low-burden counterparts. There is currently a lack of consensus on the optimal measures and strategies for combating TB in IBCs; guidelines for TB control are inadequate and thus pose a great obstacle to eliminating TB in these countries. To fill in this research and services gap, we need to summarize the findings of the effort in this regard and to seek consensus in terms of policy making for TB control; we have devised a series of scoping and Delphi studies for these purposes. Method: The scoping and Delphi studies are conducted in parallel to feed information to each other. Before the Delphi iterations, we invited three local experts in TB control in Hong Kong to participate in the pre-assessment round of the Delphi study to comment on the validity, relevance, and clarity of the Delphi questionnaire. Result: Two scoping studies, regarding LTBI control in health care workers in IBCs and TB control in the elderly of IBCs respectively, have been conducted. The results of these two studies were used as the foundation for developing the Delphi questionnaire, which taps on seven areas of questions, namely: characteristics of IBCs, adequacy of research and services in LTBI control in IBCs, importance and feasibility of interventions for TB control and prevention in hospital, screening and treatment of LTBI in the community, reasons for refusal of/default from LTBI treatment, medical adherence to LTBI treatment, and importance and feasibility of interventions for TB control and prevention in the elderly in IBCs.
The local experts also commented on the two scoping studies conducted, thus acting as the sixth phase of expert consultation in the Arksey and O’Malley framework of scoping studies, either to nourish the scope and strategies used in these studies or to supplement ideas for further scoping or systematic review studies. In the subsequent stage, an international expert panel, comprising 15 to 20 experts from IBCs in the Western Pacific Region, will be recruited to join two rounds of anonymous Delphi iterations. Four categories of TB control experts, namely clinicians, policy makers, microbiologists/laboratory personnel, and public health clinicians, will be our target groups. A consensus level of 80% is used to determine the achievement of consensus on particular issues. Key messages: 1. Scoping review and the Delphi method are useful to identify gaps and then achieve consensus in research. 2. Many resources are currently devoted to the high-burden countries; however, the usually neglected intermediate-burden countries are an indispensable part of achieving the ambitious WHO End TB 2035 target.
Keywords: Delphi questionnaire, tuberculosis, WHO, latent TB infection
Procedia PDF Downloads 301
36649 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel
Authors: F. M. Pisano, M. Ciminello
Abstract:
Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies in terms of maintenance cost reduction and operational life improvement, as well as for safety purposes. The challenge is to detect so-called “barely visible impact damage” (BVID), due to low/medium energy impacts, which can progressively compromise the structure's integrity. The occurrence of any local change in material properties that can degrade the structure's performance is to be monitored using so-called Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM looks for any “anomalous” response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables, and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health conditions of the structure under investigation. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system by the integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structure health conditions examined by a Principal Component Analysis based algorithm.
Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics
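A PCA-based damage index of the kind mentioned above can be sketched as follows: a principal subspace is fitted to baseline (healthy) sensor snapshots, and the norm of a new snapshot's residual outside that subspace flags anomalies. The four-channel synthetic data in the usage is invented for illustration; the actual tool works on optical-fibre measurements.

```python
import numpy as np

def fit_pca(X, k):
    """Fit a k-component principal subspace to baseline (healthy) snapshots."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def damage_index(x, mu, comps):
    """Norm of the residual of x outside the healthy subspace; large values
    indicate an anomalous sensor pattern."""
    r = (x - mu) - comps.T @ (comps @ (x - mu))
    return float(np.linalg.norm(r))
```

A snapshot whose channels break the correlation pattern learned from the baseline produces a much larger index than a healthy one, which is exactly the kind of outlier signal a dashboard would surface.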
Procedia PDF Downloads 124
36648 Selection of Suitable Reference Genes for Assessing Endurance Related Traits in a Native Pony Breed of Zanskar at High Altitude
Authors: Prince Vivek, Vijay K. Bharti, Manishi Mukesh, Ankita Sharma, Om Prakash Chaurasia, Bhuvnesh Kumar
Abstract:
High endurance performance in equids requires adaptive changes involving physio-biochemical and molecular responses in an attempt to regain homeostasis. We hypothesized that suitable reference genes might be identified for assessing endurance-related traits in ponies at high altitude and for evaluating individuals with potent endurance traits. A total of 12 mares of the Zanskar pony breed were divided into three groups, group-A (without load), group-B (60 kg backpack load) and group-C (80 kg backpack load), and subjected to a load carry protocol on a steep 4 km uphill climb over a gravel, uneven, rocky surface track at an altitude of 3292 m to 3500 m (endpoint). Blood was collected before and immediately after the load carry into sodium heparin anticoagulant, and peripheral blood mononuclear cells were separated for total RNA isolation and subsequent cDNA synthesis. Real-time PCR reactions were carried out to evaluate the mRNA expression profiles of a panel of putative internal control genes (ICGs) belonging to different functional classes, namely glyceraldehyde 3-phosphate dehydrogenase (GAPDH), β₂ microglobulin (β₂M), β-actin (ACTB), ribosomal protein S18 (RS18), hypoxanthine-guanine phosphoribosyltransferase (HPRT), ubiquitin B (UBB), ribosomal protein L32 (RPL32), transferrin receptor protein (TFRC) and succinate dehydrogenase complex subunit A (SDHA), for normalizing the real-time quantitative polymerase chain reaction (qPCR) data of native ponies. Three different algorithms, geNorm, NormFinder, and BestKeeper software, were used to evaluate the stability of the reference genes. The results showed that GAPDH was the most stable gene and that the best combination of two genes was TFRC and β₂M.
In conclusion, the geometric mean of GAPDH, TFRC and β₂M might be used for accurate normalization of transcriptional data for assessing endurance related traits in Zanskar ponies during load carrying.
Keywords: endurance exercise, ubiquitin B (UBB), β₂ microglobulin (β₂M), high altitude, Zanskar ponies, reference gene
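The geNorm stability value M used in the study can be computed directly: for each candidate gene it is the mean, over all other candidates, of the standard deviation of the log2 expression ratios across samples, so a gene whose ratios to the others fluctuate most gets the highest (worst) M. The expression values in the usage are invented for illustration, not the study's qPCR data.

```python
import math
from statistics import pstdev

def genorm_m(expr):
    """geNorm stability value M per candidate reference gene.
    expr: dict gene -> list of expression values across samples
    (linear scale). Lower M means a more stable reference gene."""
    m = {}
    for g in expr:
        sds = []
        for k in expr:
            if k == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[k])]
            sds.append(pstdev(ratios))
        m[g] = sum(sds) / len(sds)
    return m
```

Genes that co-vary across samples keep near-constant ratios (small M), while an erratically expressed candidate stands out with a large M and would be excluded as a reference gene.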
Procedia PDF Downloads 131
36647 The Relation between Learning Styles and English Achievement in the Language Training Centre
Authors: Nurul Yusnita
Abstract:
Many studies have been developed to help students achieve well in English learning, ranging from teaching methods to psychological approaches. One psychological construct in educational research is learning style, which can in some ways affect student achievement. This study aimed to examine four learning styles and their relations to English achievement among students learning English in the Language Training Center of Universitas Muhammadiyah Yogyakarta (LTC UMY). The method of this study was descriptive analytical. The sample consisted of 39 Accounting students in LTC UMY. The data was collected through questionnaires with Likert scales, and achievement was obtained from the students' grades. To analyze the questionnaires and to examine the relation between learning styles and student achievement, correlational analysis in the SPSS statistical software was used. The results showed that the visual and auditory styles had the same share of 35.9% (14 students each), 3 students (7.7%) had a kinaesthetic learning style, and 8 students (20.5%) had a combined visual and auditory style. Meanwhile, 5 students (12.8%) with a visual learning style increased their grades, and only 1 student (2.5%) with a combined visual and auditory style improved his grade. Besides grade increases, there were also grade decreases: among students with visual, auditory, combined visual and auditory, and kinaesthetic learning styles, grades decreased for 3 students (7.7%), 5 students (12%), 4 students (10.2%) and 1 student (2.5%), respectively. In conclusion, there was no significant relationship between learning style and English achievement. Most of the good achievers were students with visual and auditory learning styles, and most of them preferred visual methods.
The implication is that teachers and material designers could improve their methods through visual materials to achieve effective English teaching and learning.
Keywords: accounting students, English achievement, language training centre, learning styles
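The correlational analysis reported above boils down to the Pearson coefficient between learning-style scores and grades; a plain-Python sketch follows (the SPSS run in the study also reports statistical significance, which is omitted here, and the sample values in the usage are invented).

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two
    equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A coefficient near zero, as found in the study, indicates no linear relationship between the style score and the grade.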
Procedia PDF Downloads 271
36646 Application of a SubIval Numerical Solver for Fractional Circuits
Authors: Marcin Sowa
Abstract:
The paper discusses the subinterval-based numerical method for fractional derivative computations, now referred to by its acronym, SubIval. The basis of the method is briefly recalled, its applicability in time-stepping solvers is discussed, and the possibility of implementing a solver with an adaptive time step size is also mentioned. The solver is tested on a transient circuit example. In order to assess the accuracy of the solver, the results have been compared with those obtained by means of a semi-analytical method called gcdAlpha. The adaptive-time-step solver applying SubIval has proven to be very accurate, as the results are very close to the reference solution. The solver is currently able to solve FDEs (fractional differential equations) with a different derivative order in each equation and any type of source time function.
Keywords: numerical method, SubIval, fractional calculus, numerical solver, circuit analysis
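SubIval's underlying task, numerically evaluating a fractional derivative, can be illustrated with the standard Grünwald–Letnikov approximation (a different method from SubIval itself, shown only for context). For f(t) = t and α = 0.5 the exact half-derivative is 2√(t/π), which the discrete sum approaches as the step size shrinks.

```python
import math

def gl_fractional_derivative(f, t, alpha, h):
    """Gruenwald-Letnikov approximation of the order-alpha derivative of f
    at t, using the binomial weight recurrence w_j = w_{j-1}*(1-(alpha+1)/j)."""
    n = int(round(t / h))
    w, acc = 1.0, f(t)
    for j in range(1, n + 1):
        w *= 1.0 - (alpha + 1.0) / j
        acc += w * f(t - j * h)
    return acc / h ** alpha
```

Note the whole history of f contributes to the sum, which is the memory effect that makes fractional circuit simulation expensive and motivates specialised solvers such as SubIval.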
Procedia PDF Downloads 205
36645 On a Continuous Formulation of Block Method for Solving First Order Ordinary Differential Equations (ODEs)
Authors: A. M. Sagir
Abstract:
The aim of this paper is to investigate the performance of the developed linear multistep block method for solving first order initial value problems of Ordinary Differential Equations (ODEs). The method calculates the numerical solution at three points simultaneously and produces three new equally spaced solution values within a block. The continuous formulation enables us to differentiate and evaluate at selected points to obtain three discrete schemes, which were used in block form for parallel or sequential solution of the problems. The stability and efficiency of the block method are tested on ordinary differential equations drawn from practical applications, and the results obtained compare favorably with the exact solutions. Furthermore, an error analysis comparison has been carried out with the help of computer software.
Keywords: block method, first order ordinary differential equations, linear multistep, self-starting
Procedia PDF Downloads 306
36644 A Hybrid Model of Goal, Integer and Constraint Programming for Single Machine Scheduling Problem with Sequence Dependent Setup Times: A Case Study in Aerospace Industry
Authors: Didem Can
Abstract:
Scheduling problems are among the most fundamental issues of production systems. Many different approaches and models have been developed according to the production processes of the parts and the main purpose of the problem. In this study, one of the bottleneck stations of a company serving the aerospace industry is analyzed and treated as a single machine scheduling problem with sequence-dependent setup times. The objective is to assign a large number of similar parts to the same shift, to reduce chemical waste, while minimizing the number of tardy jobs. The goal programming method is used to pursue these two objectives simultaneously. The assignment of parts to shifts is expressed using the integer programming method. Finally, the constraint programming method is used, as it can find a result quickly by pruning inferior feasible solutions over the defined variable set. The resulting model is tested and evaluated with real data in the application part.
Keywords: constraint programming, goal programming, integer programming, sequence-dependent setup, single machine scheduling
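The hybrid goal/integer/constraint model itself is not given in the abstract. The sketch below only illustrates the underlying objective (minimizing tardy jobs on one machine with sequence-dependent setups) by brute-force enumeration on a tiny instance with made-up data; a real instance would need a solver, not enumeration:

```python
# Toy single-machine instance: 4 jobs, hypothetical processing times,
# due dates and a sequence-dependent setup matrix (all values invented).
from itertools import permutations

proc = [4, 3, 6, 2]                      # processing times
due = [8, 6, 20, 10]                     # due dates
setup = [[0, 2, 1, 3],                   # setup[i][j]: setup when j follows i
         [2, 0, 2, 1],
         [1, 2, 0, 2],
         [3, 1, 2, 0]]

def tardy_count(seq):
    """Number of jobs finishing after their due date in this sequence."""
    t, prev, tardy = 0, None, 0
    for j in seq:
        if prev is not None:
            t += setup[prev][j]
        t += proc[j]
        tardy += t > due[j]
        prev = j
    return tardy

best = min(permutations(range(4)), key=tardy_count)
```

For this instance no sequence gets every job done on time (jobs 0 and 1 cannot both meet their due dates), so the best sequences leave exactly one tardy job; constraint programming prunes such dominated sequences instead of enumerating all of them.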
Procedia PDF Downloads 237
36643 On the Approximate Solution of Continuous Coefficients for Solving Third Order Ordinary Differential Equations
Authors: A. M. Sagir
Abstract:
This paper derives four new schemes, which are combined to form an accurate and efficient block method for the parallel or sequential solution of third order ordinary differential equations of the form y''' = f(x, y, y', y''), y(α) = y_0, y'(α) = β, y''(α) = μ, with associated initial or boundary conditions. The implementation strategies of the derived method show that the block method is consistent and zero stable, and hence convergent. The derived schemes were tested on stiff and non-stiff ordinary differential equations, and the numerical results obtained compare favorably with the exact solutions.
Keywords: block method, hybrid, linear multistep, self-starting, third order ordinary differential equations
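The block schemes themselves are not reproduced in the abstract. As a baseline against which such methods are usually compared, any third order ODE y''' = f(x, y, y', y'') can be reduced to a first-order system u = (y, y', y'') and advanced with a standard one-step method; a classical RK4 sketch:

```python
# Reduce y''' = f(x, y, y', y'') to u' = F(x, u), u = (y, y', y''),
# and advance with classical fourth-order Runge-Kutta (not the paper's
# block method; a conventional baseline only).
import math

def rk4_step(F, x, u, h):
    """One RK4 step for u' = F(x, u), with u a list of state components."""
    def shifted(a, b, s):               # componentwise a + s*b
        return [ai + s * bi for ai, bi in zip(a, b)]
    k1 = F(x, u)
    k2 = F(x + h / 2, shifted(u, k1, h / 2))
    k3 = F(x + h / 2, shifted(u, k2, h / 2))
    k4 = F(x + h, shifted(u, k3, h))
    return [ui + h / 6 * (a + 2 * b + 2 * c + d)
            for ui, a, b, c, d in zip(u, k1, k2, k3, k4)]

# Test: y''' = -y' with y(0)=0, y'(0)=1, y''(0)=0 has exact solution y = sin(x).
F = lambda x, u: [u[1], u[2], -u[1]]
x, u, h = 0.0, [0.0, 1.0, 0.0], 0.01
for _ in range(100):
    u = rk4_step(F, x, u, h)
    x += h
error = abs(u[0] - math.sin(1.0))
```

A block method instead integrates the third order equation directly, avoiding the reduction to a first-order system.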
Procedia PDF Downloads 271
36642 Big Data Analysis with RHadoop
Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim
Abstract:
It is almost impossible to store or analyze big data, which grows exponentially, with traditional technologies; Hadoop is a technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. Using RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis on actual datasets of different sizes. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop setup with the lm function and with the biglm package based on bigmemory. The results showed that our RHadoop approach was faster than the other packages, owing to parallel processing that increases the number of map tasks as the data size grows.
Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop
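The map-reduce idea behind distributed regression is that each map task computes sufficient statistics for its chunk and a single reduce combines them. A plain-Python sketch (not RHadoop, and simple y = a + b*x rather than the paper's multiple regression) of that pattern:

```python
# Each "map" task summarizes its chunk of (x, y) pairs; the "reduce" step
# sums the statistics and solves the normal equations for y = a + b*x.
def map_stats(chunk):
    n = len(chunk)
    sx = sum(x for x, _ in chunk)
    sy = sum(y for _, y in chunk)
    sxx = sum(x * x for x, _ in chunk)
    sxy = sum(x * y for x, y in chunk)
    return (n, sx, sy, sxx, sxy)

def reduce_fit(stats_list):
    n, sx, sy, sxx, sxy = (sum(s[i] for s in stats_list) for i in range(5))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    a = (sy - b * sx) / n                           # intercept
    return a, b

# Data drawn from y = 2x + 1, split across two hypothetical "nodes".
chunks = [[(0, 1), (1, 3), (2, 5)], [(3, 7), (4, 9)]]
a, b = reduce_fit([map_stats(c) for c in chunks])
```

Because the sufficient statistics are tiny regardless of chunk size, only they cross the network, which is why adding data nodes speeds the computation up.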
Procedia PDF Downloads 437
36641 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network
Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan
Abstract:
Data aggregation is a helpful technique for reducing the data communication overhead in a wireless sensor network. One of the important tasks in data aggregation is the positioning of the aggregator points. Although much work has been done on data aggregation, efficient positioning of the aggregator points has received little attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network and propose an algorithm to select the aggregator positions for a scenario in which aggregator nodes are more powerful than sensor nodes.
Keywords: aggregation point, data communication, data aggregation, wireless sensor network
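The authors' algorithm is not specified in the abstract. A toy sketch of the underlying placement objective, with entirely hypothetical coordinates: pick the candidate aggregator location that minimizes total distance to the sensor nodes (a 1-median over a candidate grid):

```python
# Hypothetical sensor layout; the real algorithm in the paper is not shown.
import math

sensors = [(0, 0), (4, 0), (0, 3), (5, 4)]
candidates = [(x, y) for x in range(6) for y in range(5)]

def total_distance(p):
    """Sum of Euclidean distances from candidate p to every sensor."""
    return sum(math.dist(p, s) for s in sensors)

best = min(candidates, key=total_distance)
```

Minimizing the summed distance is a proxy for minimizing the radio energy sensors spend forwarding readings to the aggregator.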
Procedia PDF Downloads 157
36640 An Implicit Methodology for the Numerical Modeling of Locally Inextensible Membranes
Authors: Aymen Laadhari
Abstract:
We present in this paper a fully implicit finite element method tailored for the numerical modeling of inextensible fluidic membranes in a surrounding Newtonian fluid. We consider a highly simplified version of the Canham-Helfrich model for phospholipid membranes, in which the bending force and spontaneous curvature are disregarded. The coupled problem is formulated in a fully Eulerian framework, and the membrane motion is tracked using the level set method. The resulting nonlinear problem is solved by a Newton-Raphson strategy featuring quadratic convergence. A monolithic solver is implemented, and we report several numerical experiments aimed at model validation and at illustrating the accuracy of the proposed method. We show that stability is maintained for significantly larger time steps than with an explicit decoupling method.
Keywords: finite element method, level set, Newton, membrane
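The paper's monolithic Newton solver is not reproduced here; a scalar Newton-Raphson iteration is enough to illustrate the quadratic convergence the abstract refers to, where the error is roughly squared at every step:

```python
# Scalar Newton-Raphson iteration x_{k+1} = x_k - f(x_k)/f'(x_k).
# (Illustrative only; the paper applies Newton to a nonlinear FE system.)
def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:     # converged: update below tolerance
            break
    return x

# Solve x^2 - 2 = 0 starting from x0 = 1; the root is sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

With quadratic convergence the number of correct digits roughly doubles each iteration, so only a handful of Newton steps are needed per time step.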
Procedia PDF Downloads 330
36639 Analysis of Bored Piles with and without Geogrid in a Selected Area in Kocaeli/Turkey
Authors: Utkan Mutman, Cihan Dirlik
Abstract:
In a selected site in the Kocaeli district of Turkey where wastewater is held, bored piles were installed to improve the ground under an aeration basin. In this study, the degree of ground improvement achieved by the bored piles was investigated: the ground was analyzed before and after improvement, and the solution values were obtained by finite element analyses using the Plaxis program. The behavior of the aeration basin on the ground was modeled both with and without geogrid. To check the constructed bored piles on the improved ground, pile continuity and pile load tests were carried out. Taking into consideration both the field data and the dynamic loads in the aeration basin, an analysis was performed in Plaxis, and the analysis results were compared with the data obtained in the field.
Keywords: geogrid, bored pile, soil improvement, Plaxis
Procedia PDF Downloads 268
36638 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia
Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
The Human Development Index (HDI) is a standard measurement of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and these factors do not follow a specific pattern or form, so the HDI data for Indonesia can be modeled with nonparametric regression. The estimated regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline, a modification of segmented polynomial functions. The estimator of a truncated spline regression model is affected by the selection of the optimal knot points, the join points of the truncated spline pieces. The optimal knot points were determined by the minimum value of generalized cross validation (GCV). In this article, a truncated spline nonparametric regression model was applied to the HDI data for Indonesia, and the best model was obtained with the combination of optimal knot points 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors that significantly affected the HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data for Indonesia well.
Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline
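The paper's multi-knot model with GCV selection is not reproduced here. A minimal sketch of the truncated-spline idea, assuming a single known knot k and degree 1: the basis is {1, x, (x - k)_+}, and the coefficients come from the normal equations (toy data, not the HDI dataset):

```python
# Degree-1 truncated spline with one knot k: the (x - k)_+ term lets the
# slope change at the knot while the fit stays continuous.
def trunc_basis(x, k):
    return [1.0, x, max(x - k, 0.0)]

def fit(xs, ys, k):
    """Least-squares coefficients via the normal equations (3x3 solve)."""
    rows = [trunc_basis(x, k) for x in xs]
    m = 3
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(m)] for i in range(m)]
    aty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(m)]
    for c in range(m):                            # elimination with pivoting
        p = max(range(c, m), key=lambda r: abs(ata[r][c]))
        ata[c], ata[p] = ata[p], ata[c]
        aty[c], aty[p] = aty[p], aty[c]
        for r in range(c + 1, m):
            f = ata[r][c] / ata[c][c]
            ata[r] = [u - f * v for u, v in zip(ata[r], ata[c])]
            aty[r] -= f * aty[c]
    beta = [0.0] * m
    for r in range(m - 1, -1, -1):                # back substitution
        beta[r] = (aty[r] - sum(ata[r][j] * beta[j]
                                for j in range(r + 1, m))) / ata[r][r]
    return beta

# Piecewise-linear toy data with a slope change at x = 3: y = x + (x - 3)_+.
xs = [0, 1, 2, 3, 4, 5, 6]
ys = [float(x) if x <= 3 else 3.0 + 2.0 * (x - 3) for x in xs]
beta = fit(xs, ys, k=3.0)
```

In the paper's setting, GCV would be evaluated over candidate knot combinations and the combination with minimum GCV (here 5-5-5-4) selected.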
Procedia PDF Downloads 339
36637 Studying the Schema of Afghan Immigrants about Iranians; A Case Study of Immigrants in Tehran Province
Authors: Mohammad Ayobi
Abstract:
Afghans have been immigrating to Iran for many years, and the re-establishment of the Taliban in Afghanistan caused a new flood of Afghan immigrants to Iran. One of the important issues related to the arrival of Afghan immigrants is the view they hold of Iranians. In this research, we seek to identify the schema that Afghan immigrants living in Iran have of Iranians. A schema is a set of data or generalized knowledge formed in connection with a particular group, person, or nationality, which leads one to perceive a person through pre-determined judgments about certain matters. The schemata held between nationalities have a direct impact on the interactions that form between them and can determine whether proper communication is established between Afghan immigrants and Iranians. For the scientific framing of the research, we use schema theory. The method of this study is qualitative: its data will be collected through semi-structured in-depth interviews and analyzed by thematic analysis. The expected finding is that the schemata Afghan immigrants hold of Iranians are largely negative, portraying Iranians as self-centered and prejudiced toward Afghans and as regarding Afghans merely as laborers.
Keywords: schema study, Afghan immigrants, Iranians, in-depth interview
Procedia PDF Downloads 86
36636 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available for modelling data collected with reference to location in space, from classical spatial econometrics to recent developments for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to look for possible new directions in the processing of count data in a spatial hierarchical Bayesian econometric context.
Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data
Procedia PDF Downloads 593
36635 A NoSQL Based Approach for Real-Time Managing of Robotics's Data
Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir
Abstract:
This paper deals with the continual growth of data, in response to which new data management solutions have emerged: NoSQL databases. They have spread across several areas such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication, and fraud detection. These database management systems are now proliferating: they store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. New intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. Implementing NoSQL for robotics wrestles all the acquired data into usable form, because with ordinary approaches to robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach was demonstrated by experimental studies and by a running example used as a use case.
Keywords: NoSQL databases, database management systems, robotics, big data
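The paper's system is not described in detail in the abstract. As a hedged illustration of the document-store model most NoSQL databases use for schema-less telemetry, here is a plain in-memory Python stand-in (not a real NoSQL engine): records are free-form documents queried by field predicates rather than fixed relational columns.

```python
# In-memory stand-in for a document store: heterogeneous robot telemetry
# records coexist without a shared schema (all field names are invented).
store = []

def insert(doc):
    store.append(doc)

def find(predicate):
    """Return every document satisfying the predicate."""
    return [d for d in store if predicate(d)]

insert({"robot": "r1", "sensor": "lidar", "range_m": 3.2, "t": 0.0})
insert({"robot": "r1", "sensor": "imu", "accel": [0.1, 0.0, 9.8], "t": 0.1})
insert({"robot": "r2", "sensor": "lidar", "range_m": 7.9, "t": 0.1})

# Query: lidar readings closer than 5 m.
close_hits = find(lambda d: d.get("sensor") == "lidar" and d["range_m"] < 5)
```

The point of the schema-less design is that the IMU record and the lidar records need not share columns, which is awkward to express in a fixed relational schema.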
Procedia PDF Downloads 354
36634 Delisting Wave: Corporate Financial Distress, Institutional Investors Perception and Performance of South African Listed Firms
Authors: Adebiyi Sunday Adeyanju, Kola Benson Ajeigbe, Fortune Ganda
Abstract:
In the past three decades, there has been a notable increase in the number of firms delisting from the Johannesburg Stock Exchange (JSE) in South Africa, and this recent wave of delistings motivated the study. The study explores the influence of institutional investors' perceptions on the financial distress experienced by delisted firms within the South African market, and further examines the impact of financial distress on the corporate performance of those firms. Using data on delisted firms spanning 2000 to 2023, FGLS (Feasible Generalized Least Squares) was applied for the short-run effects and PCSE (Panel-Corrected Standard Errors) for the long-run effects of the relationship. The findings indicated that a decline in institutional investors' perceptions was associated with the corporate financial distress of the delisted firms, particularly during the delisting year and the few years preceding the delisting announcement. The study thus underscores the importance of investor recognition in corporate financial distress and the delisting wave among listed firms, a finding that supports stakeholder theory. It offers insights for company management, investors, governments, policymakers, stockbrokers, lending institutions, bankers, stock markets, and other stakeholders in their decision-making. Based on these findings, it is recommended that corporate management improve governance strategies that can support companies' financial performance. Accountability and transparency through governance must also be improved, with government support through policies, strategies, and an enabling environment that helps companies perform better.
Keywords: delisting wave, institutional investors, financial distress, corporate performance, investors' perceptions
Procedia PDF Downloads 45
36633 The Benefit of a Universal Screening Program for Lipid Disorders in Two to Ten Years Old Lebanese Children
Authors: Nicolas Georges, Akiki Simon, Bassil Naim, Nawfal Georges, Abi Fares Georges
Abstract:
Introduction: Dyslipidemia has been recognized as a risk factor for cardiovascular diseases. While the development of atherosclerotic lesions begins in childhood and progresses throughout life, data on the prevalence of dyslipidemia among children in Lebanon are lacking. Objectives: This study was conducted to assess the benefit of a protocol for universal screening for lipid disorders in Lebanese children aged two to ten years. Materials and Methods: A total of four hundred children aged 2 to 10 years (51.5% boys) were included in the study. The subjects were recruited from private pediatric clinics after parental consent. Fasting total cholesterol (TC), triglycerides (TG), low-density lipoprotein (LDL), and high-density lipoprotein (HDL) levels were measured, and non-HDL cholesterol was calculated. The values were categorized according to the 2011 Expert Panel Integrated Guidelines for Cardiovascular Health and Risk Reduction in Children and Adolescents. Results: The overall prevalence of high TC ( ≥ 200 mg/dL), high non-HDL-C ( ≥ 145 mg/dL), high LDL ( ≥ 130 mg/dL), high TG ( ≥ 100 mg/dL) and low HDL ( < 40 mg/dL) was respectively 19.5%, 23%, 19%, 31.8% and 20%. The overall frequency of dyslipidemia was 51.7%. In a bivariate analysis, dyslipidemia in children was associated with a BMI ≥ 95ᵗʰ percentile and with parents having TC > 240 mg/dL, with P values of 0.006 and 0.0001 respectively. Furthermore, high TG was independently associated with a BMI ≥ 95ᵗʰ percentile (P=0.0001). Having parents with TC > 240 mg/dL was significantly correlated with high TC, high non-HDL-C and high LDL (P=0.0001 for all variables). Finally, according to the pediatric dyslipidemia screening guidelines from the 2011 Expert Panel, 62.3% of dyslipidemic children had at least 1 risk factor that qualified them for screening, while 37.7% of them had no risk factor.
Conclusions: It would be preferable to revisit the latest pediatric dyslipidemia screening guidelines in favor of a universal screening program, since more than a third of the dyslipidemic Lebanese children in this study would have been missed by risk-based screening.
Keywords: cardiovascular risk factors, dyslipidemia, Lebanese children, screening
Procedia PDF Downloads 231
36632 The Effect of CPU Location in Total Immersion of Microelectronics
Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson
Abstract:
Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead, so energy use can be reduced by improving cooling efficiency. Both air and liquid can be used as cooling media for the data centre. Traditional data centre cooling systems use air; however, liquid is recognised as a promising method that can handle the more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger, and full immersion of the microelectronics. This study quantifies the improvements in heat transfer for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the sealed enclosure filled with dielectric liquid, and forced convection of the water pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% when the CPU and heat sink are placed at the bottom of the microelectronics enclosure.
Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures
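The paper's Nu values come from a validated CFD model. For orientation only, a quick hand estimate of laminar natural convection on a vertical surface uses the classical correlation Nu = 0.59 * Ra^(1/4) (valid roughly for 1e4 < Ra < 1e9); all property values below are made-up placeholders, not data from the paper:

```python
# Hand estimate of a natural-convection Nusselt number via Nu = 0.59 Ra^0.25.
# Fluid properties are hypothetical stand-ins for a dielectric coolant.
g = 9.81          # gravity, m/s^2
beta = 2e-4       # thermal expansion coefficient, 1/K (assumed)
visc = 5e-6       # kinematic viscosity, m^2/s (assumed)
diff = 8e-8       # thermal diffusivity, m^2/s (assumed)
dT, L = 20.0, 0.1 # wall-fluid temperature difference (K), enclosure height (m)

Ra = g * beta * dT * L**3 / (visc * diff)   # Rayleigh number
Nu = 0.59 * Ra**0.25                        # average Nusselt number
h = Nu * 0.06 / L                           # heat transfer coeff., k = 0.06 W/m/K (assumed)
```

Such a correlation only gives an order-of-magnitude check; the enclosure geometry and the coupling to the forced water loop are why the paper resorts to a full numerical model.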
Procedia PDF Downloads 272