Search results for: electronic data interchange
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26313

25113 Enhancing National Integrity through Teaching Secular Perspectives in Medieval Indian History Curricula: A Secular Paradigm

Authors: Deepak Deshpande, Vikas Minchekar

Abstract:

Communal forces in modern India grow stronger day by day, with caste groups seeking to demonstrate their strength through massive marches and rallies that undermine national integrity. The present investigation was carried out to test this assumption. The research used survey techniques and was conducted in two phases: in the first, students' attitudes were collected; in the second, the views of members of the historical association were gathered. The social dominance orientation scale and the sources of social dominance inventory were administered to 200 college students belonging to the Maratha caste. The analyzed data revealed a high level of social dominance among Maratha caste students. Approximately 80 percent of the students reported that they had learned such dominance from medieval history; the other sources appeared far less prominent. These results, together with the present Indian social situation, were communicated to the members of the historical association of India, the majority of whom agreed with this reality. Consensus was also reached that Maratha caste persons experience dominance owing to a misinterpretation of the empire of King Shivaji, encouraged by politicians. The SurveyMonkey app was used through electronic mail to collect views on 'The attitude towards the modification of curricula' questionnaire. Most members of the historical association agreed to teach medieval Indian history according to secular perspectives.

Keywords: social dominance orientation, secular perspective, national integrity, Maratha caste, medieval Indian history

Procedia PDF Downloads 259
25112 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which means integrating data from various different sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, so the number of applications based on Open Data is increasing. However, each government has its own procedures for publishing its data, and this causes a variety of data set formats because there are no international standards specifying the formats of Open Data sets. Due to this variety, we must build a data integration process that can put together all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, and wind speed. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time, and other governments (such as Andalucia or Bilbao) have published similar Open Data sets. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can be initiated in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language as a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance, and there are also R libraries, such as shiny, for building graphic interfaces. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain to any developer, so that they can build their own applications.
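For illustration of the per-source adapter pattern such an automated integration implies (the paper's own implementations are in Java/Oracle and R), here is a minimal Python/pandas sketch; the file formats and column names are hypothetical stand-ins for two government portals:

```python
import pandas as pd

# Each portal publishes a different layout, so each adapter maps its
# raw export onto one common schema. Column names are hypothetical.
COMMON_COLUMNS = ["timestamp", "station", "indicator", "value", "unit"]

def load_madrid(path: str) -> pd.DataFrame:
    raw = pd.read_csv(path, sep=";")
    return pd.DataFrame({
        "timestamp": pd.to_datetime(raw["FECHA"], dayfirst=True),
        "station": raw["ESTACION"],
        "indicator": raw["MAGNITUD"],
        "value": pd.to_numeric(raw["VALOR"], errors="coerce"),
        "unit": raw["UNIDAD"],
    })

def load_bilbao(path: str) -> pd.DataFrame:
    raw = pd.read_json(path)
    return pd.DataFrame({
        "timestamp": pd.to_datetime(raw["date"]),
        "station": raw["sensorId"],
        "indicator": raw["measure"],
        "value": pd.to_numeric(raw["reading"], errors="coerce"),
        "unit": raw["units"],
    })

def integrate(frames):
    """Concatenate per-source frames already mapped to the common schema."""
    merged = pd.concat(frames, ignore_index=True)[COMMON_COLUMNS]
    return merged.dropna(subset=["value"]).sort_values("timestamp")

# integrated = integrate([load_madrid("madrid.csv"), load_bilbao("bilbao.json")])
```

Each new government then only requires one additional adapter, which is the property that removes the data scientist from the loop.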

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 315
25111 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 75
25110 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques are used to reduce data size and thereby improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets using the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload on the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
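As a reference for the rule the method builds on, the following is a minimal sequential sketch of the FCNN rule in Python; the paper's contribution is the MapReduce parallelisation around it, and the details below are illustrative rather than the authors' code:

```python
import numpy as np

def fcnn(X, y):
    """Sequential FCNN sketch: grow a consistent subset S, adding for each
    element of S a representative of the misclassified points whose nearest
    prototype it is, until no point is misclassified by 1-NN on S."""
    S = []
    # Seed with the training point closest to each class centroid.
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        centroid = X[idx].mean(axis=0)
        S.append(idx[np.argmin(np.linalg.norm(X[idx] - centroid, axis=1))])
    S = list(dict.fromkeys(S))
    while True:
        # Nearest element of S for every training point (O(n*|S|) per pass).
        d = np.linalg.norm(X[:, None, :] - X[S][None, :, :], axis=2)
        owner = np.argmin(d, axis=1)                    # index into S
        misclassified = y != y[np.array(S)][owner]
        additions = set()
        for j, s in enumerate(S):
            cell = np.where((owner == j) & misclassified)[0]
            if cell.size:   # representative: misclassified point nearest to s
                additions.add(cell[np.argmin(np.linalg.norm(X[cell] - X[s], axis=1))])
        if not additions:
            return np.array(S)   # consistent subset reached
        S.extend(additions)
```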

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 253
25109 Occurrence of Half-Metallicity by Sb-Substitution in Non-Magnetic Fe₂TiSn

Authors: S. Chaudhuri, P. A. Bhobe

Abstract:

Fe₂TiSn is a non-magnetic full Heusler alloy with a small gap (~ 0.07 eV) at the Fermi level. The electronic structure is highly symmetric in both spin bands, and a small percentage of hole or electron substitution can push the system towards spin polarization. A stable 100% spin polarization, or half-metallicity, is very desirable in the field of spintronics, making Fe₂TiSn a highly attractive material. However, this composition suffers from an inherent anti-site disorder between the Fe and Ti sites. This paper reports on the method adopted to control the anti-site disorder and the realization of the half-metallic ground state in Fe₂TiSn, achieved by chemical substitution. Here, Sb was substituted at the Sn site to obtain Fe₂TiSn₁₋ₓSbₓ compositions with x = 0, 0.1, 0.25, 0.5 and 0.6. All prepared compositions with x ≤ 0.6 exhibit long-range L2₁ ordering and a decrease in Fe-Ti anti-site disorder. The transport and magnetic properties of the Fe₂TiSn₁₋ₓSbₓ compositions were investigated as a function of temperature in the range 5 K to 400 K. Electrical resistivity, magnetization, and Hall voltage measurements were carried out. All the experimental results indicate the presence of the half-metallic ground state in the x ≥ 0.25 compositions. However, the value of saturation magnetization is small, indicating the presence of compensated magnetic moments. The observed magnetic moment values are in close agreement with the Slater-Pauling rule for half-metallic systems. Magnetic interactions in Fe₂TiSn₁₋ₓSbₓ are understood from the local crystal structural perspective using extended X-ray absorption fine structure (EXAFS) spectroscopy. The changes in bond distances extracted from the EXAFS analysis can be correlated with the hybridization between constituent atoms and hence the RKKY-type magnetic interactions that govern the magnetic ground state of these alloys. To complement the experimental findings, first-principles electronic structure calculations were also undertaken. The spin-polarized DOS complies with the experimental results for Fe₂TiSn₁₋ₓSbₓ. Substitution of Sb (an electron-excess element) at the Sn site shifts the majority spin band to the lower-energy side of the Fermi level, thus making the system 100% spin polarized and inducing long-range magnetic order in otherwise non-magnetic Fe₂TiSn. The present study concludes that a stable half-metallic system can be realized in Fe₂TiSn with ≥ 50% Sb substitution at the Sn site.
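The compensated moments reported above can be checked against the standard Slater-Pauling relation for full Heusler alloys, m = Zₜ − 24; the arithmetic below, using the usual valence-electron counts (Fe 8, Ti 4, Sn 4, Sb 5), is our own illustration rather than the authors' calculation:

```latex
% Slater-Pauling rule for full Heusler alloys (moment in Bohr magnetons per f.u.)
\begin{align*}
  Z_t\!\left(\mathrm{Fe_2TiSn_{1-x}Sb_x}\right) &= 2(8) + 4 + 4(1-x) + 5x = 24 + x,\\
  m &= Z_t - 24 = x\,\mu_B \ \text{per formula unit.}
\end{align*}
```

For x = 0.5 this gives only 0.5 μB per formula unit, consistent with the small saturation magnetization observed.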

Keywords: antisite disorder, EXAFS, Full Heusler alloy, half metallic ferrimagnetism, RKKY interactions

Procedia PDF Downloads 139
25108 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world's data is generated by the financial services industry, with global non-cash transactions estimated at 708.5 billion. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos and centralised between the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data, and businesses looking to consume data, are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. The following research provides a solution framework that aims to deliver a secure decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transactional-related data from merchants, enriching the available data product to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers. This core component is developed on a decentralised blockchain contract, with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features of user interactions on the platform. One of the platform's key features is enabling individuals to participate in and manage the personal data they generate. The framework was developed into a proof of concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
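As a sketch of the three roles the marketplace connects, here is a plain-Python stand-in for the market layer; the actual proof of concept runs as an Ethereum smart contract, and every name below is a hypothetical illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Listing:
    provider: str            # data provider (e.g. a financial institution)
    dataset_id: str
    price: float
    consented_subjects: set = field(default_factory=set)

class Marketplace:
    """Plain-Python model of the on-chain market layer: providers list
    data, subjects grant consent, consumers purchase consented slices."""
    def __init__(self):
        self.listings, self.grants = {}, []

    def list_data(self, listing: Listing):
        self.listings[listing.dataset_id] = listing

    def consent(self, dataset_id: str, subject: str):
        self.listings[dataset_id].consented_subjects.add(subject)

    def purchase(self, dataset_id: str, consumer: str) -> float:
        lst = self.listings[dataset_id]
        # Only rows whose data subjects opted in are released to the buyer.
        self.grants.append((consumer, dataset_id, frozenset(lst.consented_subjects)))
        return lst.price
```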

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 73
25107 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions; in these cases, a more advanced representation is essential. In this paper we present the design of a compact version of the ternary tree data structure and demonstrate the results of an experimental evaluation using the static dictionary problem. We compare these results with the results for binary and regular ternary trees. The conducted evaluation study shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We have evaluated the performance of the algorithms on both 32 and 64 bit operating systems.
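For context, a conventional pointer-based ternary search tree for the static dictionary problem looks like the Python sketch below; a succinct design such as the paper's replaces the three pointers per node with bit-vector encodings supporting rank/select, which is where the memory savings come from. The sketch is a baseline illustration, not the paper's design:

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "terminal")   # keep nodes small
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.terminal = ch, None, None, None, False

def insert(node, word, i=0):
    """Insert a non-empty string, branching on character comparison."""
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.terminal = True
    return node

def contains(node, word):
    """Iterative lookup; follows lo/hi on mismatch, eq on match."""
    i = 0
    while node is not None:
        ch = word[i]
        if ch < node.ch:
            node = node.lo
        elif ch > node.ch:
            node = node.hi
        elif i + 1 < len(word):
            node, i = node.eq, i + 1
        else:
            return node.terminal
    return False

# root = None
# for w in ("cat", "cap", "dog"):
#     root = insert(root, w)
# contains(root, "cap")   # True
```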

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 160
25106 Ensuring Cyber Security Using Kippo Honeypots

Authors: S. Vivekananda Pandian

Abstract:

A major challenge in the current scenario is protecting computers and other electronic gadgets against cyber-attacks. In the current era, cyber warfare has become a major threat to the entire world; it targets a particular organization or country by spreading malware, breaching security, and causing major losses to the organization. Several sectors, both public and private, are computerized, and sectors such as energy, oil refining, defense, and aviation are prone to attacks. Several attacks go unnoticed while accessing the internet. Kippo honeypots are used to study the characteristics and intentions of attackers. Honeypots are traps set by us that enable the monitoring of malicious activities and a detailed study of attackers, which leads to a strengthening of security.

Keywords: attackers, security, Kippo Honeypots, virtual machine

Procedia PDF Downloads 427
25105 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
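A minimal sketch of the quantile-regression idea in the title, using scikit-learn on synthetic data: fitting an upper quantile of demand gives a provisioning level that covers the SLA with high probability while staying far below blanket over-provisioning. The features and numbers are toy stand-ins, not the paper's models:

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
# Toy stand-in for a data-centre trace: predict next-hour CPU demand
# from recent load; real features would come from monitoring data.
X = rng.uniform(0, 100, size=(500, 1))             # recent average load (%)
y = 0.8 * X[:, 0] + rng.gamma(2.0, 4.0, size=500)  # bursty demand

# Fit an upper quantile so provisioned capacity covers demand with
# ~95% probability -- conservative enough to protect the SLA, far
# leaner than static over-provisioning.
q95 = QuantileRegressor(quantile=0.95, alpha=0.0).fit(X, y)
provision = q95.predict([[60.0]])
print(f"capacity to provision at 60% load: {provision[0]:.1f}")
```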

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 108
25104 Advanced Techniques in Semiconductor Defect Detection: An Overview of Current Technologies and Future Trends

Authors: Zheng Yuxun

Abstract:

This review critically assesses the advancements and prospective developments in defect detection methodologies within the semiconductor industry, an essential domain that significantly affects the operational efficiency and reliability of electronic components. As semiconductor devices continue to decrease in size and increase in complexity, the precision and efficacy of defect detection strategies become increasingly critical. Tracing the evolution from traditional manual inspections to the adoption of advanced technologies employing automated vision systems, artificial intelligence (AI), and machine learning (ML), the paper highlights the significance of precise defect detection in semiconductor manufacturing by discussing various defect types, such as crystallographic errors, surface anomalies, and chemical impurities, which profoundly influence the functionality and durability of semiconductor devices, underscoring the necessity for their precise identification. The narrative transitions to the technological evolution in defect detection, depicting a shift from rudimentary methods like optical microscopy and basic electronic tests to more sophisticated techniques including electron microscopy, X-ray imaging, and infrared spectroscopy. The incorporation of AI and ML marks a pivotal advancement towards more adaptive, accurate, and expedited defect detection mechanisms. The paper addresses current challenges, particularly the constraints imposed by the diminutive scale of contemporary semiconductor devices, the elevated costs associated with advanced imaging technologies, and the demand for rapid processing that aligns with mass production standards. A critical gap is identified between the capabilities of existing technologies and the industry's requirements, especially concerning scalability and processing velocities. Future research directions are proposed to bridge these gaps, suggesting enhancements in the computational efficiency of AI algorithms, the development of novel materials to improve imaging contrast in defect detection, and the seamless integration of these systems into semiconductor production lines. By offering a synthesis of existing technologies and forecasting upcoming trends, this review aims to foster the dialogue and development of more effective defect detection methods, thereby facilitating the production of more dependable and robust semiconductor devices. This thorough analysis not only elucidates the current technological landscape but also paves the way for forthcoming innovations in semiconductor defect detection.

Keywords: semiconductor defect detection, artificial intelligence in semiconductor manufacturing, machine learning applications, technological evolution in defect analysis

Procedia PDF Downloads 52
25103 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information known as a watermark into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks.
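A minimal sketch of the generic hybrid DWT-DCT embedding the title refers to, using PyWavelets and SciPy; the subband, coefficient band, and strength alpha are illustrative choices, not the paper's exact scheme:

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed_watermark(image, bits, alpha=10.0):
    """Embed watermark bits in the DCT of the DWT approximation subband.
    Illustrative parameters: one-level Haar DWT, mid-band DCT coefficients,
    additive strength +/- alpha per bit."""
    LL, detail = pywt.dwt2(image.astype(float), "haar")
    C = dctn(LL, norm="ortho")
    flat = C.ravel()                       # view: edits write back into C
    start = flat.size // 4                 # skip low-frequency coefficients
    for k, bit in enumerate(bits):
        flat[start + k] += alpha if bit else -alpha
    return pywt.idwt2((idctn(C, norm="ortho"), detail), "haar")

# marked = embed_watermark(gray_image, [1, 0, 1, 1])
```

Embedding in the transform domain rather than directly in pixels is what gives such schemes their robustness to compression and filtering attacks.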

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 423
25102 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture created using AWS services that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using terraform, cdk, cloudformation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 64
25101 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experiments with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well these representations reduce the cost for the correct correspondence relative to other possible matches.
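As an example of the kind of cost function under study, here is the sum of absolute differences (SAD), one common local matching cost, written so that the image representation (RGB, grayscale, YCbCr, ...) is simply the array passed in; this is an illustrative sketch, not the paper's code:

```python
import numpy as np

def sad_cost(left, right, row, col, disparity, window=3):
    """Sum-of-absolute-differences matching cost for one candidate disparity.
    `left`/`right` are H x W x C arrays in whatever representation is being
    tested; how the channel layout C changes the cost's discriminative power
    is exactly the question the study addresses. Windows assumed in-bounds."""
    h = window // 2
    patch_l = left[row - h:row + h + 1, col - h:col + h + 1]
    patch_r = right[row - h:row + h + 1, col - disparity - h:col - disparity + h + 1]
    return np.abs(patch_l.astype(float) - patch_r.astype(float)).sum()

# Winner-takes-all disparity for one pixel: the candidate with minimal cost.
# d_best = min(range(max_d), key=lambda d: sad_cost(L, R, r, c, d))
```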

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 370
25100 QIP: Introducing a Dedicated Ozurdex Clinic

Authors: Vaisnavy Govindasamy, Saba Ishrat

Abstract:

Introduction: The dexamethasone intravitreal implant 0.7 mg (Ozurdex™, Allergan®) is a biodegradable corticosteroid implant approved by the FDA for managing diabetic macular edema (DMO), macular edema following branch retinal vein occlusion (BRVO) or central retinal vein occlusion (CRVO), and posterior segment non-infectious uveitis. The implant releases dexamethasone over a six-month period, exhibiting peak effectiveness between 60 and 90 days post-administration. The intravitreal injection should be performed under sterile conditions. At James Cook University Hospital (JCUH), Ozurdex injections are currently administered in the vitreo-retinal (VR) theatre. This study aimed to evaluate the feasibility and potential advantages of establishing a dedicated clinic for Ozurdex administration separate from the VR theatre setting. Method: Data on all Ozurdex injections administered between October 2021 and October 2022 were collected retrospectively from operating theatre registers at JCUH. Data on the indications for Ozurdex, waiting times from referral date to date of injection, theatre time consumed, and post-injection complications were collected from electronic notes. The resources needed to establish a dedicated Ozurdex clinic were evaluated. Over a six-month period from October 2023 to March 2024, we also gathered data on the utilisation of theatre 28. Results: A total of 135 Ozurdex injections were administered. Among the indications, uveitis represented 47.3% of cases, DMO 23.6%, and RVO 22.9%; the remaining cases lacked sufficient data. Each Ozurdex injection consumed 15 minutes on the VR theatre list. Complications arose in 5% of injections (7 cases), including glaucoma, ocular hypertension, subconjunctival haemorrhage, and implant migration. Waiting times averaged 6 weeks from referral date to procedure date. We also found that, on average, theatre 28 was offered but remained unused for 4 days, totalling eight sessions, each month. Analysis: Establishing a separate Ozurdex clinic would improve the quality of patient care in the following ways: 1. Decrease injection waiting times (currently averaging 6 weeks), leading to better visual outcomes. 2. Free up approximately three hours of VR theatre time each month, allowing for 3-4 additional surgeries, reducing waiting times for critical retinal surgeries and enhancing visual outcomes. 3. Provide additional training opportunities for trainees and retina fellows, improving their skills. 4. Optimise the use of currently underutilised theatre slots (theatre 28). Conclusion: These findings support the implementation of a separate clinic for administering Ozurdex injections at JCUH. It is evident that introducing a dedicated clinic will enhance operational efficiency, optimise resource utilisation, and improve the overall quality of care for patients undergoing this treatment.

Keywords: ophthalmology, Ozurdex, efficiency, complication

Procedia PDF Downloads 21
25099 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high volumes and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods in order to develop their business. The recent decentralized data-management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 432
25098 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing/noise data. When applying this data mining tool in real applications, running speed is important; the software employs table look-up techniques to achieve reasonable running speed, based on performance testing results. We added several advanced features for application in an industrial chip design.

Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise

Procedia PDF Downloads 254
25097 Electronic Resources and Information Literacy in Higher Education Library

Authors: Nirmal Singh, Rajesh Kumar

Abstract:

Information literacy aims to develop both critical understanding and active participation in scholars. It enables scholars to interpret and make informed judgments as users of information sources, and it also enables them to become producers of information in their own right, thereby becoming more powerful participants in society. Information literacy is about developing people's critical and creative abilities. Digital media, and particularly the Internet, significantly increase the potential for such active participation of the individual, provided scholars have the means and training to access and use them effectively. This paper provides a definition, standards, and the importance of information literacy (IL).

Keywords: information literacy, digital media, training, communications technologies

Procedia PDF Downloads 159
25096 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (highD), which provides microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the highD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R to process these data. We analyze the involvement and relationships of the different variables for the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver undertaken (overtaking or falling back).
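A sketch of how such lane-change events can be extracted from the highD tracks files; the paper works in R, so this Python/pandas version is purely illustrative, and the column names assume the public highD format (verify against the dataset documentation):

```python
import pandas as pd

# Assumed highD tracks columns: id, frame, laneId, dhw, thw,
# xVelocity, precedingXVelocity.
tracks = pd.read_csv("01_tracks.csv").sort_values(["id", "frame"])

# A lane change appears as a change of laneId within one vehicle track.
changed = tracks.groupby("id")["laneId"].diff().fillna(0).ne(0)
events = tracks[changed].copy()

# Variables the study relates to the manoeuvre: headway distance (dhw),
# time gap (thw) and speed difference to the preceding vehicle.
events["dv_preceding"] = events["precedingXVelocity"] - events["xVelocity"]
print(len(events), "lane-change events")
print(events[["dhw", "thw", "dv_preceding"]].describe())
```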

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 261
25095 Dimensionality Control of Li Transport by MOFs Based Quasi-Solid to Solid Electrolyte

Authors: Manuel Salado, Mikel Rincón, Arkaitz Fidalgo, Roberto Fernandez, Senentxu Lanceros-Méndez

Abstract:

Lithium-ion batteries (LIBs) are a promising technology for energy storage, but they suffer from safety concerns due to the use of flammable organic solvents in their liquid electrolytes. Solid-state electrolytes (SSEs) offer a potential solution to this problem, but they have their own limitations, such as poor ionic conductivity and high interfacial resistance. The aim of this research was to develop a new type of SSE based on metal-organic frameworks (MOFs) and ionic liquids (ILs). MOFs are porous materials with high surface area and tunable electronic properties, making them ideal for use in SSEs. ILs are liquid electrolytes that are non-flammable and have high ionic conductivity. A series of MOFs were synthesized, and their electrochemical properties were evaluated. The MOFs were then infiltrated with ILs to form quasi-solid (gel) and solid (xerogel) SSEs. The ionic conductivity, interfacial resistance, and electrochemical performance of the SSEs were characterized. The results showed that the MOF-IL SSEs had significantly higher ionic conductivity and lower interfacial resistance than conventional SSEs. The SSEs also exhibited excellent electrochemical performance, with high discharge capacity and long cycle life. The development of MOF-IL SSEs represents a significant advance in the field of solid-state electrolytes. The high ionic conductivity and low interfacial resistance of the SSEs make them promising candidates for use in next-generation LIBs. The data for this research were collected using a variety of methods, including X-ray diffraction, scanning electron microscopy, and electrochemical impedance spectroscopy. The data were analyzed using a variety of statistical and computational methods, including principal component analysis, density functional theory, and molecular dynamics simulations. The main question addressed by this research was whether MOF-IL SSEs could be developed that have high ionic conductivity, low interfacial resistance, and excellent electrochemical performance. The results demonstrate that MOF-IL SSEs are a promising new type of solid-state electrolyte for use in LIBs: they have high ionic conductivity, low interfacial resistance, and excellent electrochemical performance. These properties make them promising candidates for next-generation LIBs that are safer and have higher energy densities.

Keywords: energy storage, solid-electrolyte, ionic liquid, metal-organic-framework, electrochemistry, organic inorganic plastic crystal

Procedia PDF Downloads 83
25094 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in a Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) and in-plane and cross-plane profiles of the Varian golden beam data to the measured data of 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate MUs and dose distributions. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic data are reliable enough for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semiflex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for commissioning the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. In PDDs, the deviation increases at deeper depths; similarly, the profiles show increasing deviation at large field sizes and greater depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
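A deviation check of this kind reduces to a point-by-point percentage difference between the two curves; the sketch below illustrates one plausible form of the comparison (the exact metric is not specified in the abstract, and the numbers are invented):

```python
import numpy as np

def percent_deviation(measured, golden):
    """Point-by-point deviation between measured and golden-beam PDD curves,
    expressed as a percentage of the golden value at each depth."""
    measured, golden = np.asarray(measured, float), np.asarray(golden, float)
    return 100.0 * (measured - golden) / golden

# Toy 6 MV PDD values at a few depths (illustrative numbers only).
golden   = np.array([100.0, 86.5, 66.9, 51.3])   # dmax, 5, 10, 15 cm
measured = np.array([100.0, 86.9, 67.8, 52.1])
dev = percent_deviation(measured, golden)
assert np.all(np.abs(dev) <= 2.0), "outside the 2% commissioning tolerance"
```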

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 489
25093 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, which are enhanced with gradient data of different degrees of accuracy, are constructed. Then these Gradient enhanced Kriging surrogate models are strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest fidelity data. While, intuitively, gradient data is useful to enhance the accuracy of surrogate models, the primary motivation behind this work is to investigate if it is also worthwhile incorporating gradient data of varying degrees of accuracy.
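For orientation, a minimal zero-mean (simple) Kriging predictor with a squared-exponential correlation model is sketched below; the paper's recursive CoKriging couples several such gradient-enhanced models across fidelities, which this illustration omits:

```python
import numpy as np

def kriging_fit_predict(X, y, Xq, length=1.0, noise=1e-10):
    """Zero-mean Kriging (GP regression) with a squared-exponential kernel.
    `noise` is a small jitter for numerical stability, not a data model."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = k(X, X) + noise * np.eye(len(X))
    w = np.linalg.solve(K, y)          # kernel weights fitted to the data
    return k(Xq, X) @ w                # predictive mean at the query points

X = np.linspace(0, 1, 8)[:, None]
y = np.sin(6 * X[:, 0])
Xq = np.linspace(0, 1, 100)[:, None]
pred = kriging_fit_predict(X, y, Xq)
```

In a recursive CoKriging scheme, a model like this is first fitted to the cheap low-fidelity data, and a second one is then fitted to the discrepancy against the scarce high-fidelity data.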

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 558
25092 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to the actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance in various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.
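A sketch of what synthetic-to-real augmentation of the barcodes themselves can look like; the transform set (illumination, sensor noise, defocus blur) is an illustrative subset and not the paper's pipeline, which also augments the full scene images:

```python
import numpy as np
from scipy.signal import convolve2d

def augment(barcode, rng):
    """Perturb a clean synthetic barcode toward realistic capture conditions."""
    img = barcode.astype(float)
    img *= rng.uniform(0.6, 1.3)                           # illumination shift
    img += rng.normal(0.0, rng.uniform(2, 10), img.shape)  # sensor noise
    if rng.random() < 0.5:                                 # mild defocus blur
        img = convolve2d(img, np.ones((3, 3)) / 9.0,
                         mode="same", boundary="symm")
    return np.clip(img, 0, 255).astype(np.uint8)

# aug = augment(clean_barcode, np.random.default_rng(0))
```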

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 169
25091 First and Second Order Gm-C Filters

Authors: Rana Mahmoud

Abstract:

This paper presents a systematic study of operational transconductance amplifier-capacitor (OTA-C) filters, often called Gm-C filters. OTA-C filters have received great attention over the last decades. As Gm-C filters operate in an open-loop topology, they are flexible enough to perform at both low and high frequencies; as such, Gm-C filters can be used in various wireless communication applications. Another property of Gm-C filters is electronic tunability: different filter frequency characteristics can be obtained without changing inductance and resistance values. This is achieved with an OTA (operational transconductance amplifier) and a capacitor. By tuning the OTA transconductance, the cut-off frequency is tuned and different frequency responses are achieved. Various high-order analog filters can be designed using Gm-C filters, including low-pass, high-pass, and band-pass filters. First- and second-order low-pass, high-pass, and band-pass filters are presented in this paper.
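The electronic tunability described here follows directly from the first-order Gm-C low-pass relation f_c = g_m / (2πC), as the quick calculation below illustrates:

```python
import math

def gm_c_cutoff_hz(gm_siemens: float, c_farads: float) -> float:
    """First-order Gm-C low-pass cutoff: f_c = gm / (2*pi*C).
    Tuning gm electronically moves f_c without changing the capacitor."""
    return gm_siemens / (2 * math.pi * c_farads)

# Example: gm = 50 uS driving a 10 pF integrator capacitor.
print(f"{gm_c_cutoff_hz(50e-6, 10e-12) / 1e6:.2f} MHz")   # ~0.80 MHz
```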

Keywords: Gm-C, filters, low-pass, high-pass, band-pass

Procedia PDF Downloads 131
25090 The Impact of Pediatric Care, Infections and Vaccines on Community and People’s Lives

Authors: Nashed Atef Nashed Farag

Abstract:

Introduction: Reporting adverse events following vaccination remains a challenge. The WHO has mandated pharmacovigilance centers around the world to submit Adverse Events Following Immunization (AEFI) reports from different countries to VigiBase, a large electronic database of adverse drug event data. Despite sufficient information about AEFIs in VigiBase, it is not available to the general public. However, the WHO has an alternative website, VigiAccess, an open-access site that serves as an archive of reported adverse reactions and AEFIs. The aim of the study was to establish a reporting model for a number of commonly used vaccines in the VigiAccess system. Methods: On February 5, 2018, VigiAccess was comprehensively searched for AEFI reports on the measles vaccine, oral polio vaccine (OPV), yellow fever vaccine, pneumococcal vaccine, rotavirus vaccine, meningococcal vaccine, tetanus vaccine, and tuberculosis (BCG) vaccine. These are reports from all pharmacovigilance centers around the world since they joined the WHO Drug Monitoring Program. Results: After an extensive search, VigiAccess returned 9,062 AEFIs for the measles vaccine, 185,829 for the OPV vaccine, 24,577 for the yellow fever vaccine, 317,208 for the pneumococcal vaccine, 73,513 for the rotavirus vaccine, 145,447 for the meningococcal vaccine, 22,781 for the tetanus vaccine, and 35,556 for the BCG vaccine. Conclusion: The study found that among the eight vaccines examined, pneumococcal vaccines were associated with the highest number of AEFIs, while measles vaccines were associated with the fewest.

Keywords: adverse events following immunization, vaccines, adverse reactions, VigiAccess, adverse event reporting

Procedia PDF Downloads 72
25089 Utility of CT Perfusion Imaging for Diagnosis and Management of Delayed Cerebral Ischaemia Following Subarachnoid Haemorrhage

Authors: Abdalla Mansour, Dan Brown, Adel Helmy, Rikin Trivedi, Mathew Guilfoyle

Abstract:

Introduction: Diagnosing delayed cerebral ischaemia (DCI) following aneurysmal subarachnoid haemorrhage (SAH) can be challenging, particularly in poor-grade patients. Objectives: This study sought to assess the value of routine CTP in identifying (or excluding) DCI and in guiding management. Methods: An eight-year retrospective neuroimaging study at a large UK neurosurgical centre. Subjects included a random sample of adult patients with confirmed aneurysmal SAH who had a CTP scan during their inpatient stay over an eight-year period (May 2014 - May 2022). Data were collected through the electronic patient record and PACS. Variables included age, WFNS scale, aneurysm site, treatment, timing of CTP, radiologist report, and DCI management. Results: Over eight years, 916 patients were treated for aneurysmal SAH; this study focused on 466 patients who were randomly selected. Of this sample, 181 (38.84%) had one or more CTP scans following brain aneurysm treatment (318 in total). The first CTP scan in each patient was performed 1-20 days after ictus (median 4 days). There was radiological evidence of DCI in 83 patients; no reversible ischaemia was found in 80, and findings were equivocal in the remaining 18. Of the 103 patients treated with clipping, 49 had radiological evidence of DCI, compared with 31 of 69 patients treated with endovascular embolization; the remaining 9 patients had unsecured aneurysms or non-aneurysmal SAH. Of the patients with radiological evidence of DCI, 65 had a treatment change following the CTP directed at improving cerebral perfusion. In contrast, treatment was not changed for 61 patients without radiological evidence of DCI. Conclusion: CTP is a useful adjunct to clinical assessment in the diagnosis of DCI and is helpful in identifying patients who may benefit from intensive therapy and those in whom it is unlikely to be effective.

Keywords: SAH, vasospasm, aneurysm, delayed cerebral ischemia

Procedia PDF Downloads 68
25088 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the new technologies to have emerged in recent years is the passive optical network (PON). This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases, due to the increase in bit error rate.

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 416
25087 The Therapeutic Effects of Acupuncture on Oral Dryness and Antibody Modification in Sjogren Syndrome: A Meta-Analysis

Authors: Tzu-Hao Li, Yen-Ying Kung, Chang-Youh Tsai

Abstract:

Oral dryness is a common chief complaint among patients with Sjögren syndrome (SS), a disorder characterized by autoantibody production; however, to the authors' best knowledge, there has been no satisfactory pharmacological treatment to relieve the associated symptoms. Hence, the effectiveness of non-pharmacological interventions such as acupuncture should be assessed. We conducted a meta-analysis of randomized clinical trials (RCTs) that evaluated the effectiveness of acupuncture for xerostomia in SS. PubMed, Embase, the Cochrane Central Register of Controlled Trials (CENTRAL), the Chongqing Weipu Database (CQVIP), the China Academic Journals Full-text Database, AiritiLibrary, the Chinese Electronic Periodicals Service (CEPS), and the China National Knowledge Infrastructure (CNKI) Database were searched through May 12, 2018, to select studies. Data for the evaluation of subjective and objective xerostomia were extracted and assessed with random-effects meta-analysis. The search yielded 541 references, and five RCTs were included, covering 340 patients with dry mouth resulting from SS, of whom 169 received acupuncture and 171 served as controls. The acupuncture group was associated with a higher subjective response rate (odds ratio 3.036, 95% confidence interval [CI] 1.828 – 5.042, P < 0.001) and, as an objective marker, an increased salivary flow rate (weighted mean difference [WMD] 3.066, 95% CI 2.969 – 3.164, P < 0.001). In addition, two studies examined IgG levels, which were lower in the acupuncture group (WMD -166.857, 95% CI -233.138 – -100.576, P < 0.001). Therefore, in the present meta-analysis, acupuncture improved both subjective and objective markers of dry mouth, with autoantibody reduction, in patients with SS, and it can be considered an option for non-pharmacological treatment of SS.
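The random-effects pooling used in such a meta-analysis is typically the DerSimonian-Laird estimator; the sketch below shows the computation on per-study log odds ratios (an illustration of the standard method, not necessarily the authors' exact software):

```python
import numpy as np

def dersimonian_laird(log_or, var):
    """Random-effects pooling of study log odds ratios (DerSimonian-Laird).
    Inputs: per-study log(OR) and within-study variances."""
    log_or, var = np.asarray(log_or, float), np.asarray(var, float)
    w = 1.0 / var
    fixed = (w * log_or).sum() / w.sum()           # fixed-effect estimate
    Q = (w * (log_or - fixed) ** 2).sum()          # heterogeneity statistic
    tau2 = max(0.0, (Q - (len(var) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    w_re = 1.0 / (var + tau2)                      # add between-study variance
    pooled = (w_re * log_or).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# or_est, ci_lo, ci_hi = dersimonian_laird([...], [...])  # OR with 95% CI
```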

Keywords: acupuncture, meta-analysis, Sjogren syndrome, xerostomia

Procedia PDF Downloads 125
25086 The Effect of Artificial Intelligence on Banking Development and Progress

Authors: Mina Malak Hanna Saad

Abstract:

New strategies for supplying banking services to customers have been introduced, including online banking. Banks have begun to regard electronic banking (e-banking) as a way to replace some conventional branch functions, using the internet as a new distribution channel. Some customers have at least one account at multiple banks and access those accounts through online banking. To check their current net worth, such customers need to log into each of their accounts, get detailed information, and work towards consolidation. Not only is this time-consuming, it is also a repeated activity with a certain frequency. To solve this problem, the concept of account aggregation was introduced. Account aggregation in e-banking, as a form of digital banking, appears to build a stronger relationship with customers. An account aggregation service generally refers to a service that permits customers to manage their bank accounts held at different institutions through a common online banking platform, with a high priority on security and data protection. This article provides an overview of the account aggregation approach as a distinct service in the field of e-banking.

Keywords: compatibility, complexity, mobile banking, observation, risk, banking technology, internet banks, modernization of banks, account aggregation, security, enterprise development, e-banking

Procedia PDF Downloads 36
25085 L2 Reading in Distance Education: Analysis of Students' Reading Attitude and Interests

Authors: Ma. Junithesmer, D. Rosales

Abstract:

The study is a baseline description of students' attitudes and interests regarding L2 reading at a state university in the Philippines that uses distance education as a delivery mode. Most research in this area has dealt with the analysis of reading in a traditional school set-up. For this reason, this research was undertaken to discover what students' preferences, interests, and attitudes reveal about L2 reading in a non-traditional set-up. The corpus of this study included the literature and studies on reading; preferred technological devices; titles of books and authors; reading medium (traditional/print and electronic books) in relation to students' interests and feelings when reading at home and in school; and students' views of their own strengths and weaknesses as readers.

Keywords: distance education, L2 reading, reading, reading attitude

Procedia PDF Downloads 345
25084 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
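A fast-and-frugal heuristic of the kind mentioned is a short sequence of one-attribute checks with an exit at the first decisive cue; the sketch below illustrates the idea for the edge-versus-cloud placement decision, with cues and thresholds that are ours, not the paper's:

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    size_mb: float
    latency_critical: bool
    needs_history: bool      # requires archived data held in the cloud

def place(d: Dataset) -> str:
    """Fast-and-frugal tree: check one attribute at a time and exit on
    the first decisive cue, keeping the decision step itself cheap."""
    if d.latency_critical:
        return "edge"                 # control/visualisation loops stay local
    if d.needs_history:
        return "cloud"                # aggregation against stored data
    return "edge" if d.size_mb > 50 else "cloud"   # avoid heavy uploads

print(place(Dataset(size_mb=200, latency_critical=False, needs_history=False)))
```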

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 182