Search results for: push data
24986 Developing Early Intervention Tools: Predicting Academic Dishonesty in University Students Using Psychological Traits and Machine Learning
Authors: Pinzhe Zhao
Abstract:
This study focuses on predicting university students' cheating tendencies using psychological traits and machine learning techniques. Academic dishonesty is a significant issue that compromises the integrity and fairness of educational institutions. While much research has been dedicated to detecting cheating behaviors after they have occurred, there is limited work on predicting such tendencies before they manifest. The aim of this research is to develop a model that can identify students who are at higher risk of engaging in academic misconduct, allowing for earlier interventions to prevent such behavior. Psychological factors are known to influence students' likelihood of cheating. Research shows that traits such as test anxiety, moral reasoning, self-efficacy, and achievement motivation are strongly linked to academic dishonesty. High levels of anxiety may lead students to cheat as a way to cope with pressure. Those with lower self-efficacy are less confident in their academic abilities, which can push them toward dishonest behaviors to secure better outcomes. Students with weaker moral judgment may also justify cheating more easily, believing it to be less wrong under certain conditions. Achievement motivation also plays a role, as students driven primarily by external rewards, such as grades, are more likely to cheat than those motivated by intrinsic learning goals. In this study, data on students' psychological traits are collected through validated assessments, including scales for anxiety, moral reasoning, self-efficacy, and motivation. Additional data on academic performance, attendance, and engagement in class are also gathered to create a more comprehensive profile. Using machine learning algorithms such as Random Forest, Support Vector Machines (SVM), and Long Short-Term Memory (LSTM) networks, the research builds models that can predict students' cheating tendencies. These models are trained and evaluated using metrics such as accuracy, precision, recall, and F1 scores to ensure they provide reliable predictions. The findings demonstrate that combining psychological traits with machine learning provides a powerful method for identifying students at risk of cheating. This approach allows for early detection and intervention, enabling educational institutions to take proactive steps in promoting academic integrity. The predictive model can be used to inform targeted interventions, such as counseling for students with high test anxiety or workshops aimed at strengthening moral reasoning. By addressing the underlying factors that contribute to cheating behavior, educational institutions can reduce the occurrence of academic dishonesty and foster a culture of integrity. In conclusion, this research contributes to the growing body of literature on predictive analytics in education. It offers an approach that integrates psychological assessments with machine learning to predict cheating tendencies. This method has the potential to significantly improve how academic institutions address academic dishonesty, shifting the focus from punishment after the fact to prevention before it occurs. By identifying high-risk students and providing them with the necessary support, educators can help maintain the fairness and integrity of the academic environment.
Keywords: academic dishonesty, cheating prediction, intervention strategies, machine learning, psychological traits, academic integrity
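As a rough illustration of the modeling and evaluation step described in this abstract, the sketch below trains a Random Forest on synthetic psychological-trait scores and reports the accuracy, precision, recall, and F1 metrics named above; the features, labels, and data are hypothetical placeholders, not the study's actual instruments or results.

```python
# Minimal sketch: predict cheating tendency from trait scores with a Random
# Forest and report the metrics named in the abstract. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

rng = np.random.default_rng(0)
n = 500
# Columns: test anxiety, moral reasoning, self-efficacy, achievement motivation
X = rng.normal(size=(n, 4))
# Synthetic label: higher anxiety / lower self-efficacy raises risk
y = (X[:, 0] - X[:, 2] + rng.normal(scale=0.5, size=n) > 0.8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
prec, rec, f1, _ = precision_recall_fscore_support(y_te, pred, average="binary")
print(f"accuracy={accuracy_score(y_te, pred):.2f} precision={prec:.2f} "
      f"recall={rec:.2f} f1={f1:.2f}")
```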
Procedia PDF Downloads 202
24985 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory
Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan
Abstract:
Data fusion technology can be the best way to extract useful information from multiple sources of data. It has been widely applied in various applications. This paper presents a data fusion approach on multimedia data for event detection in Twitter by using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. There are two types of data in the fusion. The first is features extracted from text by using the bag-of-words method, calculated using the term frequency-inverse document frequency (TF-IDF). The second is the visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments have indicated that, compared to approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy for event detection. The experimental results showed that the proposed method achieved a high accuracy of 0.97, compared with 0.93 for text only and 0.86 for images only.
Keywords: data fusion, Dempster-Shafer theory, data mining, event detection
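For readers unfamiliar with the fusion step, here is a minimal sketch of Dempster's rule of combination for two independent sources (a text-based and an image-based classifier) over a simple frame {event, no_event}; the mass values are illustrative, not the paper's measured outputs.

```python
# Dempster's rule: m(C) = sum over A∩B=C of m1(A)*m2(B), renormalized by 1-K,
# where K is the total mass assigned to conflicting (disjoint) pairs.
def dempster_combine(m1, m2):
    """Combine two mass functions given as dicts {frozenset: mass}."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

E, N = frozenset({"event"}), frozenset({"no_event"})
THETA = E | N  # total ignorance
m_text = {E: 0.7, N: 0.1, THETA: 0.2}   # belief from TF-IDF text features
m_image = {E: 0.6, N: 0.2, THETA: 0.2}  # belief from SIFT visual features
print(dempster_combine(m_text, m_image))  # fused belief favors "event"
```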
Procedia PDF Downloads 410
24984 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016
Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi
Abstract:
This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, resulting in sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is attached to the scope of ‘consent’ as a legal basis in the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.
Keywords: big health data, data subject rights, GDPR, pandemic
Procedia PDF Downloads 129
24983 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems
Authors: Yong-Kyu Jung
Abstract:
The fast growth in information technology has led to demands to access and process data. CPSs heavily depend on the timing of hardware/software operations and communication over the network (i.e., real-time/parallel operations in CPSs, e.g., autonomous vehicles). Efficient data processing is an important means of overcoming the issues confronting data management by reducing the gap between technological growth on one side and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart-connected devices, adaptively rescales digital contents (avg. 62.8%), data processing/access time and energy, and encryption/decryption overheads in AI/ML applications (facial ID/recognition).
Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity
Procedia PDF Downloads 79
24982 Determination of the Pull-Out/Holding Strength at the Taper-Trunnion Junction of Hip Implants
Authors: Obinna K. Ihesiulor, Krishna Shankar, Paul Smith, Alan Fien
Abstract:
Excessive fretting wear at the taper-trunnion junction (trunnionosis) apparently contributes to the high failure rates of hip implants. Implant wear and corrosion lead to the release of metal particulate debris and the subsequent release of metal ions at the taper-trunnion surface. This results in a type of metal poisoning referred to as metallosis. The consequences of metal poisoning include osteolysis (bone loss), osteoarthritis (pain), aseptic loosening of the prosthesis, and revision surgery. At follow-up after revision surgery, metal debris particles are commonly found in numerous locations. Background: A stable connection between the femoral ball head (taper) and stem (trunnion) is necessary to prevent relative motion and corrosion at the taper junction. Hence, the importance of component assembly cannot be over-emphasized. Therefore, the aim of this study is to determine the influence of head-stem junction assembly by press fitting, and of the subsequent disengagement/disassembly, on the connection strength between the taper ball head and stem. Methods: CoCr femoral heads were assembled with high stainless hydrogen steel stems (trunnions) by push-in, i.e., press fit, and disengaged by a pull-out test. The strength and stability of the two connections were evaluated by measuring the head pull-out forces according to the ISO 7206-10 standard. Findings: The head-stem junction strength increases linearly with assembly forces.
Keywords: wear, modular hip prosthesis, taper head-stem, force assembly and disassembly
Procedia PDF Downloads 400
24981 High Performance Wood Shear Walls and Dissipative Anchors for Damage Limitation
Authors: Vera Wilden, Benno Hoffmeister, Georgios Balaskas, Lukas Rauber, Burkhard Walter
Abstract:
Light-weight timber frame elements represent an efficient structural solution for wooden multi-story buildings. The wall elements of such buildings, which act as shear diaphragms, provide lateral stiffness and resistance to wind and seismic loads. The tendency towards multi-story structures leads to challenges regarding the prediction of stiffness, strength, and ductility of the buildings. Light-weight timber frame elements are built up of several structural parts (sheeting, fasteners, frame, support, and anchorages), each of them contributing to the dynamic response of the structure. This contribution describes the experimental and numerical investigation and development of enhanced light-weight timber frame buildings. These developments comprise high-performance timber frame walls with variable arrangements of sheathing planes, and dissipative anchors at the base of the timber buildings, which reduce damage to the timber structure and can be exchanged after significant earthquakes. In order to prove the performance of the developed elements in the context of a real building, a full-scale two-story building core was designed, erected in the laboratory, and tested experimentally for its seismic performance. The results of the tests and a comparison of the test results to the predicted behavior are presented. Observations during the tests also reveal some aspects of the design and details which need to be considered in the application of the timber walls in the context of the complete building.
Keywords: dissipative anchoring, full scale test, push-over-test, wood shear walls
Procedia PDF Downloads 247
24980 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data
Authors: Sašo Pečnik, Borut Žalik
Abstract:
This paper presents a technique for real-time visualization and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, is entirely executed on the GPU, and is implemented using programmable shaders.
Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization
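A minimal CPU-side sketch of the distance-based LOD idea follows: keep a sampling fraction that falls off with distance from the observer, so nearby geometry stays dense while distant geometry is thinned. The GPU shader pipeline and classification layering from the paper are not reproduced here, and the decay function is an assumption.

```python
# Distance-based LOD subsampling: keep probability 1 inside `full_detail`
# metres of the observer, decaying as (full_detail / d) beyond it.
import numpy as np

def lod_subsample(points, observer, full_detail=50.0, seed=0):
    """points: (N, 3) array of LiDAR coordinates; returns the kept subset."""
    d = np.linalg.norm(points - observer, axis=1)
    keep_prob = np.clip(full_detail / np.maximum(d, 1e-9), 0.0, 1.0)
    rng = np.random.default_rng(seed)
    return points[rng.random(len(points)) < keep_prob]

pts = np.random.default_rng(1).uniform(-500, 500, size=(100_000, 3))
print(len(lod_subsample(pts, observer=np.zeros(3))))  # far fewer than 100k
```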
Procedia PDF Downloads 308
24979 Indian Emigration to Gulf Countries: Opportunities and Challenges
Authors: Sudhaveni Naresh
Abstract:
International migration is an important subject that has gained more significance and interest among scholars in recent years. It is defined as the crossing of the boundaries of political or administrative units for a certain minimum period for reasons such as education, employment, etc. International migration is not new for India, which has a long history with the Gulf region dating from the ancient period. India is also one of the largest migrant-sending countries in the world, after China. Migration towards the Gulf region became more prominent during the early 1970s due to the oil boom, which led to a rapid increase in the demand for foreign labour. Of the 25 million Indian emigrants living across the world, about six million are working in the Gulf. Most of these migrants were either unskilled or semi-skilled. Both pull and push factors lie behind labour emigration to the Gulf countries. India is the world's leading receiver of remittances, and the flow of remittances to India has been increasing steadily since the 1970s. In 2011-12, it was about 4 percent of GDP. Emigrants play a significant role in the economic development and growth of the country via remittances and knowledge and skill transfer. Scholars see remittances as a vital tool in development for the origin country. This paper examines the recent trend and pattern of migration from India to the Gulf countries and explores the impact of remittances on emigrants' families in the home country. It also highlights opportunities, challenges, and the need for strengthening multilateral cooperation to transform migration into an efficient, orderly, and humane process. The study proposes to undertake a primary survey for this purpose. Both quantitative and qualitative research methods will be used to study the above issues.
Keywords: development, international migration, remittances, unskilled labour
Procedia PDF Downloads 291
24978 Estimating Destinations of Bus Passengers Using Smart Card Data
Authors: Hasik Lee, Seung-Young Kho
Abstract:
Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, smart card data from many cities do not contain the alighting information that is necessary to build OD matrices. Therefore, in order to utilize smart card data, the destinations of passengers should be estimated. In this paper, kernel density estimation was used to forecast the probabilities of the alighting stations of bus passengers and applied to smart card data in Seoul, Korea, which contain both boarding and alighting information. The method was also validated with actual data. In some cases, the stochastic method was more accurate than the deterministic method. Therefore, it is sufficiently accurate to be used to build OD matrices.
Keywords: destination estimation, kernel density estimation, smart card data, validation
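A minimal sketch of the kernel-density step might look as follows: fit a density over observed alighting positions, then score candidate stops for a passenger. The stop coordinates (distance along the route, in km) are hypothetical, not the Seoul data.

```python
# Fit a 1D Gaussian KDE over observed alighting positions and convert the
# densities at candidate stops into a discrete probability over those stops.
import numpy as np
from scipy.stats import gaussian_kde

observed_alightings = np.array([2.1, 2.3, 2.2, 5.8, 6.0, 5.9, 6.1, 2.0])
kde = gaussian_kde(observed_alightings)

candidate_stops = np.array([1.0, 2.2, 4.0, 6.0, 8.0])
density = kde(candidate_stops)
prob = density / density.sum()  # normalize over the candidate stops
for stop, p in zip(candidate_stops, prob):
    print(f"stop at {stop:.1f} km: P(alight) = {p:.2f}")
```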
Procedia PDF Downloads 352
24977 Evaluated Nuclear Data Based Photon Induced Nuclear Reaction Model of GEANT4
Authors: Jae Won Shin
Abstract:
We develop an evaluated nuclear data based photonuclear reaction model of GEANT4 for a more accurate simulation of photon-induced neutron production. The evaluated photonuclear data libraries from ENDF/B-VII.1 are taken as input. Incident photon energies up to 140 MeV, which is the threshold energy for pion production, are considered. To check the validity of the data-based model, we calculate the photoneutron production cross-sections and yields and compare them with experimental data. The results obtained from the developed model are found to be in good agreement with the experimental data for (γ,xn) reactions.
Keywords: ENDF/B-VII.1, GEANT4, photoneutron, photonuclear reaction
Procedia PDF Downloads 275
24976 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams
Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem
Abstract:
In this 'Information Explosion Era', analyzing data ('a critical commodity') and mining knowledge from vertically distributed data streams incur a huge communication cost. However, efforts to decrease communication in the distributed environment have an adverse influence on classification accuracy; therefore, a research challenge lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference to reduce the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.
Keywords: big data, Bayesian inference, distributed data stream mining, heterogeneous-distributed data
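One plausible (assumed) reading of the Bayesian mechanism is sketched below: each node maintains a Beta posterior over its local class rate and transmits an update to the coordinator only when the posterior mean has drifted beyond a threshold since the last transmission. The paper's actual inference scheme is not specified here; this is purely illustrative.

```python
# A node that "sends" an update only when its Beta-posterior mean moves more
# than `threshold` away from the last value it transmitted.
import random

class ThriftyNode:
    def __init__(self, threshold=0.05):
        self.alpha, self.beta = 1.0, 1.0  # Beta(1,1) prior over the class rate
        self.last_sent = 0.5
        self.threshold = threshold
        self.messages = 0

    def observe(self, label):
        self.alpha += label
        self.beta += 1 - label
        mean = self.alpha / (self.alpha + self.beta)
        if abs(mean - self.last_sent) > self.threshold:
            self.last_sent = mean  # transmit the updated estimate
            self.messages += 1

random.seed(0)
node = ThriftyNode()
for _ in range(1000):
    node.observe(1 if random.random() < 0.3 else 0)
print(f"sent {node.messages} messages instead of 1000")
```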
Procedia PDF Downloads 161
24975 Challenges beyond the Singapore Future-Ready School ‘LEADER’ Qualities
Authors: Zoe Boon Suan Loy
Abstract:
An exploratory research study undertaken in 2020, at the beginning of the COVID-19 pandemic, examined the changing roles of Singapore school leaders as they lead teachers in developing future-ready learners. While it is evident that ‘LEADER’ qualities epitomize the knowledge, competencies, and skills required, recent events in an increasingly VUCA and BANI world, characterized by the massively disruptive Russia-Ukraine war, unabatingly tense US-Sino relations, issues related to sustainability, and rapid ageing, will have an impact on school leadership. As an increasingly complex endeavour, this requires a relook as leaders guide teachers in nurturing holistically developed future-ready students. Digitalisation, new technology, and the push for a green economy will be the key driving forces that impact job availability. Similarly, the rapid growth of artificial intelligence (AI) capabilities, including ChatGPT, will aggravate and add tremendous stress to the work of school leaders. This paper seeks to explore the key school leadership shifts required beyond the ‘LEADER’ qualities as school leaders respond to the changes, challenges, and opportunities of the 21st-century new normal. The research findings for this paper are based on an exploratory qualitative study of the perceptions of 26 school leaders (vice-principals) who were attending a milestone educational leadership course at the National Institute of Education, Nanyang Technological University, Singapore. A structured questionnaire was designed to collect the data, which were then analysed using coding methodology. Broad themes on the key competencies and skills of future-ready leaders in the Singapore education system were then identified. Key Findings: In undertaking their leadership roles as leaders of future-ready learners, school leaders need to demonstrate the ‘LEADER’ qualities. They need to have a long-term view, understand the educational imperatives, have a good awareness of self and of the dispositions of a leader, be effective in optimizing external leverage, and be clear about their role expectations. These ‘LEADER’ qualities are necessary and relevant in the post-COVID era. Beyond this, school leaders with ‘LEADER’ qualities are well supported by the Ministry of Education, which takes cognizance of emerging trends and continually reviews education policies to address related issues. Concluding Statement: Discussions within the education ecosystem and among other stakeholders on the implications of the use of artificial intelligence and ChatGPT on the school curriculum, including content knowledge, pedagogy, and assessment, are ongoing. This augurs well for school leaders as they undertake their responsibilities as leaders of future-ready learners.
Keywords: Singapore education system, ‘LEADER’ qualities, school leadership, future-ready leaders, future-ready learners
Procedia PDF Downloads 72
24974 Data Privacy: Stakeholders’ Conflicts in Medical Internet of Things
Authors: Benny Sand, Yotam Lurie, Shlomo Mark
Abstract:
Medical Internet of Things (MIoT), AI, and data privacy are linked forever in a Gordian knot. This paper explores the conflicts of interest between stakeholders regarding data privacy in the MIoT arena. When patients are at home during home-based healthcare, MIoT can play a significant role in improving the health of large parts of the population by providing medical teams with tools for collecting data, monitoring patients’ health parameters, and even enabling remote treatment. While the amount of data handled by MIoT devices grows exponentially, different stakeholders have conflicting understandings of and concerns regarding this data. The findings of the research indicate that medical teams are not concerned by violations of patients’ data privacy rights in in-home healthcare, while patients are more troubled and, in many cases, are unaware that their data is being used without their consent. MIoT technology is in its early phases, and hence a mixed qualitative and quantitative research approach will be used, including case studies and questionnaires, in order to explore this issue and provide alternative solutions.
Keywords: MIoT, data privacy, stakeholders, home healthcare, information privacy, AI
Procedia PDF Downloads 102
24973 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations
Authors: Deepak Singh, Rail Kuliev
Abstract:
The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, 'reskilling and upskilling' employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.
Keywords: master data management, IoT, AI&ML, cloud computing, data optimization
Procedia PDF Downloads 70
24972 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated by simulation; noise is further added to the data distributions to produce differently disordered time series and to evaluate the algorithm's prediction of local nonlinearity. The performance of the algorithm is then simulated, and its sensitivity to the data distribution, when the spread of the data is high or the number of data points is small, is explained, along with the influence of the algorithm's important local-validity parameter under different data distributions.
Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method
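A heavily simplified, batch sketch of the local-model idea behind LWPR follows: Gaussian receptive fields weight local linear fits, and predictions blend the local models by their activations. The real LWPR algorithm is incremental and uses partial least squares; this stripped-down version only illustrates locality and the bandwidth ('local validity') parameter D, which are assumptions of the sketch.

```python
# Locally weighted linear regression with fixed Gaussian receptive fields.
import numpy as np

def fit_local_models(x, y, centers, D=5.0):
    models = []
    for c in centers:
        w = np.exp(-0.5 * D * (x - c) ** 2)        # Gaussian receptive field
        X = np.column_stack([np.ones_like(x), x - c])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted least squares
        models.append((c, beta))
    return models

def predict(models, xq, D=5.0):
    acts = np.array([np.exp(-0.5 * D * (xq - c) ** 2) for c, _ in models])
    preds = np.array([b[0] + b[1] * (xq - c) for c, b in models])
    return (acts * preds).sum() / acts.sum()      # activation-weighted blend

x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + np.random.default_rng(0).normal(scale=0.1, size=x.size)
models = fit_local_models(x, y, centers=np.linspace(0, 2 * np.pi, 12))
print(predict(models, 1.5), np.sin(1.5))          # prediction vs. truth
```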
Procedia PDF Downloads 502
24971 Big Data Strategy for Telco: Network Transformation
Abstract:
Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents a new method to reverse this trend and improve profitability. The benefits of big data and next-generation networks, however, go beyond improved customer relationship management. Next-generation networks are in a prime position to monetize rich supplies of customer information while being mindful of legal and privacy issues. Transforming data assets into new revenue streams will become integral to high performance.
Keywords: big data, next generation networks, network transformation, strategy
Procedia PDF Downloads 360
24970 REDUCER: An Architectural Design Pattern for Reducing Large and Noisy Data Sets
Authors: Apkar Salatian
Abstract:
To relieve the burden of reasoning on a point-to-point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored for particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies, or noise; and Compression, which takes the filtered data and derives trends in it. In this seminal article, we also show how REDUCER has been successfully applied to three different case studies.
Keywords: design pattern, filtering, compression, architectural design
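A minimal sketch of the two-stage pattern might look as follows, with a median filter standing in for Filter and a trend-segmentation pass standing in for Compression; the concrete algorithms used in the paper's case studies may differ.

```python
# Filter: sliding-window median removes point outliers.
# Compression: label consecutive differences as increasing/steady/decreasing
# and merge runs with the same label into trend segments.
import statistics

def filter_stage(data, window=5):
    half = window // 2
    return [statistics.median(data[max(0, i - half):i + half + 1])
            for i in range(len(data))]

def compression_stage(data, eps=0.1):
    trends, start = [], 0
    for i in range(1, len(data)):
        d = data[i] - data[i - 1]
        label = "increasing" if d > eps else "decreasing" if d < -eps else "steady"
        if i == 1:
            current = label
        elif label != current:
            trends.append((start, i - 1, current))
            start, current = i - 1, label
    trends.append((start, len(data) - 1, current))
    return trends

raw = [1.0, 1.1, 9.0, 1.3, 1.4, 1.4, 1.4, 1.1, 0.8, 0.5]  # 9.0 is noise
print(compression_stage(filter_stage(raw)))  # e.g. steady then decreasing
```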
Procedia PDF Downloads 212
24969 Fuzzy Expert Systems Applied to Intelligent Design of Data Centers
Authors: Mario M. Figueroa de la Cruz, Claudia I. Solorzano, Raul Acosta, Ignacio Funes
Abstract:
This technological development project seeks to create a tool that allows companies in need of implementing a data center to intelligently determine factors for allocating cooling and power supply (UPS) resources at the design stage. The results should clearly show the speed, robustness, and reliability of a system designed for deployment in environments where large volumes of data must be managed and protected.
Keywords: telecommunications, data center, fuzzy logic, expert systems
Procedia PDF Downloads 345
24968 Exploring the Process of Cultivating Tolerance: The Case of a Pakistani University
Authors: Uzma Rashid, Mommnah Asad
Abstract:
As more and more people fall victim to the intolerance that has become a global plague, academicians are faced with the herculean task of sowing the roots for more tolerant individuals. Being the multilayered task that it is, promoting an acceptance of diversity and pushing back against hate requires efforts on multiple levels. Not only does the curriculum need to be in line with such goals, but teachers also need to be trained to cater to the sensitivities surrounding conversations about tolerance and diversity. In addition, institutional support needs to be in place to provide conducive conditions for a diversity-driven learning process to take place. This paper is based on an auto-ethnographic account of teaching undergraduate and graduate courses at a private university in Pakistan. These courses were aimed at teaching tolerance to adult learners through classes focused on key notions pertaining to religion, culture, gender, and society. The authors' classroom experiences with the students in these courses indicate a marked heightening of religious sensitivities that can potentially threaten a teacher's life chances and become a hindrance to deep, meaningful conversations, thus lending a superficiality to the whole endeavor. The paper will discuss in detail the challenges that the teacher dealt with in the process, how those were addressed, and locate them in the larger picture of how tolerance can be materialized in universities in Pakistan and in similar contexts elsewhere in current times.
Keywords: tolerance, diversity, gender, Pakistani universities
Procedia PDF Downloads 157
24967 Genetic Testing and Research in South Africa: The Sharing of Data Across Borders
Authors: Amy Gooden, Meshandren Naidoo
Abstract:
Genetic research is not confined to a particular jurisdiction. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA laws cover the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from where it is sent, making the legal position less clear.
Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa
Procedia PDF Downloads 161
24966 The Effect of Main Factors on Forces during FSJ Processing of AA2024 Aluminum
Authors: Dunwen Zuo, Yongfang Deng, Bo Song
Abstract:
An attempt is made here to measure the forces in three directions, under conditions of different feed speeds, different tool tilt angles, and with or without a pin on the tool, by using an octagonal ring dynamometer in the AA2024 aluminum FSJ (friction stir joining) process, and to investigate how four main factors influence forces in the FSJ process. It is found that high feed speed leads to a small feed force and a small lateral force, but to a large feed force in the stable joining stage of the process. As the rotational speed increases, the time required for the axial force to drop from the maximum to the minimum increases in the push-up process. In the stable joining stage, the rotational speed has little effect on the feed force; a large rotational speed leads to small lateral and axial forces. The maximum axial force increases as the tilt angle of the tool increases in the downward movement stage. At the moment feeding starts, the amplitude of the axial force increase becomes larger as the tilt angle of the tool increases. In the stable joining stage, with an increase in the tilt angle of the tool, the axial force increases, the lateral force decreases, and the feed force remains almost unchanged. A tool with a pin decreases the axial force in the downward movement stage. Compared with a tool without a pin, the feed force and lateral force increase, but the axial force is reduced in the stable joining stage.
Keywords: FSJ, force factor, AA2024 aluminum, friction stir joining
Procedia PDF Downloads 491
24965 Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems
Authors: Baris Can Yalcin
Abstract:
Motion sensors have been commonly used as valuable components in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. Designing software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU-6050 motion sensor (gyro and accelerometer in 3 axes) and an Arduino Mega2560 microcontroller. Design parameters are calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
Keywords: design, mechatronics, motion sensor, data acquisition
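A minimal host-side sketch of reading such a setup is shown below, assuming the Arduino sketch prints one comma-separated sample per line ("ax,ay,az,gx,gy,gz") over serial; the port name, baud rate, and line format are assumptions, not specifications from the paper.

```python
# Read and parse the MPU-6050 stream forwarded by the Arduino Mega2560.
import serial  # pyserial

PORT, BAUD = "/dev/ttyUSB0", 115200  # assumed port and baud rate

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    for _ in range(100):  # read 100 samples
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            ax, ay, az, gx, gy, gz = map(float, line.split(","))
        except ValueError:
            continue  # skip partial or garbled lines
        print(f"accel=({ax:.2f},{ay:.2f},{az:.2f}) "
              f"gyro=({gx:.2f},{gy:.2f},{gz:.2f})")
```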
Procedia PDF Downloads 588
24964 Speed Characteristics of Mixed Traffic Flow on Urban Arterials
Authors: Ashish Dhamaniya, Satish Chandra
Abstract:
Speed and traffic volume data are collected on different sections of four-lane and six-lane roads in three metropolitan cities in India. The speed data are analyzed to fit statistical distributions to individual-vehicle speed data and to all-vehicle speed data. It is noted that the speed data of individual vehicles generally follow a normal distribution, but the speed data of all vehicles combined at a section of urban road may or may not follow a normal distribution, depending upon the composition of the traffic stream. A new term, speed spread ratio (SSR), is introduced in this paper, which is the ratio of the difference between the 85th and 50th percentile speeds to the difference between the 50th and 15th percentile speeds. If the SSR is unity, then the speed data are truly normally distributed. It is noted that on six-lane urban roads, speed data follow a normal distribution only when the SSR is in the range of 0.86–1.11. The range of SSR is validated on four-lane roads also.
Keywords: normal distribution, percentile speed, speed spread ratio, traffic volume
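The SSR definition translates directly into a few lines of code; the sample speeds below are illustrative, not the study's field data.

```python
# Speed spread ratio: SSR = (v85 - v50) / (v50 - v15),
# computed from spot-speed observations (km/h).
import numpy as np

speeds = np.array([38, 42, 45, 47, 50, 50, 52, 54, 55, 58, 60, 63, 66, 70])
v15, v50, v85 = np.percentile(speeds, [15, 50, 85])
ssr = (v85 - v50) / (v50 - v15)
print(f"SSR = {ssr:.2f}")  # a value near 1 suggests approximately normal data
```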
Procedia PDF Downloads 422
24963 An Exploratory Analysis of Brisbane's Commuter Travel Patterns Using Smart Card Data
Authors: Ming Wei
Abstract:
Over the past two decades, location-based service (LBS) data have been increasingly applied to urban and transportation studies due to their comprehensiveness and consistency. However, compared to other LBS data, including mobile phone data, GPS, and social networking platforms, smart card data collected from public transport users have arguably yet to be fully exploited in urban systems analysis. Using five weekdays of passenger travel transaction data taken from the go card (Southeast Queensland's transit smart card), this paper analyses the spatiotemporal distribution of passenger movement with regard to land use patterns in Brisbane. Work and residential places for public transport commuters were identified after extracting journey-to-work patterns. Our results show that the locations of the workplaces identified from the go card data and the residential suburbs are largely consistent with those marked on the land use map. However, the intensity of some residential locations, in terms of population or commuter densities, does not match well between the map and the values derived from the go card data. This indicates a misalignment between residential areas and workplaces to a certain extent, shedding light on how enhancements to service management and infrastructure expansion might be undertaken.
Keywords: big data, smart card data, travel pattern, land use
Procedia PDF Downloads 285
24962 Expounding the Evolution of the Proto-Femme Fatale and Its Correlation with the New Woman: A Close Study of David Mamet's Oleanna
Authors: Silvia Elias
Abstract:
The 'femme fatale' figure has become synonymous with a mysterious and seductive woman whose charms captivate her lovers into bonds of irresistible desire, often leading them to compromise or downfall. Originally, a femme fatale typically used her beauty to lead men to their destruction, but in modern literature, she represents a direct attack on traditional womanhood and the nuclear family, as she refuses to abide by the pillars of mainstream society, creating an image of a strong, independent woman who defies the control of men and rejects the institution of the family. This research aims to discuss the differences and similarities between the femme fatale and the New Woman and how they are perceived by the audience. There is often confusion between the characteristics that define a New Woman and a femme fatale, since both women desire independence, challenge typical gender-role casting, push against the limits of patriarchal society, and take control of their sexuality. The study of the femme fatale remains appealing in modern times because the fear of gender equality gives life to modern femme fatale versions, and post-modern literary works introduce their readers to new versions of the deadly seductress: one that does not fully depend on her looks to destroy men. The idea behind this paper was born from reading David Mamet's two-character play Oleanna (1992) and tracing the main female protagonist/antagonist's transformation from a helpless, inarticulate girl into a powerful, controlling negotiator who knows how to lead a bargain and maintain the upper hand.
Keywords: Circe, David, Eve, evolution, feminist, femme fatale, gender, Mamet, new, Odysseus, Oleanna, power, Salome, schema, seduction, temptress, woman
Procedia PDF Downloads 456
24961 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process
Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek
Abstract:
As big data analysis becomes important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die-map-based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analysis has limitations, as the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. The study consists of three phases. In the first phase, a die map is created from the fail bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. The third phase is to find the patterns that affect the final test result. Finally, the proposed three steps are applied to actual industrial data, and the experimental results showed the potential for field application.
Keywords: die-map clustering, feature extraction, pattern recognition, semiconductor manufacturing process
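A minimal sketch of the die-map clustering phase might look as follows: each die's binary fail-bit map is flattened into a feature vector, and dies with similar spatial failure patterns are grouped with k-means. The map size, cluster count, and synthetic data are assumptions, not the study's industrial data.

```python
# Cluster dies by their spatial fail-bit patterns.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_dies, grid = 300, (8, 8)  # 8x8 sub-areas per die

# Synthetic maps: an edge-failure pattern vs. a center-failure pattern + noise
maps = rng.random((n_dies, *grid)) < 0.05
maps[:150, :, 0] |= rng.random((150, grid[0])) < 0.6   # left-edge fails
maps[150:, 3:5, 3:5] |= rng.random((150, 2, 2)) < 0.6  # center fails

features = maps.reshape(n_dies, -1).astype(float)      # flatten to vectors
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
# Each synthetic pattern should land mostly in its own cluster:
print(np.bincount(labels[:150]), np.bincount(labels[150:]))
```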
Procedia PDF Downloads 402
24960 Spatial Integrity of Seismic Data for Oil and Gas Exploration
Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof
Abstract:
Seismic data is the fundamental tool utilized by exploration companies to determine potential hydrocarbon. However, the importance of seismic trace data will be undermined unless the geospatial component of the data is understood. Deriving a proposed well to be drilled from data that has positional ambiguity will jeopardize business decisions and millions of dollars of investment that every oil and gas company would like to avoid. A spatial integrity QC workflow has been introduced in PETRONAS to ensure positional errors within the seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data are referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying geometry loading. The direct outcome of the workflow implementation helps improve the reliability and integrity of the subsurface geological models produced by geoscientists and provides important input to potential hazard assessments where positional accuracy is crucial. This workflow's development initiative is part of a bigger geospatial integrity management effort, whereby nearly eighty percent of oil and gas data are location-dependent.
Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow
Procedia PDF Downloads 223
24959 Evaluating Data Maturity in Riyadh's Nonprofit Sector: Insights Using the National Data Maturity Index (NDI)
Authors: Maryam Aloshan, Imam Mohammad Ibn Saud, Ahmad Khudair
Abstract:
This study assesses the data governance maturity of nonprofit organizations in Riyadh, Saudi Arabia, using the National Data Maturity Index (NDI) framework developed by the Saudi Data and Artificial Intelligence Authority (SDAIA). Employing a survey designed around the NDI model, data maturity levels were evaluated across 14 dimensions using a 5-point Likert scale. The results reveal a spectrum of maturity levels among the organizations surveyed: while some medium-sized associations reached the ‘Defined’ stage, others, including large associations, fell within the ‘Absence of Capabilities’ or ‘Building’ phases, with no organizations achieving the advanced ‘Established’ or ‘Pioneering’ levels. This variation suggests an emerging recognition of data governance but underscores the need for targeted interventions to bridge the maturity gap. The findings point to a significant opportunity to elevate data governance capabilities in Saudi nonprofits through customized capacity-building initiatives, including training, mentorship, and best practice sharing. This study contributes valuable insights into the digital transformation journey of the Saudi nonprofit sector, aligning with national goals for data-driven governance and organizational efficiency.
Keywords: nonprofit organizations, national data maturity index (NDI), Saudi Arabia, SDAIA, data governance, data maturity
Procedia PDF Downloads 16
24958 Single-Cell Visualization with Minimum Volume Embedding
Authors: Zhenqiu Liu
Abstract:
Visualizing the heterogeneity within cell populations from single-cell RNA-seq data is crucial for studying the functional diversity of a cell. However, because of the high levels of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low dimension. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, it is computationally inefficient to solve a semi-definite program (SDP) when the sample size is large. Therefore, it is not applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm with an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations more accurately in single-cell data sets than other existing dimension reduction methods.
Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method
Procedia PDF Downloads 228
24957 Cloud Data Security Using Map/Reduce Implementation of Secret Sharing Schemes
Authors: Sara Ibn El Ahrache, Tajje-eddine Rachidi, Hassan Badir, Abderrahmane Sbihi
Abstract:
Recently, there has been increasing confidence in the favorable usage of big data drawn from the huge amount of information deposited in cloud computing systems. Data kept on such systems can be retrieved through the network at the user's convenience. However, the data that users send include private information, and therefore, information leakage from these data is now a major social problem. The usage of secret sharing schemes for cloud computing has lately been shown to be relevant; in these schemes, users deal out their data to several servers. Notably, in a (k,n) threshold scheme, data security is assured if and only if, throughout the whole life of the secret, the opponent cannot compromise k or more of the n servers. In fact, a number of secret sharing algorithms have been suggested to deal with these security issues. In this paper, we present a MapReduce implementation of Shamir's secret sharing scheme to increase its performance and to achieve optimal security for cloud data. Different tests were run, and through them the contributions of the proposed approach have been demonstrated. These contributions are quite considerable in terms of both security and performance.
Keywords: cloud computing, data security, MapReduce, Shamir's secret sharing
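For concreteness, here is a minimal sketch of Shamir's (k,n) threshold scheme over a prime field: split a secret into n shares so that any k of them reconstruct it, while fewer than k reveal nothing. The MapReduce distribution layer described in the paper is omitted, and the prime and parameters are illustrative.

```python
# Shamir secret sharing: a random degree-(k-1) polynomial with the secret as
# its constant term; shares are points on it, and Lagrange interpolation at
# x = 0 recovers the secret.
import random

P = 2**127 - 1  # a Mersenne prime, large enough for the demo secret

def split(secret, k, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 over GF(P)
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(secret=123456789, k=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover the secret
```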
Procedia PDF Downloads 306