Search results for: critical message
4910 Human Factors Simulation Approach to Analyze Older Drivers’ Performance in Intersection Left-Turn Scenarios
Authors: Yassir AbdelRazig, Eren Ozguven, Ren Moses
Abstract:
While there exists a greater understanding of the differences between the driving behaviors of older and younger drivers, there is still a need to further understand how the two groups perform when attempting complex intersection maneuvers. This paper looks to determine whether, and to what extent, these differences exist when drivers encounter permissive left-hand turns, pedestrian traffic, two- and four-lane intersections, heavy fog, and night conditions. The study utilizes a driving simulator to develop custom drivable scenarios containing one or more of the previously mentioned conditions. 32 younger and 32 older (65+ years) participants perform driving simulation scenarios and have their velocity, time to the nearest oncoming vehicle, accepted and rejected gaps, etc., recorded. The data collected from the simulator are analyzed via Raff’s method and logistic regression in order to determine and compare the critical gap values of the two cohorts. Out of the parameters considered for this study, only the age of the driver, their experience (if they are a younger driver), the size of a gap, and the presence of pedestrians on the crosswalk proved significant. The results did not support the hypothesis that older drivers would be significantly more conservative in their critical gap judgment and acceptance.
Keywords: older drivers, simulation, left-turn, human factors
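As a rough illustration of the logistic-regression side of such a gap-acceptance analysis, the sketch below fits accept/reject decisions against gap size and reads the critical gap off the 50% acceptance point. The gap data, feature choice, and values are hypothetical assumptions, not figures from the study.

```python
# Illustrative sketch: estimating a critical gap from accept/reject decisions.
# Assumes scikit-learn is available; the observations below are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical observations: gap size in seconds and whether it was accepted.
gaps = np.array([2.1, 2.8, 3.0, 3.5, 4.0, 4.2, 5.0, 5.5, 6.1, 7.0]).reshape(-1, 1)
accepted = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

model = LogisticRegression().fit(gaps, accepted)

# The critical gap is the gap at which P(accept) = 0.5,
# i.e. where the linear predictor b0 + b1*gap crosses zero.
critical_gap = -model.intercept_[0] / model.coef_[0][0]
print(f"Estimated critical gap: {critical_gap:.2f} s")
```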
Procedia PDF Downloads 248
4909 A Case Study on Barriers in Total Productive Maintenance Implementation in the Abu Dhabi Power Industry
Authors: A. Alseiari, P. Farrell
Abstract:
Maintenance has evolved into an imperative function and contributes significantly to efficient and effective equipment performance. Total Productive Maintenance (TPM) is an ideal approach to support the development and implementation of operational performance improvement. It systematically aims to understand the function of equipment, the service quality relationship with equipment, and the probable critical equipment failure conditions. Implementation of TPM programmes needs strategic planning, and there has been little research applied in this area within Middle East power plants. In the power sector of Abu Dhabi, technologically and strategically, the power industry is extremely important, and it thus needs effective and efficient equipment management support. The aim of this paper is to investigate barriers to successful TPM implementation in the Abu Dhabi power industry. The study has been conducted in the context of a leading power company in the UAE. Semi-structured interviews were conducted with 16 employees, including maintenance and operation staff, and senior managers. The findings of this research identified seven key barriers: managerial; organisational; cultural; financial; educational; communications; and auditing. With respect to the understanding of these barriers and obstacles in TPM implementation, the findings can contribute towards improved equipment operations and maintenance in power organisations.
Keywords: Abu Dhabi Power Industry, TPM implementation, key barriers, organisational culture, critical success factors
Procedia PDF Downloads 245
4908 Lung ICAMs and VCAM-1 in Innate and Adaptive Immunity to Influenza Infections: Implications for Vaccination Strategies
Authors: S. Kozlovski, S.W. Feigelson, R. Alon
Abstract:
The β2 integrin ligands ICAM-1 and ICAM-2 and the endothelial VLA-4 integrin ligand VCAM-1 are constitutively expressed on different lung vessels and on high endothelial venules (HEVs), the main portal for lymphocyte entry from the blood into lung-draining lymph nodes. ICAMs are also ubiquitously expressed by many antigen-presenting leukocytes and have traditionally been suggested as critical for the various antigen-specific immune synapses generated by these distinct leukocytes and specific naïve and effector T cells. Loss of both ICAM-1 and ICAM-2 on the lung vasculature reduces the ability of monocytes and Tregs to patrol the lung vasculature at a steady state. Our new findings suggest, however, that in terms of innate leukocyte trafficking into the lung lamina propria, both constitutively expressed and virus-induced vascular VCAM-1 can functionally compensate for the loss of these ICAMs. In a mouse model of influenza infection, neutrophil and NK cell recruitment and clearance of influenza remained normal in mice deficient in both ICAMs. Strikingly, mice deficient in both ICAMs also mounted normal influenza-specific CD8 proliferation and differentiation. In addition, these mice normally combated secondary influenza infection, indicating that the presence of ICAMs on conventional dendritic cells (cDCs) that present viral antigens is not required for immune synapse formation between these APCs and naïve CD8 T cells, as previously suggested. Furthermore, long-lasting humoral responses critical for protection from a secondary homosubtypic influenza infection were also normal in mice deficient in both ICAM-1 and ICAM-2. Collectively, our results suggest that the expression of ICAM-1 and ICAM-2 on lung endothelial and epithelial cells, as well as on DCs and B cells, is not critical for the generation of innate or adaptive anti-viral immunity in the lungs. Our findings also suggest that endothelial VCAM-1 can substitute for the functions of vascular ICAMs in leukocyte trafficking into various lung compartments.
Keywords: emigration, ICAM-1, lymph nodes, VCAM-1
Procedia PDF Downloads 128
4907 An Innovative Approach to Improve Skills of Students in Qatar University Spending in Virtual Class through LMS
Authors: Mohammad Shahid Jamil
Abstract:
In this study, we investigated students’ learning and satisfaction in one of the courses offered in the Foundation Program at Qatar University. We applied an innovative teaching methodology that emphasizes enhancing students’ thinking, decision-making, and problem-solving skills. Some interesting results were found which can be used to further improve the teaching methodology. To make full use of technology, the Foundation Program at Qatar University has started implementing new ways of teaching the Math course by using Blackboard as an innovative interactive tool to support standard teaching, such as the Discussion board, Virtual class, and the Study plan in My Math Lab “MML”. In MML, the Study Plan is designed in such a way that students can improve their skills wherever they face difficulties in their Homework, Quiz, or Test. The Discussion board and Virtual Class are collaborative learning tools that encourage students to engage outside of class time. These tools are useful for sharing students’ knowledge and learning experiences; they promote independent and active learning and help students improve their critical thinking skills through the learning process.
Keywords: blackboard, discussion board, critical thinking, active learning, independent learning, problem solving
Procedia PDF Downloads 428
4906 Integrating Building Information Modeling into Facilities Management Operations
Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi
Abstract:
Facilities such as residential buildings, office buildings, and hospitals house a large density of occupants. Therefore, a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been used in building projects as a critical task. It has been effective in reducing the operation and maintenance cost of these facilities. Issues of information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM in the FM process during a building’s operational phase. For this purpose, a seven-storey office building is modeled in Autodesk Revit software. The authors integrated the cloud-based environment using a visual programming tool, Dynamo, for the purpose of having real-time cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building’s operational and maintenance costs by managing the building life cycle properly.
Keywords: building information modeling, facility management, operational phase, building life cycle
Procedia PDF Downloads 155
4905 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices
Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner
Abstract:
Biometric tools such as fingerprint and iris are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several worries about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than those extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device. An off-person device is a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach uses a binary classifier to distinguish one person from all others. The second approach uses a multi-class classifier that distinguishes the selected set of individuals from non-selected individuals (others). In the preliminary results, the binary classifier obtained a performance of 90% in terms of accuracy on balanced data. The second approach reported a log loss of 0.05 as a multi-class score.
Keywords: biometrics, electrocardiographic, machine learning, signals processing
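The two classification strategies described above can be sketched as follows; the synthetic features, the choice of random forests, and all parameters are assumptions for illustration only, not the authors' actual pipeline.

```python
# Minimal sketch of the two strategies: a per-person binary classifier
# and a single multi-class ("one-for-all") classifier. Synthetic features
# stand in for ECG-derived feature vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_subjects, n_beats, n_features = 5, 40, 12
X = rng.normal(size=(n_subjects * n_beats, n_features))
y = np.repeat(np.arange(n_subjects), n_beats)          # subject labels

# Strategy 1: one binary classifier per person (this person vs. everyone else).
binary_models = {}
for person in range(n_subjects):
    binary_models[person] = RandomForestClassifier(n_estimators=50, random_state=0)
    binary_models[person].fit(X, (y == person).astype(int))

# Strategy 2: a single multi-class classifier over all enrolled persons.
one_for_all = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

probe = rng.normal(size=(1, n_features))               # an unseen ECG sample
print({p: m.predict_proba(probe)[0, 1] for p, m in binary_models.items()})
print("one-for-all prediction:", one_for_all.predict(probe)[0])
```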
Procedia PDF Downloads 142
4904 Process Safety Evaluation of a Nuclear Power Plant through Virtual Process Hazard Analysis (PHA) using the What-If Technique
Authors: Lormaine Anne Branzuela, Elysa Largo, Julie Marisol Pagalilauan, Neil Concibido, Monet Concepcion Detras
Abstract:
Energy is a necessity both for the people and the country. The demand for energy is continually increasing, but the supply is not keeping pace. The reopening of the Bataan Nuclear Power Plant (BNPP) in the Philippines has been circulating in the media recently. The general public has been hesitant to accept the inclusion of nuclear energy in the Philippine energy mix due to perceived unsafe conditions of the plant. This study evaluated the possible operations of a nuclear power plant of the same type as the BNPP, considering the safety of the workers, the public, and the environment, using a Process Hazard Analysis (PHA) method. The What-If Technique was utilized to identify the hazards and consequences of the operations of the plant, together with the level of risk it entails. Through the brainstorming sessions of the PHA team, it was found that the most critical system in the plant is the primary system. Possible leakages in pipes and equipment due to weakened seals and welds and blockages in the coolant path due to fouling were the most common scenarios identified, which could further lead to the most critical scenarios – radioactive leak through sump contamination, nuclear meltdown, and equipment damage and explosion, which could result in multiple injuries and fatalities, and environmental impacts.
Keywords: process safety management, process hazard analysis, what-if technique, nuclear power plant
Procedia PDF Downloads 223
4903 Deployment of a Product Lifecycle Management (PLM) Solution Towards Digital Transformation
Authors: Asmae Chraibi, Rachid Lghoul, Nabil Rhiati
Abstract:
In the era of Industry 4.0, enterprises are increasingly employing digital technologies in order to improve their product development processes. This research focuses on the strategic deployment of Product Lifecycle Management (PLM) solutions during production as a key enabler of traceability and digital transformation activities. The study explores the integration of PLM within a larger organizational framework, examining its impact on product lifecycle efficiency, collaboration, and innovation. Through a comprehensive analysis of a real case study from the automotive industry, this project evaluates the critical success factors and challenges associated with implementing PLM solutions for digital transformation. Moreover, it explores the synergistic relationship between PLM and emerging technologies such as 3DEXPERIENCE and SOLIDWORKS, elucidating their combined potential in optimizing production workflows and enabling data-driven decision-making. The study’s findings provide global approaches for firms looking to embark on a digital transformation journey by implementing PLM technologies. This research contributes to a better understanding of how PLM can be effectively used to foster innovation and competitiveness in the changing landscape of modern industry by shining a light on best practices, critical considerations, and potential obstacles.
Keywords: product lifecycle management (PLM), industry 4.0, traceability, digital transformation, solution, innovation, 3D experience, SOLIDWORKS
Procedia PDF Downloads 73
4902 A Cohort Study of Early Cardiologist Consultation by Telemedicine on the Critical Non-STEMI Inpatients
Authors: Wisit Wichitkosoom
Abstract:
Objectives: To investigate the effect of early cardiologist consultation, using simple technology, on the diagnosis and early proper management of patients with Non-STEMI at the emergency departments of district hospitals without a cardiologist on site before transfer. Methods: A cohort study was performed at Udonthani general hospital in Udonthani province from 1 October 2012 to 30 September 2013 with 892 patients diagnosed with Non-STEMI. The patients, with a mean age of 46.8 years, had been transferred because of a Non-STEMI diagnosis over a 12-week study period. Patients who were transferred, in addition to receiving proper care, were offered a cardiologist consultation, with an average transfer time to Udonthani hospital of 1.5 hours. The main outcome measure was length of hospital stay; mortality at 3 months, inpatient investigations, and the transfer rate to a higher-facility hospital were also studied. Results: Hospital stay was significantly shorter for those who did not consult a cardiologist (hazard ratio 1.19; approximate 95% CI 1.001 to 1.251; p = 0.039). 136 cases were transferred to a higher-facility hospital. There was no statistically significant difference in overall mortality between the groups (p = 0.068). Conclusions: Early cardiologist consultation can reduce the length of hospital stay for patients with cardiovascular conditions outside of a cardiac center. This basic technology can be applied to improve patient safety.
Keywords: critical, telemedicine, safety, non STEMI
Procedia PDF Downloads 418
4901 Government Big Data Ecosystem: A Systematic Literature Review
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Data that is high in volume, velocity, and veracity and comes from a variety of sources is generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data in public administration and gaps in the government data ecosystems literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas such as humanitarian data, open government data, scientific research data, industry data, etc.
Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, e-government, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review
Procedia PDF Downloads 229
4900 A Data Driven Approach for the Degradation of a Lithium-Ion Battery Based on Accelerated Life Test
Authors: Alyaa M. Younes, Nermine Harraz, Mohammad H. Elwany
Abstract:
Lithium-ion batteries are currently used in many applications, including satellites, electric vehicles, and mobile electronics. Their ability to store a relatively large amount of energy in a limited space makes them most appropriate for critical applications. Evaluation of the life of these batteries and their reliability becomes crucial to the systems they support. Reliability of Li-ion batteries has mainly been considered based on lifetime. However, another important factor that can be considered critical in many applications, such as electric vehicles, is the cycle duration. The present work presents the results of an experimental investigation of the degradation behavior of a laptop Li-ion battery (type TKV2V) and the effect of the applied load on the battery cycle time. The reliability was evaluated using an accelerated life test. Least squares linear regression with median rank estimation was used to estimate the Weibull distribution parameters needed for the estimation of the reliability functions. The probability density function, failure rate, and reliability function under each of the applied loads were evaluated and compared. An inverse power model is introduced that can predict the cycle time at any given stress level.
Keywords: accelerated life test, inverse power law, lithium-ion battery, reliability evaluation, Weibull distribution
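A compact sketch of the median-rank / least-squares step used in this kind of analysis is given below; the cycle-to-failure values are invented for illustration and are not the study's measurements.

```python
# Sketch: Weibull parameter estimation by median-rank regression,
# as commonly done in accelerated life testing. Data are illustrative only.
import numpy as np

cycles_to_failure = np.sort(np.array([182.0, 240.0, 295.0, 330.0, 410.0, 485.0]))
n = len(cycles_to_failure)
i = np.arange(1, n + 1)

# Bernard's approximation of the median rank.
median_rank = (i - 0.3) / (n + 0.4)

# Linearized Weibull CDF: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)
x = np.log(cycles_to_failure)
y = np.log(-np.log(1.0 - median_rank))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)

reliability = lambda t: np.exp(-(t / eta) ** beta)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f} cycles")
print(f"R(300 cycles) = {reliability(300):.3f}")
```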
Procedia PDF Downloads 168
4899 A Comparative Analysis of Asymmetric Encryption Schemes on Android Messaging Service
Authors: Mabrouka Algherinai, Fatma Karkouri
Abstract:
Today, Short Message Service (SMS) is an important means of communication. SMS is not only used in informal environments for communication and transactions, but also in formal environments such as institutions, organizations, companies, and the business world as a tool for communication and transactions. Therefore, there is a need to secure the information transmitted through this medium, both in transit and at rest. Encryption has been identified as a means to provide security to SMS messages in transit and at rest. Several past studies have proposed and developed encryption algorithms for SMS and information security. This research aims at comparing the performance of common asymmetric encryption algorithms for SMS security. The research employs three algorithms, namely RSA, McEliece, and RABIN. Several experiments were performed on SMS messages of various sizes on an Android mobile device. The experimental results show that each of the three techniques has different key generation, encryption, and decryption times. The efficiency of an algorithm is determined by the time that it takes for encryption, decryption, and key generation. The best algorithm can be chosen based on the least time required for encryption. The obtained results show the least time when McEliece with key size 4096 is used. RABIN with key size 4096 takes the most time for encryption, and so it is the least effective algorithm with respect to encryption. The research also shows that McEliece with key size 2048 has the least time for key generation and hence is the best algorithm for key generation. The results also show that RSA with key size 1024 is the most preferable algorithm in terms of decryption, as it gives the least time for decryption.
Keywords: SMS, RSA, McEliece, RABIN
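A timing harness for this kind of comparison can be sketched as below. Only RSA is shown, using the widely available `cryptography` package; McEliece and Rabin are not part of that package and would require specialised libraries, so their inclusion is left as an assumption. The key sizes and the message are illustrative.

```python
# Sketch: timing key generation, encryption, and decryption of a short SMS
# with RSA-OAEP.
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

sms = b"Meet me at the station at 6pm."   # a typical short message
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

for key_size in (1024, 2048, 4096):
    t0 = time.perf_counter()
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=key_size)
    t1 = time.perf_counter()
    ciphertext = private_key.public_key().encrypt(sms, oaep)
    t2 = time.perf_counter()
    private_key.decrypt(ciphertext, oaep)
    t3 = time.perf_counter()
    print(f"RSA-{key_size}: keygen {t1-t0:.3f}s, "
          f"encrypt {t2-t1:.4f}s, decrypt {t3-t2:.4f}s")
```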
Procedia PDF Downloads 163
4898 Experimental Study on the Effect of Storage Conditions on Thermal Hazard of Nitrocellulose
Authors: Hua Chai, Qiangling Duan, Huiqi Cao, Mi Li, Jinhua Sun
Abstract:
Nitrocellulose (NC), a kind of energetic material, has been widely used in the industrial and military fields. However, this material can also cause serious social disasters due to improper storage conditions. The thermal hazard of nitrocellulose (NC) was experimentally investigated using the CALVET heat flux calorimeter C80, and three kinds of storage conditions were considered in the experiments: (1) drying time, (2) moisture content, and (3) cycles. The results showed that the heat flow curves of NC moved first in the low-temperature direction and then slightly moved back with increasing drying hours. Moisture, which was responsible for the appearance of small exothermic peaks, proved to be an unfavorable safety factor, yet it could increase the onset temperature of the main peak to some extent. Cycles could both lower the onset temperature and the maximum heat flow but increased the peak temperature. Besides, relevant kinetic parameters such as the heat of reaction (ΔH) and the activation energy (Ea) were obtained and compared. It was found that all three conditions could reduce the values of Ea and most of them produced larger reaction heat. In addition, the critical explosion temperature (Tb) of the NC samples was derived. It was clear that not only the drying time but also the cycles would increase the thermal hazard of the NC, yet the right amount of water helped to reduce the thermal hazard.
Keywords: C80, nitrocellulose, storage conditions, the critical explosion temperature, thermal hazard
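For readers unfamiliar with how a critical explosion temperature is typically derived from calorimetric data, the sketch below applies a commonly used Semenov-type relation, Tb = (Ea - sqrt(Ea^2 - 4*Ea*R*T0)) / (2R); the activation energy and onset temperature plugged in are placeholders, not the measured NC values of this study.

```python
# Sketch: estimating the critical explosion temperature Tb from the
# activation energy Ea and the extrapolated onset temperature T0.
# The numbers below are placeholders, not the measured NC values.
import math

R = 8.314            # J/(mol*K)
Ea = 170e3           # J/mol, assumed activation energy
T0 = 438.0           # K, assumed onset temperature at zero heating rate

Tb = (Ea - math.sqrt(Ea**2 - 4.0 * Ea * R * T0)) / (2.0 * R)
print(f"Critical explosion temperature Tb = {Tb:.1f} K ({Tb - 273.15:.1f} C)")
```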
Procedia PDF Downloads 164
4897 Framework for Explicit Social Justice Nursing Education and Practice: A Constructivist Grounded Theory Research
Authors: Victor Abu
Abstract:
Background: Social justice ideals are considered the foundation of nursing practice. These ideals are not always clearly integrated into nursing professional standards or curricula. This hinders concerted global nursing agendas for becoming aware of social injustice or engaging in action for social justice to improve the health of individuals and groups. Aim and objectives: The aim was to create an educational framework for empowering nursing students for social justice awareness and action. This purpose was attained by understanding the meaning of social justice, the effect of social injustice, the visibility of social justice learning, and ways of integrating social justice in nursing education and practice. Methods: Critical interpretive methodologies and constructivist grounded theory research designs guided the processes of recruiting nursing students (n = 11) and nurse educators (n = 11) at a London nursing university to participate in interviews and focus groups, which were analysed by coding systems. Findings: Firstly, social justice was described as ethical practices that enable individuals and groups to have good access to health resources. Secondly, social injustice was understood as unfair practices that caused minimal access to resources, social deprivation, and poor health. Thirdly, social justice learning was considered to be invisible in nursing education due to a lack of explicit modules, educator knowledge, and organisational support. Lastly, explicit modules, educating educators, and attracting leaders’ support were suggested as approaches for the visible integration of social justice in nursing education and practice. Discussion: This research proposes approaches for nursing awareness and action for the development of critically active nurse-learners, critically conscious nurse-educators, and servant nurse leaders. The framework on Awareness for Social Justice Action (ASJA) created in this research is an approach for empowering nursing students for social justice practices. Conclusion: This research contributes to and advocates for greater nursing scholarship to raise the spotlight on social justice in the profession.
Keywords: social justice, nursing practice, nursing education, nursing curriculum, social justice awareness, social justice action, constructivist grounded theory
Procedia PDF Downloads 58
4896 Video Club as a Pedagogical Tool to Shift Teachers’ Image of the Child
Authors: Allison Tucker, Carolyn Clarke, Erin Keith
Abstract:
Introduction: In education, the determination to uncover privileged practices requires critical reflection to be placed at the center of both pre-service and in-service teacher education. Confronting deficit thinking about children’s abilities and shifting to holding an image of the child as capable and competent is necessary for teachers to engage in responsive pedagogy that meets children where they are in their learning and builds on strengths. This paper explores the ways in which early elementary teachers’ perceptions of the assets of children might shift through the pedagogical use of video clubs. Video club is a pedagogical practice whereby teachers record and view short videos with the intended purpose of deepening their practices. The use of video club as a learning tool is an extensively documented practice. In this study, a video club is used to watch short recordings of playing children to identify the assets of their students. Methodology: The study on which this paper is based asks the question: What are the ways in which teachers’ image of the child and teaching practices evolve through the use of a video club focused on the strengths of children demonstrated during play? Using critical reflection, it aims to identify and describe participants’ experiences of examining their personally held image of the child through the pedagogical tool video club, and how that image influences their practices, specifically in implementing play pedagogy. Teachers enrolled in a graduate-level play pedagogy course record and watch videos of their own students as a means to notice and reflect on the learning that happens during play. Using a co-constructed viewing protocol, teachers identify student strengths and consider their pedagogical responses. Video club provides a framework for teachers to critically reflect in action, return to the video to rewatch the children or themselves, and discuss their noticings with colleagues. Critical reflection occurs when there is focused attention on identifying the ways in which actions perpetuate or challenge issues of inherent power in education. When the image of the child held by the teacher is from a deficit position and is influenced by hegemonic dimensions of practice, critical reflection is essential in naming and addressing power imbalances, biases, and practices that are harmful to children and become barriers to their thriving. The data comprise teacher reflections, analyzed using phenomenology. Phenomenology seeks to understand and appreciate how individuals make sense of their experiences. Teacher reflections are individually read, and researchers determine pools of meaning. Categories are identified by each researcher, after which commonalities are named through a recursive process of returning to the data until no more themes emerge or saturation is reached. Findings: The final analysis and interpretation of the data are forthcoming. However, emergent analysis of the data collected using teacher reflections reveals the ways in which the use of video club grew teachers’ awareness of their image of the child. It shows video club to be a promising pedagogical tool when used with in-service teachers to prompt opportunities for play and to challenge deficit thinking about children and their abilities to thrive in learning.
Keywords: asset-based teaching, critical reflection, image of the child, video club
Procedia PDF Downloads 105
4895 Enhanced Method of Conceptual Sizing of Aircraft Electro-Thermal De-Icing System
Authors: Ahmed Shinkafi, Craig Lawson
Abstract:
There is great advancement towards All-Electric Aircraft (AEA) technology. The AEA concept assumes that all aircraft systems will be integrated into one electrical power source in the future. The principle of the electro-thermal system is to transfer the energy required for anti/de-icing to the protected areas in electrical form. However, powering a large aircraft anti-icing system electrically could be quite excessive in cost and system weight. Hence, maximising the anti/de-icing efficiency of the electro-thermal system in order to minimise its power demand has become crucial to electro-thermal de-icing system sizing. In this work, an enhanced methodology has been developed for conceptual sizing of an aircraft electro-thermal de-icing system. The work factors in critical terms overlooked in previous studies that are critical to de-icing energy consumption. A case study of a typical large aircraft wing de-icing was used to test and validate the model. The model was used to optimise the system performance by a trade-off between the de-icing peak power and system energy consumption. The optimum melting surface temperatures and energy flux predicted enabled a reduction in the power required for de-icing. The weight penalty associated with the electro-thermal anti-icing/de-icing method could be eliminated using this method without underestimating the de-icing power requirement.
Keywords: aircraft, de-icing system, electro-thermal, in-flight icing
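As a rough illustration of the kind of energy balance that drives electro-thermal de-icing sizing, the sketch below estimates the heat needed to warm and melt an ice layer over a protected area; the layer thickness, area, temperatures, time, and efficiency are assumed values, not the model described in the paper.

```python
# Sketch: first-order energy estimate for melting an ice layer of given
# thickness over a protected wing area. All inputs are assumptions.
rho_ice = 917.0        # kg/m^3
c_ice = 2100.0         # J/(kg*K), specific heat of ice
L_fusion = 334e3       # J/kg, latent heat of fusion

thickness = 0.002      # m, assumed ice layer
area = 1.5             # m^2, assumed protected zone
T_ice = -15.0          # degC, assumed ice temperature
de_ice_time = 120.0    # s, assumed time allowed to shed the layer
efficiency = 0.7       # assumed fraction of heater power reaching the ice

mass = rho_ice * thickness * area
energy = mass * (c_ice * (0.0 - T_ice) + L_fusion)   # warm to 0 degC, then melt
power = energy / (de_ice_time * efficiency)
print(f"Ice mass: {mass:.2f} kg, energy: {energy/1e3:.0f} kJ, "
      f"average electrical power: {power/1e3:.1f} kW")
```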
Procedia PDF Downloads 517
4894 Standard Essential Patents for Artificial Intelligence Hardware and the Implications For Intellectual Property Rights
Authors: Wendy de Gomez
Abstract:
Standardization is a critical element in the ability of a society to reduce uncertainty, subjectivity, misrepresentation, and interpretation while simultaneously contributing to innovation. Technological standardization is critical to codify specific operationalization through legal instruments that provide rules of development, expectation, and use. In the current emerging technology landscape, Artificial Intelligence (AI) hardware as a general-use technology has seen incredible growth, as evidenced by AI technology patents between 2012 and 2018 in the United States Patent and Trademark Office (USPTO) AI dataset. However, as outlined in the 2023 United States Government National Standards Strategy for Critical and Emerging Technology, the codification through standardization of emerging technologies such as AI has not kept pace with their actual technological proliferation. This gap has the potential to cause significantly divergent possibilities for the downstream outcomes of AI in both the short and long term. This original empirical research provides an overview of the standardization efforts around AI in different geographies and provides a background to standardization law. It quantifies the longitudinal trend of Artificial Intelligence hardware patents through the USPTO AI dataset. It seeks evidence of existing Standard Essential Patents (SEPs) among these AI hardware patents through a text analysis of the statement of patent history and the field of the invention of these patents in Patent Vector, and examines their determination as Standard Essential Patents and their inclusion in existing AI technology standards across the four main AI standards bodies: the European Telecommunications Standards Institute (ETSI); the International Telecommunication Union Telecommunication Standardization Sector (ITU-T); the Institute of Electrical and Electronics Engineers (IEEE); and the International Organization for Standardization (ISO). Once the analysis is complete, the paper discusses both the theoretical and operational implications of F/RAND licensing agreements for the owners of these Standard Essential Patents in the United States court and administrative system. It concludes with an evaluation of how Standard Setting Organizations (SSOs) can work with SEP owners more effectively through various forms of intellectual property mechanisms such as patent pools.
Keywords: patents, artificial intelligence, standards, F/RAND agreements
Procedia PDF Downloads 88
4893 A Critical Discourse Analysis of the Impact of the Linguistic Behavior of the Moroccan Soccer Coach in Light of Motivation Theory and Discursive Psychology
Authors: Abdelaadim Bidaoui
Abstract:
As one of the most important linguistic inquiries, the topic of the intertwined relationship between language, the mind, and the world has attracted many scholars. In the fifties, Sapir and Whorf advocated the hypothesis that language shapes our cultural realities as an early attempt to provide answers to this linguistic inquiry. Later, discursive psychology came to view linguistic behavior as “a dynamic form of social practice which constructs the social world, individual selves and identity” (Jorgensen & Phillips 2002, 118). Discursive psychology also considers discourse as a trigger of social action and change. Building on discursive psychology and motivation theory, this paper examines the impact of the linguistic behavior of the Moroccan coach Walid Regragui on the Moroccan team’s exceptional performance in the Qatar 2022 Soccer World Cup. The data used in the research are based on interviews given by the Moroccan coach prior to and during the World Cup. Using a discourse analysis of his linguistic behavior, this paper shows how Regragui’s language provided support for the three psychological needs: sense of belonging, competence, and autonomy. As with any CDA research, this paper uses a triangulated theoretical framework that includes language, cognition, and society.
Keywords: critical discourse analysis, motivation theory, discursive psychology, linguistic behavior
Procedia PDF Downloads 90
4892 Essential Factors of Risk Perception Crucial in Efficient Construction Management
Authors: Francis Edum-Fotwe, Tony Thorpe, Charles Afetornu
Abstract:
Risk perception informs how issues are responded to, whether in solving or overcoming a problem or improving a situation. Risk perception is established to be affected by some key factors, reflected in the varying ways in which work is done as well as the level of efficiency achieved. These factors potentially influence risk perception to different extents, so that if these factors determine risk perception, the question becomes how a change in any one of them affects it. Since the ability to address risk is influenced by risk perception, establishing and developing awareness of that perception should enable construction professionals to make viable decisions. Any act to improve the construction industry cannot be overemphasised, considering its contribution to national development. A survey questionnaire was conducted in Ghana to elicit data measuring risk perception and the essential factors, as well as the necessary demographics of the respondents, who are construction professionals. This study investigates the sensitivity of the critical factors of risk perception. It uses the Relative Importance Index analysis tool to investigate the differential effect of these essential factors on risk perception, such that a slight change in a factor makes a significant change in risk perception, having established that it is influenced by essential factors. The findings can lead to policy formation for employers on which factors to prioritise to improve the risk perception of employees. The study is also useful in team formation for sensitive and complex projects where efficient risk management is critical.
Keywords: construction industry, risk, risk management, risk perception
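For readers unfamiliar with the Relative Importance Index, the usual computation RII = sum(W) / (A x N) can be sketched as follows; the factor names and survey responses below are invented for illustration, not the study's data.

```python
# Sketch: Relative Importance Index (RII) for ranking factors from a
# Likert-scale survey, RII = sum(W) / (A * N). Responses are illustrative.
ratings = {
    "experience":      [5, 4, 4, 5, 3, 4],
    "training":        [3, 4, 3, 2, 4, 3],
    "site conditions": [4, 5, 5, 4, 5, 4],
}
A = 5  # highest possible rating on the scale

# Rank factors from highest to lowest RII.
for factor, scores in sorted(ratings.items(),
                             key=lambda kv: -sum(kv[1]) / (A * len(kv[1]))):
    rii = sum(scores) / (A * len(scores))
    print(f"{factor:15s} RII = {rii:.2f}")
```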
Procedia PDF Downloads 143
4891 Critical Discourse Analysis of Political TV Talk Show of Pakistani Media
Authors: Sumaira Saleem, Sajjad Hussain, Asma Kashif Shahzad, Hina Shaheen
Abstract:
This study aims at exploring the relationship between language and ideology and how such relationships are represented in the analysis of spoken texts, following Van Dijk’s Socio-Cognitive Model (2002). The study tries to show that political talk shows broadcast by private TV channels are working apparatuses of ideology and carry meanings which are not always obvious to viewers. The analysis concerns the situation created by Arslan Iftikhar, the son of ex-Chief Justice of Pakistan Iftikhar Muhammad Chaudhry, and PTI Chief Imran Khan. Arslan Iftikhar submitted an application against Imran Khan claiming that he is not eligible to become a member of the parliament of Pakistan. In the application, he demanded the documents submitted by Imran Khan to the Election Commission of Pakistan at the time of the election. Murad Ali from PTI also submitted an application to the Election Commission of Pakistan against PM Nawaz Sharif for providing the copies. The study also suggests that these talk shows mystify the agency of processes by using various strategies. In other words, critical text analyses reveal how these choices enable speakers to manipulate the realizations of agency and power in the representation of action to produce particular meanings which are not always explicit for all readers.
Keywords: ECP, CDA, socio cognitive model, ideology, TV channels, power
Procedia PDF Downloads 738
4890 Fuzzy Logic Modeling of the Evaluation of Urban Skylines by the Entropy Approach
Authors: Murat Oral, Seda Bostancı, Sadık Ata, Kevser Dincer
Abstract:
When evaluating the aesthetics of cities, an analysis of urban form development depending on design properties and a variety of factors is performed, together with a study of the effects of this appearance on human beings. Different methods are used when making an aesthetic evaluation of a city. Entropy, in its original meaning, is the mathematical representation of thermodynamic results. Measuring entropy is related to the distribution of positional figures of a message or information from the standpoint of probabilities. In this study, the analysis of the evaluation of urban skylines by the entropy approach was modelled with the Rule-Based Mamdani-Type Fuzzy (RBMTF) modelling technique. Input-output parameters were described by RBMTF if-then rules. Numerical parameters of input and output variables were fuzzified as linguistic variables: Very Very Low (L1), Very Low (L2), Low (L3), Negative Medium (L4), Medium (L5), Positive Medium (L6), High (L7), Very High (L8) and Very Very High (L9) linguistic classes. The comparison between application data and RBMTF is done by using the absolute fraction of variance (R2). The actual values and RBMTF results indicated that RBMTF can be successfully used for the analysis of the evaluation of urban skylines by the entropy approach. As a result, the RBMTF model has shown a satisfying relation with experimental results, which suggests it as an alternative method for the evaluation of urban skylines by the entropy approach.
Keywords: urban skylines, entropy, rule-based Mamdani type, fuzzy logic
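A minimal Mamdani-type rule evaluation of the kind described above can be sketched as follows; the membership functions, the two-rule set, and the crisp inputs are invented for illustration and do not reproduce the nine linguistic classes of the study.

```python
# Sketch: a tiny rule-based Mamdani-type fuzzy model with triangular
# membership functions, min inference, max aggregation, and centroid
# defuzzification. Two inputs, one output, two rules (purely illustrative).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b (a < b < c)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Crisp inputs (e.g., two skyline indicators on a 0-10 scale), assumed values.
entropy_in, variety_in = 6.5, 3.0

# Fuzzification: "low" and "high" sets for each input.
ent_low, ent_high = tri(entropy_in, 0, 2, 5), tri(entropy_in, 5, 8, 10)
var_low, var_high = tri(variety_in, 0, 2, 5), tri(variety_in, 5, 8, 10)

# Output universe and its "low"/"high" sets.
y = np.linspace(0, 10, 201)
out_low, out_high = tri(y, 0, 2, 5), tri(y, 5, 8, 10)

# Rule 1: IF entropy is high AND variety is high THEN evaluation is high.
# Rule 2: IF entropy is low  OR  variety is low  THEN evaluation is low.
w1 = min(ent_high, var_high)   # AND -> min
w2 = max(ent_low, var_low)     # OR  -> max

# Mamdani min-implication, max-aggregation, centroid defuzzification.
aggregated = np.maximum(np.minimum(w1, out_high), np.minimum(w2, out_low))
crisp = float(np.sum(aggregated * y) / np.sum(aggregated))
print(f"Defuzzified evaluation: {crisp:.2f}")
```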
Procedia PDF Downloads 290
4889 Honneth, Feenberg, and the Redemption of Critical Theory of Technology
Authors: David Schafer
Abstract:
Critical Theory is in sore need of a workable account of technology. It had one in the writings of Herbert Marcuse, or so it seemed until Jürgen Habermas mounted a critique in 'Technology and Science as Ideology' (Habermas, 1970) that decisively put it away. Ever since, Marcuse’s work has been regarded as outdated – a 'philosophy of consciousness' no longer seriously tenable. But with Marcuse’s view has gone the important insight that technology is no norm-free system (as Habermas portrays it) but can be laden with social bias. Andrew Feenberg is among the few serious scholars who have perceived this problem in post-Habermasian critical theory and has sought to revive a basically Marcusean account of technology. On his view, while so-called ‘technical elements’ that physically make up technologies are neutral with regard to social interests, there is a sense in which we may speak of a normative grammar or ‘technical code’ built into technology that can be socially biased in favor of certain groups over others (Feenberg, 2002). According to Feenberg, those perspectives on technology are reified which consider technology only by their technical elements to the neglect of their technical codes. Nevertheless, Feenberg’s account fails to explain what is normatively problematic with such reified views of technology. His plausible claim that they represent false perspectives on technology by itself does not explain how such views may be oppressive, even though Feenberg surely wants to be doing that stronger level of normative theorizing. Perceiving this deficit in his own account of reification, he tries to adopt Habermas’s version of systems theory to ground his own critical theory of technology (Feenberg, 1999). But this is a curious move in light of Feenberg’s own legitimate critiques of Habermas’s portrayals of technology as reified or ‘norm-free.’ This paper argues that a better foundation may be found in Axel Honneth’s recent text, Freedom’s Right (Honneth, 2014). Though Honneth there says little explicitly about technology, he offers an implicit account of reification formulated in opposition to Habermas’s systems-theoretic approach. On this ‘normative functionalist’ account of reification, social spheres are reified when participants prioritize individualist ideals of freedom (moral and legal freedom) to the neglect of an intersubjective form of freedom-through-recognition that Honneth calls ‘social freedom.’ Such misprioritization is ultimately problematic because it is unsustainable: individual freedom is philosophically and institutionally dependent upon social freedom. The main difficulty in adopting Honneth’s social theory for the purposes of a theory of technology, however, is that the notion of social freedom is predicable only of social institutions, whereas it appears difficult to conceive of technology as an institution. Nevertheless, in light of Feenberg’s work, the idea that technology includes within itself a normative grammar (technical code) takes on much plausibility. To the extent that this normative grammar may be understood through the category of social freedom, Honneth’s dialectical account of the relationship between individual and social forms of freedom provides a more solid basis from which to ground the normative claims of Feenberg’s sociological account of technology than Habermas’s systems theory.
Keywords: Habermas, Honneth, technology, Feenberg
Procedia PDF Downloads 198
4888 The Impact of the Adittapariyaya Sutta in the Meaning-Making of T.S. Eliot’s The Waste Land: A Critical Analysis
Authors: Ven Pothupitiye Thilakasiri
Abstract:
The Ādittapariyāya Sutta, also known as the Fire Sermon, is an important Buddhist text that addresses the nature of sensual pleasures and attachment through the metaphor of fire. Eliot makes use of this in his epic poem The Waste Land. Though scholars have studied Eliot‘s long poem for traces of eastern philosophy, none have touched upon how the Ādittapariyāya Sutta has enabled the meaning-making endeavor of the poem. The present study attempts to address this research gap through a critical analysis of the Fire Sermon section of The Waste Land, undertaking an interdisciplinary study of the poem using two methods: a literary and a Buddhist reading method, namely the objective correlative and the three-pillared Buddhist ideas of Anicca (impermanence), Dukkha (suffering) and Anatta (no-self). Thus, the study explores the Ādittapariyāya Sutta’s thematic concerns of impermanence, suffering and no-self within the context of The Waste Land. The setting of the poem symbolizes spiritual desolation and existential crisis. By comparing the Sutta‘s teachings with modern existential concerns as depicted in T.S. Eliot‘s The Waste Land, the analysis emphasizes the relevance of Buddhist insights to contemporary issues of meaning and disillusionment.
Keywords: Adittapariyaya Sutta, Objective correlative, Eastern Philosophy, Sensual pleasures
Procedia PDF Downloads 27
4887 Safety-critical Alarming Strategy Based on Statistically Defined Slope Deformation Behaviour Model Case Study: Upright-dipping Highwall in a Coal Mining Area
Authors: Lintang Putra Sadewa, Ilham Prasetya Budhi
Abstract:
A slope monitoring program has now become a mandatory campaign for any open pit mine around the world to operate safely. Utilizing various slope monitoring instruments and strategies, miners are now able to deliver precise decisions in mitigating the risk of slope failures, which can be catastrophic. Currently, the most sophisticated slope monitoring technology available is the Slope Stability Radar (SSR), which can measure wall deformation with submillimeter accuracy. One of its eminent features is that the SSR can provide a timely warning by automatically raising an alarm when a predetermined rate-of-movement threshold is reached. However, establishing proper alarm thresholds is arguably one of the most onerous challenges faced in any slope monitoring program. The difficulty mainly lies in the number of considerations that must be taken into account when generating a threshold, because an alarm must be effective in that it should limit the occurrence of false alarms while also being able to capture any real wall deformations. In this sense, experience shows that a site-specific alarm threshold tends to produce more reliable results because it considers site-distinctive variables. This study attempts to determine alarm thresholds for safety-critical monitoring based on an empirical model of slope deformation behaviour that is defined statistically from deformation data captured by the Slope Stability Radar (SSR). The study area comprises an upright-dipping highwall setting in a coal mining area with intense mining activities, and the deformation data used for the study were recorded by the SSR throughout the year 2022. The model is site-specific in nature; thus, valuable information extracted from the model (e.g., time-to-failure, onset-of-acceleration, and velocity) will be applicable in setting up site-specific alarm thresholds and will give a clear understanding of how deformation trends evolve over the area.
Keywords: safety-critical monitoring, alarming strategy, slope deformation behaviour model, coal mining
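The rate-of-movement alarming logic referred to above can be sketched as a simple check over the radar's displacement time series; the window length, thresholds, and synthetic data below are assumed placeholders, not the statistically derived values of this study.

```python
# Sketch: raising velocity-based alarms from a wall-displacement time series.
# Thresholds and data are illustrative only.
import numpy as np

t = np.arange(0, 48, 1.0)                       # hours
displacement = 0.2 * t + 0.01 * t**2            # mm, synthetic accelerating wall

window = 6                                      # hours used to estimate velocity
velocity = (displacement[window:] - displacement[:-window]) / window  # mm/h

AMBER, RED = 0.4, 0.8                           # assumed mm/h thresholds
for hour, v in zip(t[window:], velocity):
    if v >= RED:
        print(f"t={hour:4.0f} h  v={v:.2f} mm/h  RED alarm - stop work and inspect")
    elif v >= AMBER:
        print(f"t={hour:4.0f} h  v={v:.2f} mm/h  AMBER alarm - heighten monitoring")
```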
Procedia PDF Downloads 90
4886 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — in the Case of Critical Dataset Size —
Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno
Abstract:
STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments specifying rules in advance and by comparison with conventional methods. However, scope for further development remains before STRIM can be applied to the analysis of real-world datasets. The first requirement is to determine the size of the dataset needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with contaminated attribute values created by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values and hence is applicable to real-world data.
Keywords: rule induction, decision table, missing data, noise
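The statistical-test core of such a rule induction scheme can be illustrated with a one-proportion z-test: a candidate if-then rule is kept only if the decision-class frequency among the rows it covers differs significantly from the class prior. The counts and significance level below are assumptions for illustration, not the paper's settings.

```python
# Sketch: testing whether a candidate rule "IF condition THEN class d" is
# statistically significant, given how often class d appears among the
# instances covered by the condition versus its overall prior.
import math

def rule_p_value(k, n, p0):
    """One-sided one-proportion z-test: k hits of class d in n covered rows,
    against the prior probability p0 of class d in the whole decision table."""
    p_hat = k / n
    z = (p_hat - p0) / math.sqrt(p0 * (1.0 - p0) / n)
    return 0.5 * math.erfc(z / math.sqrt(2.0))   # P(Z >= z)

# Hypothetical counts: the rule covers 120 rows, 78 of them have class d,
# and class d makes up 1/3 of the table overall.
p = rule_p_value(k=78, n=120, p0=1.0 / 3.0)
print(f"p-value = {p:.2e}  ->  {'adopt rule' if p < 0.01 else 'reject rule'}")
```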
Procedia PDF Downloads 396
4885 A Xenon Mass Gauging through Heat Transfer Modeling for Electric Propulsion Thrusters
Authors: A. Soria-Salinas, M.-P. Zorzano, J. Martín-Torres, J. Sánchez-García-Casarrubios, J.-L. Pérez-Díaz, A. Vakkada-Ramachandran
Abstract:
The current state-of-the-art methods for mass gauging of Electric Propulsion (EP) propellants in microgravity conditions rely on external measurements taken at the surface of the tank. The tanks are operated under a constant thermal duty cycle to store the propellant within a pre-defined temperature and pressure range. We demonstrate, using computational fluid dynamics (CFD) simulations, that the heat transfer within the pressurized propellant generates temperature and density anisotropies. This challenges the standard mass gauging methods that rely on the use of time-changing skin temperatures and pressures. We observe that the domes of the tanks are prone to be overheated and that, a long time after the heaters of the thermal cycle are switched off, the system reaches a quasi-equilibrium state with a more uniform density. We propose a new gauging method, which we call the Improved PVT method, based on universal physics and thermodynamics principles, existing TRL-9 technology, and telemetry data. This method only uses as inputs the temperature and pressure readings of sensors externally attached to the tank. These sensors can operate during the nominal thermal duty cycle. The Improved PVT method shows little sensitivity to the pressure sensor drifts, which are critical towards the end of life of the missions, as well as little sensitivity to systematic temperature errors. The retrieval method has been validated experimentally with CO2 in gas and fluid states in a chamber that operates up to 82 bar within a nominal thermal cycle of 38 °C to 42 °C. The mass gauging error is shown to be lower than 1% of the mass at the beginning of life, assuming an initial tank load at 100 bar. In particular, for a pressure of about 70 bar, just below the critical pressure of CO2, the error of the mass gauging in the gas phase goes down to 0.1%, and for 77 bar, just above the critical point, the error of the mass gauging of the liquid phase is 0.6% of the initial tank load. This gauging method improves by a factor of 8 the accuracy of the standard PVT retrievals using look-up tables with tabulated data from the National Institute of Standards and Technology.
Keywords: electric propulsion, mass gauging, propellant, PVT, xenon
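The basic PVT relation behind such gauging methods can be sketched as follows; the tank volume, compressibility factor, and sensor readings are assumed values for illustration, and the Improved PVT method described above involves corrections beyond this simple form.

```python
# Sketch: basic PVT mass estimate m = P*V*M / (Z*R*T) for xenon propellant.
# Z would normally come from a real-gas equation of state or lookup tables;
# here it is simply an assumed constant.
R = 8.314            # J/(mol*K)
M_XENON = 0.131293   # kg/mol

def propellant_mass(pressure_pa, temperature_k, tank_volume_m3, z_factor):
    """Return the estimated propellant mass in kg from tank P and T readings."""
    return pressure_pa * tank_volume_m3 * M_XENON / (z_factor * R * temperature_k)

# Hypothetical telemetry: 100 bar, 315 K, 50-litre tank, assumed Z = 0.65.
m = propellant_mass(100e5, 315.0, 0.050, 0.65)
print(f"Estimated xenon mass: {m:.1f} kg")
```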
Procedia PDF Downloads 345
4884 Carl von Clausewitz and Foucault on War and Power
Authors: Damian Winczewski
Abstract:
Carl von Clausewitz’s political theory of war was criticized in the 20th century in several ways. It was also the source of many disagreements over readings of its most popular theses. Among them, the reflections of thinkers categorized as part of the broader postmodern current stand out, such as Michel Foucault and his successors, who presented a nuanced and critical approach to strategy theory. Foucault viewed it as part of a broader political–legal discourse of sovereignty rooted in the Middle Ages, which underlies modern biopower. Clausewitz’s theory of strategy underpinned a new humanist discourse rationalizing the phenomenon of war while, in a methodological sense, becoming an epistemic model of how Foucault conceived power strategy. Foucault’s contemporary commentators try to develop his position by arguing for an analogy between the discourse prevailing in Clausewitz’s time and the contemporary neoliberal discourse and technological revolution on the battlefield, which create a new order of power. Meanwhile, they maintain that the modern development of strategy has made Clausewitz’s understanding of war obsolete. However, postmodernists, focusing on showy stylistics in their assessments, rely on a mythologized narrative about Clausewitz, reducing his theories to a discourse of war as a way for nation-states to conduct foreign policy. This article shows that his theory goes much deeper and provides a critical perspective on the relationship between war and politics. The dialectical structure makes it possible to understand war as a historically variable but constantly policy-dependent phenomenon.
Keywords: Clausewitz, Foucault, Virilio, postmodernism, war and politics, power
Procedia PDF Downloads 71
4883 Teacher Professionalisation and Professionalism Discourses in Teacher Unions: A Case Study of New Zealand
Authors: Huidan Niu
Abstract:
Existing research has focused extensively on teachers’ professional experience in education reforms. However, there is a lack of research on the role and influence of teacher unions in education policy. This study aimed to examine how teacher unions frame teacher professionalisation and professionalism discourses. A critical education policy scholarship study was adopted. This study positioned teacher professionalisation and professionalism discourses within their socio-political contexts to explore how the meanings of teacher professionalisation and professionalism are constructed, as well as how teacher unions, as collective actors, shape these discourses. This study examined the development of professionalisation and professionalism discourses in the two main teacher unions in Aotearoa, New Zealand: the New Zealand Educational Institute, Te Riu Roa (NZEI), and the New Zealand Post-Primary Teachers’ Association, Te Wehengarua (PPTA). The data were collected from documents and archival material, as well as elite interviews. Twenty-four union leaders, including national presidents, secretaries, executives, and senior union officials, participated in the study. The data analysis followed a grounded theory method: from codes to themes. The findings of the study suggest that the teacher unions, as teachers’ collective (powerful) voices, appeared to highlight tension and confrontation between the teaching profession and governments with respect to the meanings of teacher professionalisation and professionalism.
Keywords: critical education policy scholarship, governments, teacher professionalisation, teacher professionalism, teacher unions
Procedia PDF Downloads 129
4882 The Role of Artificial Intelligence in Patent Claim Interpretation: Legal Challenges and Opportunities
Authors: Mandeep Saini
Abstract:
The rapid advancement of Artificial Intelligence (AI) is transforming various fields, including intellectual property law. This paper explores the emerging role of AI in interpreting patent claims, a critical and highly specialized area within intellectual property rights. Patent claims define the scope of legal protection granted to an invention, and their precise interpretation is crucial in determining the boundaries of the patent holder’s rights. Traditionally, this interpretation has relied heavily on the expertise of patent examiners, legal professionals, and judges. However, the increasing complexity of modern inventions, especially in fields like biotechnology, software, and electronics, poses significant challenges to human interpretation. Introducing AI into patent claim interpretation raises several legal and ethical concerns. This paper addresses critical issues such as the reliability of AI-driven interpretations, the potential for algorithmic bias, and the lack of transparency in AI decision-making processes. It considers the legal implications of relying on AI, particularly regarding accountability for errors and the potential challenges to AI interpretations in court. To provide a comprehensive analysis, the paper includes a comparative study of AI-driven patent claim interpretations versus human interpretations across different jurisdictions. This comparison highlights the variations in legal standards and practices, offering insights into how AI could impact the harmonization of international patent laws. The paper proposes policy recommendations for the responsible use of AI in patent law. It suggests legal frameworks that ensure AI tools complement, rather than replace, human expertise in patent claim interpretation. These recommendations aim to balance the benefits of AI with the need to maintain trust, transparency, and fairness in the legal process. By addressing these critical issues, this research contributes to the ongoing discourse on integrating AI into the legal field, specifically within intellectual property rights. It provides a forward-looking perspective on how AI could reshape patent law, offering both opportunities for innovation and challenges that must be carefully managed to protect the integrity of the legal system.
Keywords: artificial intelligence (AI), patent claim interpretation, intellectual property rights, algorithmic bias, natural language processing, patent law harmonization, legal ethics
Procedia PDF Downloads 21
4881 A Real Time Development Study for Automated Centralized Remote Monitoring System at Royal Belum Forest
Authors: Amri Yusoff, Shahrizuan Shafiril, Ashardi Abas, Norma Che Yusoff
Abstract:
Nowadays, illegal logging has been causing much damage to our forests. It can lead to flash floods, avalanches, global warming, etc. This understandably makes us wonder why, what, and who has made it happen. Often, it is already too late by the time we learn the cause. Even the Malaysian Royal Belum forest has not been spared from land clearing or illegal activity by the natives, although this area has been gazetted as a protected area preserved for future generations. Furthermore, because of its sizeable and wide area, these illegal activities are difficult to monitor and control. Critical action must be taken to prevent all of these unhealthy activities from recurring. Therefore, a remote monitoring device must be developed in order to capture critical real-time data such as temperature, humidity, gas, fire, and rain detection, which indicate the current state of the natural habitat to be preserved in the forest. In addition, the device location can be detected via GPS, giving the latitude and longitude of its current location, which is then transmitted by SMS via the GSM system. All of its readings are sent in real time for data management and analysis. The results will benefit the monitoring bodies or relevant authorities in keeping the forest in its natural habitat. Furthermore, this research gathers unified data, which is then analysed and compared with an existing method.
Keywords: remote monitoring system, forest data, GSM, GPS, wireless sensor
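A minimal sketch of the kind of monitoring loop described above is given below. The sensor, GPS, and GSM driver functions are hypothetical placeholders (a real deployment would use the specific hardware's libraries), and the thresholds, phone number, and coordinates are assumptions.

```python
# Sketch of a remote forest-monitoring loop: read sensors, check thresholds,
# and send an SMS alert with GPS coordinates over GSM. The read_* and send_sms
# functions are hypothetical stand-ins for real hardware drivers.
import time

TEMP_LIMIT_C = 45.0      # assumed fire-risk temperature threshold
GAS_LIMIT_PPM = 400.0    # assumed smoke/gas threshold

def read_environment():
    """Placeholder: would read temperature, humidity, gas, fire and rain sensors."""
    return {"temp_c": 27.3, "humidity": 81.0, "gas_ppm": 120.0,
            "fire": False, "rain": True}

def read_gps():
    """Placeholder: would query the GPS module for latitude/longitude."""
    return 5.5075, 101.3510   # illustrative coordinates near Royal Belum

def send_sms(number, text):
    """Placeholder: would push the message through the GSM modem."""
    print(f"SMS to {number}: {text}")

def monitor_once(alert_number="+60123456789"):   # hypothetical number
    data = read_environment()
    lat, lon = read_gps()
    if data["fire"] or data["temp_c"] > TEMP_LIMIT_C or data["gas_ppm"] > GAS_LIMIT_PPM:
        send_sms(alert_number, f"ALERT {lat:.4f},{lon:.4f} {data}")
    return data

if __name__ == "__main__":
    # A real deployment would loop indefinitely, e.g. every 10 minutes.
    for _ in range(3):
        monitor_once()
        time.sleep(1)
```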
Procedia PDF Downloads 417