3692 The Presence of Investor Overconfidence in the South African Exchange Traded Fund Market
Authors: Damien Kunjal, Faeezah Peerbhai
Abstract:
Despite the increasing popularity of exchange-traded funds (ETFs), ETF investment choices may not always be rational. Excess trading volume, misvaluations of securities, and excess return volatility in financial markets can be attributed to the influence of the overconfidence bias. Whilst previous research has explored the overconfidence bias in stock markets, this study focuses on trading in ETF markets. Therefore, the objective of this study is to investigate the presence of investor overconfidence in the South African ETF market. Using vector autoregressive models, the lead-lag relationship between market turnover and market return is examined for the market of South African ETFs tracking domestic benchmarks and for the market of South African ETFs tracking international benchmarks over the period November 2000 to August 2019. Consistent with the overconfidence hypothesis, a positive relationship between current market turnover and lagged market return is found for both markets, even after controlling for market volatility and cross-sectional dispersion. This relationship holds for both market and individual ETF turnover, suggesting that investors are overconfident when trading South African ETFs tracking domestic benchmarks and South African ETFs tracking international benchmarks, since trading activity depends on past market returns. Additionally, using the global recession as a structural break, this study finds that investor overconfidence is more pronounced after the global recession, suggesting that investors perceive ETFs as risk-reducing assets due to their diversification benefits. Overall, the results of this study indicate that the overconfidence bias has a significant influence on ETF investment choices, thereby suggesting that the South African ETF market is inefficient since investors’ decisions are based on their biases.
As a result, the effect of investor overconfidence can account for the difference between the fair value of ETFs and their current market prices. This finding has implications for policymakers whose responsibility is to promote the efficiency of the South African ETF market, as well as for investors and traders who trade in the South African ETF market.
Keywords: exchange-traded fund, market return, market turnover, overconfidence, trading activity
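The lead-lag test at the heart of the overconfidence hypothesis can be sketched as a regression of current turnover on lagged market return. The Python sketch below uses synthetic data and a single-lag specification; the variable names, coefficients, and sample size are illustrative assumptions, not the study's actual VAR estimates.

```python
# Minimal sketch of the lead-lag test for overconfidence: regress current
# market turnover on the previous period's market return. All series here
# are synthetic; a positive slope estimate is the pattern the overconfidence
# hypothesis predicts.
import random

random.seed(42)

T = 500
returns = [random.gauss(0.0, 1.0) for _ in range(T)]
# Construct turnover that responds positively to last period's return.
turnover = [0.0] + [0.5 * returns[t - 1] + random.gauss(0.0, 0.5)
                    for t in range(1, T)]

# OLS slope of turnover_t on return_{t-1} (a one-lag regression, i.e. the
# simplest special case of the lead-lag relationship examined in the study).
x = returns[:-1]
y = turnover[1:]
mx = sum(x) / len(x)
my = sum(y) / len(y)
beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
       sum((xi - mx) ** 2 for xi in x)
print(round(beta, 2))  # a positive estimate is consistent with overconfidence
```

In the study itself this relationship is estimated within a full vector autoregression with controls for volatility and cross-sectional dispersion; the sketch only isolates the sign logic of the test.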
Procedia PDF Downloads 167
3691 Using Knowledge Management and Visualisation Concepts to Improve Patients and Hospitals Staff Workflow
Authors: A. A. AlRasheed, A. Atkins, R. Campion
Abstract:
This paper focuses on using knowledge management and visualisation concepts to improve patients' and hospital employees' workflow. Hospital workflow is a complex and complicated process, and poor patient flow can put both patients and a hospital's reputation at risk and can threaten the facility's financial sustainability. Healthcare leaders are under increased pressure to reduce costs while maintaining or increasing patient care standards. In this paper, a framework is proposed to help improve patient experience, staff satisfaction, and operational efficiency across hospitals by using knowledge-management-based visualisation concepts. The framework uses real-time visibility to track and monitor the location and status of patients, staff, rooms, and medical equipment.
Keywords: knowledge management, improvements, visualisation, workflow
Procedia PDF Downloads 269
3690 Audit and Assurance Program for AI-Based Technologies
Authors: Beatrice Arthur
Abstract:
The rapid development of artificial intelligence (AI) has transformed various industries, enabling faster and more accurate decision-making processes. However, with these advancements come increased risks, including data privacy issues, systemic biases, and challenges related to transparency and accountability. As AI technologies become more integrated into business processes, there is a growing need for comprehensive auditing and assurance frameworks to manage these risks and ensure ethical use. This paper provides a literature review on AI auditing and assurance programs, highlighting the importance of adapting traditional audit methodologies to the complexities of AI-driven systems. Objective: The objective of this review is to explore current AI audit practices and their role in mitigating risks, ensuring accountability, and fostering trust in AI systems. The study aims to provide a structured framework for developing audit programs tailored to AI technologies while also investigating how AI impacts governance, risk management, and regulatory compliance in various sectors. Methodology: This research synthesizes findings from academic publications and industry reports from 2014 to 2024, focusing on the intersection of AI technologies and IT assurance practices. The study employs a qualitative review of existing audit methodologies and frameworks, particularly the COBIT 2019 framework, to understand how audit processes can be aligned with AI governance and compliance standards. The review also considers real-time auditing as an emerging necessity for influencing AI system design during early development stages. Outcomes: Preliminary findings indicate that while AI auditing is still in its infancy, it is rapidly gaining traction as both a risk management strategy and a potential driver of business innovation. Auditors are increasingly being called upon to develop controls that address the ethical and operational risks posed by AI systems. 
The study highlights the need for continuous monitoring and adaptable audit techniques to handle the dynamic nature of AI technologies. Future Directions: Future research will explore the development of AI-specific audit tools and real-time auditing capabilities that can keep pace with evolving technologies. There is also a need for cross-industry collaboration to establish universal standards for AI auditing, particularly in high-risk sectors like healthcare and finance. Further work will involve engaging with industry practitioners and policymakers to refine the proposed governance and audit frameworks. Funding/Support Acknowledgements: This research is supported by the Information Systems Assurance Management Program at Concordia University of Edmonton.
Keywords: AI auditing, assurance, risk management, governance, COBIT 2019, transparency, accountability, machine learning, compliance
Procedia PDF Downloads 26
3689 Queueing Modeling of M/G/1 Fault Tolerant System with Threshold Recovery and Imperfect Coverage
Authors: Madhu Jain, Rakesh Kumar Meena
Abstract:
This paper investigates a finite M/G/1 fault tolerant multi-component machining system. The system incorporates features such as standby support, threshold recovery, and imperfect coverage, which make the study closer to real-world systems. The performance prediction of the M/G/1 fault tolerant system is carried out using a recursive approach, treating the remaining service time as a supplementary variable. Numerical results are presented to illustrate the computational tractability of the analytical results for three different service time distributions, viz. exponential, 3-stage Erlang, and deterministic. Moreover, a cost function is constructed to determine the optimal choice of system descriptors for upgrading the system.
Keywords: fault tolerant, machine repair, threshold recovery policy, imperfect coverage, supplementary variable technique
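For intuition on why the service-time distribution matters, the classical Pollaczek-Khinchine formula for a plain infinite-source M/G/1 queue can be sketched as below. This is not the paper's finite fault-tolerant model with standbys and imperfect coverage; it only illustrates how the three distributions named above (exponential, 3-stage Erlang, deterministic) yield different mean waiting times at the same mean service rate.

```python
# Hedged sketch: mean waiting time in a plain M/G/1 queue via the
# Pollaczek-Khinchine formula, W_q = lam * E[S^2] / (2 * (1 - rho)).
# Lower service-time variability (deterministic < Erlang-3 < exponential)
# gives a shorter queue at the same utilisation.

def mg1_wq(lam, mean_s, second_moment_s):
    rho = lam * mean_s
    assert rho < 1.0, "queue must be stable"
    return lam * second_moment_s / (2.0 * (1.0 - rho))

lam, mu = 0.8, 1.0          # arrival rate; service rate (mean service 1/mu)
mean_s = 1.0 / mu
cases = {
    "exponential":   2.0 / mu**2,                # E[S^2] for exp(mu)
    "erlang-3":      (1.0 + 1.0 / 3.0) / mu**2,  # E[S^2] = (k+1)/(k*mu^2), k=3
    "deterministic": 1.0 / mu**2,                # no variability: E[S^2] = E[S]^2
}
wq = {name: mg1_wq(lam, mean_s, m2) for name, m2 in cases.items()}
for name, w in wq.items():
    print(name, round(w, 3))
```

The paper's recursive supplementary-variable analysis generalises this idea to a finite-source system with failures and recovery, where no such closed form exists.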
Procedia PDF Downloads 293
3688 Experimental Investigation of the Impact of Biosurfactants on Residual-Oil Recovery
Authors: S. V. Ukwungwu, A. J. Abbas, G. G. Nasr
Abstract:
The high prices of natural gas and oil, with the attendant increase in energy demand on world markets in recent years, have stimulated interest in recovering residual oil saturation across the globe. In order to meet energy security needs, efforts have been made to develop new technologies for enhancing the recovery of oil and gas, utilizing techniques like CO2 flooding, water injection, hydraulic fracturing, surfactant flooding, etc. Surfactant flooding optimizes production but poses a risk to the environment due to the toxic nature of the surfactants. Building on proven work that has utilized various bacteria to produce biosurfactants for enhancing oil recovery, this research uses a technique combining biosurfactants to achieve enhanced oil recovery (EOR) through lowering interfacial tension and contact angle. In this study, three biosurfactants were produced from three Bacillus species from freeze-dried cultures using sucrose 3 % (w/v) as their carbon source. Two of the produced biosurfactants were screened with the TEMCO Pendant Drop Image Analysis system for reduction in interfacial tension (IFT) and contact angle. Interfacial tension was greatly reduced, from 56.95 mN.m-1 to 1.41 mN.m-1, when biosurfactants in cell-free culture (Bacillus licheniformis) were used, compared to 4.83 mN.m-1 for the cell-free culture of Bacillus subtilis. As a result, the cell-free culture of Bacillus licheniformis changed the wettability in the contact angle measurement to more water-wet, as the angle decreased from 130.75° to 65.17°. The influence of microbial treatment on crushed rock samples was also observed by qualitative wettability experiments. Samples treated with biosurfactants remained in the aqueous phase, indicating a water-wet system. These results show that biosurfactants can effectively change the wetting conditions of diverse surfaces, providing a desirable condition for efficient oil transport and in this way serving as a mechanism for EOR.
The environmentally friendly nature of biosurfactants gives them important advantages over chemically synthesized surfactants for industrial applications, including their varied possible structures, low toxicity, and biodegradability.
Keywords: bacillus, biosurfactant, enhanced oil recovery, residual oil, wettability
Procedia PDF Downloads 281
3687 Micro-Electrical Discharge Machining (µEDM): Effect of the Electrochemical Etching Parameters on the Fabrication of Cylindrical Tungsten Micro-Tools
Authors: Asmae Tafraouti, Yasmina Layouni
Abstract:
The fabrication of cylindrical tungsten micro-tools with a high aspect ratio is a real challenge because of several constraints that come into play during their manufacture. In this paper, we describe the process used to fabricate these micro-tools, which consists of electrochemical etching. We also present the optimal protocol that makes it possible to fabricate micro-tools with a high aspect ratio in a reproducible way. Next, we show the limits of the experimental parameters chosen to manufacture micro-tools from a wire with an initial diameter of Φ_0 = 250 µm. The protocol used yields an average diameter of Φ = 88 µm ± 1 µm over a length of L = 3.5 mm.
Keywords: drop-off effect, electrochemical etching, micro-electrical discharge machining, tungsten micro-tools
Procedia PDF Downloads 192
3686 Characters of Developing Commercial Employment Sub-Centres and Employment Density in Ahmedabad City
Authors: Bhaumik Patel, Amit Gotecha
Abstract:
Commercial centres of different hierarchies and sizes play a vital role in the growth and development of a city. Economic uncertainty and demand for space lead to more urban sprawl and the emergence of more commercial spaces. The study focused on understanding various indicators affecting commercial development, such as accessibility, infrastructure, planning and development regulations, and market forces, which can help solve many issues related to commercial urban development and guide future employment growth centre development. The aim of the study was to review the characteristics and identify the employment density of commercial employment sub-centres, with three objectives: understanding various employment sub-centres, identifying characteristics and deriving the behaviour of employment densities, and evaluating and comparing employment sub-centres for the city of Ahmedabad. Three commercial employment sub-centres, one in the old city (Kalupur), a second in a highly developed commercial area (C.G. Road-Ashram Road), and a third in the latest developing commercial area (Prahladnagar), were identified by distance from the city centre, land use diversity, access to major roads and public transport, population density in proximity, complementary land uses in proximity, and land price. Commercial activities were categorised into retail, wholesale, and service sectors and sub-categorised into various activities. From the study, the time period of establishment of the unit is a critical parameter for commercial activity, building height, and land-use diversity. Employment diversity is also one parameter for the commercial centre. The old city has retail, wholesale and trading activities and a higher commercial density with respect to both units and employment. The Prahladnagar area functioned as commercial due to market pressure and developed more units than required.
Employment density is higher in the centre of the city; as distance from the city centre increases, employment density and unit density decrease. Characteristics influencing employment density and unit density are distance from the city centre, development type, establishment time period, building density, unit density, public transport accessibility, and road connectivity.
Keywords: commercial employment sub-centres, employment density, employment diversity, unit density
Procedia PDF Downloads 144
3685 Analysis and Simulation of TM Fields in Waveguides with Arbitrary Cross-Section Shapes by Means of Evolutionary Equations of Time-Domain Electromagnetic Theory
Authors: Ömer Aktaş, Olga A. Suvorova, Oleg Tretyakov
Abstract:
The boundary value problem on non-canonical, arbitrarily shaped contours is solved with a numerically effective method called the Analytical Regularization Method (ARM) to calculate propagation parameters. As a result of regularization, the equation of the first kind is reduced to an infinite system of linear algebraic equations of the second kind in the space L2. This system can be solved numerically to any desired accuracy by using the truncation method. Parameters such as the cut-off wavenumber and cut-off frequency are then used in the waveguide evolutionary equations of time-domain electromagnetic theory to illustrate the real-valued TM fields in lossy and lossless media.
Keywords: analytical regularization method, evolutionary equations of time-domain electromagnetic theory, TM field
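The truncation method mentioned above can be illustrated on a toy second-kind system (I + K)x = b: finite sections of increasing size are solved, and the solution components stabilise as the section grows. The rapidly decaying kernel and right-hand side below are invented for illustration, not the waveguide operator of the paper.

```python
# Toy illustration of the truncation method for a second-kind system
# (I + K) x = b with a made-up, rapidly decaying kernel: solving larger
# and larger finite sections gives converging solution components.

def solve(A, b):
    # Gaussian elimination with partial pivoting on a dense system.
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def truncated_section(N):
    # (I + K) with K_ij = 2^-(i+j+2) and b_i = 1/(i+1), truncated to N terms.
    A = [[(1.0 if i == j else 0.0) + 2.0 ** -(i + j + 2) for j in range(N)]
         for i in range(N)]
    b = [1.0 / (i + 1) for i in range(N)]
    return solve(A, b)

x4, x8 = truncated_section(4), truncated_section(8)
print(abs(x4[0] - x8[0]))  # difference shrinks as the section grows
```

Because the kernel decays quickly, the leading component barely changes when the truncation order is doubled, which is exactly the behaviour the truncation method relies on.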
Procedia PDF Downloads 501
3684 OILU Tag: A Projective Invariant Fiducial System
Authors: Youssef Chahir, Messaoud Mostefai, Salah Khodja
Abstract:
This paper presents the development of a 2D visual marker derived from recent patented work in the field of numbering systems. The proposed fiducial uses a group of projective invariant straight-line patterns that are easily detectable and remotely recognizable. Based on an efficient data coding scheme, the developed marker enables producing a large panel of unique real-time identifiers with highly distinguishable patterns. The proposed marker incorporates decimal and binary information simultaneously, making it readable by both humans and machines. This important feature opens up new opportunities for the development of efficient visual human-machine communication and monitoring protocols. Extensive experimental tests validate the robustness of the marker against acquisition and geometric distortions.
Keywords: visual markers, projective invariants, distance map, level sets
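The projective invariance underlying such markers can be illustrated with the cross-ratio of four collinear points, which any homography preserves. The specific line patterns and coding scheme of the OILU tag are not described here, so the points and homography below are arbitrary assumptions chosen only to demonstrate the invariant.

```python
# Sketch of the projective invariant this class of fiducials relies on:
# the cross-ratio of four collinear points is unchanged by any homography,
# which is why the pattern can be recognised under perspective distortion.

def cross_ratio(a, b, c, d):
    # Points given as scalar positions along a line.
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

def projective_map(x, m):
    # 1-D homography x -> (m[0]*x + m[1]) / (m[2]*x + m[3]).
    return (m[0] * x + m[1]) / (m[2] * x + m[3])

pts = [0.0, 1.0, 3.0, 7.0]
h = (2.0, 1.0, 0.5, 3.0)          # an arbitrary invertible homography
mapped = [projective_map(x, h) for x in pts]

before = cross_ratio(*pts)
after = cross_ratio(*mapped)
print(round(before, 6), round(after, 6))  # identical up to rounding
```

A camera viewing the marker at an angle applies exactly such a homography to each pattern line, so a decoder that reads cross-ratios sees the same values regardless of viewpoint.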
Procedia PDF Downloads 164
3683 Singularization: A Technique for Protecting Neural Networks
Authors: Robert Poenaru, Mihail Pleşa
Abstract:
In this work, a solution that addresses the protection of pre-trained neural networks is developed: singularization. This method involves applying permutations to the weight matrices of a pre-trained model, introducing a form of structured noise that obscures the original model’s architecture. These permutations make it difficult for an attacker to reconstruct the original model, even if the permuted weights are obtained. Experimental benchmarks indicate that applying singularization has a profound impact on model performance, often degrading it to the point where retraining from scratch becomes necessary to recover functionality; this makes it particularly effective for securing intellectual property in neural networks. Moreover, unlike other approaches, singularization is lightweight and computationally efficient, which makes it well suited for resource-constrained environments. Our experiments also demonstrate that this technique performs efficiently in various image classification tasks, highlighting its broad applicability and practicality in real-world scenarios.
Keywords: machine learning, ANE, CNN, security
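A minimal sketch of the core idea, permuting a weight matrix so outputs are scrambled unless the permutation key is known, might look as follows. The single linear layer and the row-permutation scheme are simplifying assumptions for illustration; the paper's actual construction and choice of layers may differ.

```python
# Toy sketch of weight permutation: permuting the rows of a weight matrix
# leaves the raw numbers intact but scrambles the model's outputs. The
# "model" here is one linear layer, standing in for a full network.
import random

random.seed(0)

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

W = [[random.gauss(0, 1) for _ in range(4)] for _ in range(4)]
x = [1.0, -2.0, 0.5, 3.0]

y_original = matvec(W, x)

perm = [2, 0, 3, 1]                  # the secret permutation (the "key")
W_perm = [W[i] for i in perm]        # permuted weights, as distributed
y_permuted = matvec(W_perm, x)

# Outputs diverge unless the attacker recovers the permutation.
print(y_original != y_permuted)

# Knowing the key, the owner can undo the permutation exactly.
W_restored = [None] * 4
for new_pos, old_pos in enumerate(perm):
    W_restored[old_pos] = W_perm[new_pos]
print(W_restored == W)
```

The asymmetry is the point: inversion is trivial with the key, while an attacker faces a combinatorial search over permutations that grows factorially with layer width.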
Procedia PDF Downloads 18
3682 LaPEA: Language for Preprocessing of Edge Applications in Smart Factory
Authors: Masaki Sakai, Tsuyoshi Nakajima, Kazuya Takahashi
Abstract:
In order to improve the productivity of a factory, it is common to create an inference model by collecting and analyzing operational data off-line and then to develop an edge application (EAP) that evaluates the quality of the products or diagnoses machine faults in real time. To accelerate this development cycle, an edge application framework for the smart factory is proposed, which enables developers to create and modify EAPs based on prepared inference models. In the framework, the preprocessing component is the key part to making it work. This paper proposes a language for the preprocessing of edge applications, called LaPEA, which can flexibly process several streams of sensor data from machines into explanatory variables for an inference model, and proves that it meets the requirements for the preprocessing.
Keywords: edge application framework, edgecross, preprocessing language, smart factory
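As an illustration of the kind of preprocessing such a language targets, the sketch below turns a raw sensor stream into two explanatory variables (a moving average and a rolling range). LaPEA's actual syntax is not shown in the abstract, so plain Python with an invented signal name stands in for the concept.

```python
# Hedged sketch: converting a raw machine-sensor stream into explanatory
# variables for an inference model. The sensor name and window size are
# invented; LaPEA would express such transformations declaratively.

def moving_average(xs, window):
    return [sum(xs[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(xs))]

def window_range(xs, window):
    return [max(xs[i - window + 1:i + 1]) - min(xs[i - window + 1:i + 1])
            for i in range(window - 1, len(xs))]

raw = [20.1, 20.4, 21.0, 22.5, 22.4, 21.9, 21.5]   # e.g. spindle temperature
features = {
    "temp_ma3": moving_average(raw, 3),    # smoothed level
    "temp_rng3": window_range(raw, 3),     # short-term variability
}
print(features["temp_ma3"][0])   # first 3-sample average
```

A preprocessing language earns its keep by letting such feature definitions be edited without touching the edge application's code, which is the development-cycle speed-up the paper aims at.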
Procedia PDF Downloads 149
3681 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise from processing high temporal-resolution continuous GBSAR data, including the extreme cost of computational random-access memory (RAM), the delay of displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study in order to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to only one unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected because the chain keeps temporarily coherent pixels, which are present only in certain units rather than throughout the whole observation period. The chain supports real-time processing of continuous data, and the delay in creating displacement maps can be shortened because there is no need to wait for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on a stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence.
The temporally averaged images are then processed by a particular interferometry procedure integrated with advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and selection of partially coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the sub-millimetre level are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring across a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
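The temporal-averaging step can be sketched as follows: averaging a stack of noisy complex pixel values sharpens the phase estimate on which the interferometric displacement measurement rests. The true phase and noise level below are arbitrary assumptions for illustration, not values from the study.

```python
# Sketch of temporal averaging for one SAR pixel: the coherent average of
# a stack of noisy complex samples gives a far better phase estimate than
# any single acquisition, improving the signal-to-noise ratio.
import cmath
import random

random.seed(1)

true_phase = 0.8                      # radians; the signal to recover

def noisy_sample():
    # Unit phasor at the true phase plus complex Gaussian noise.
    return cmath.exp(1j * true_phase) + complex(random.gauss(0, 0.5),
                                                random.gauss(0, 0.5))

stack = [noisy_sample() for _ in range(100)]

single_err = abs(cmath.phase(stack[0]) - true_phase)
averaged = sum(stack) / len(stack)
avg_err = abs(cmath.phase(averaged) - true_phase)

# The averaged estimate is typically far closer to the true phase.
print(round(single_err, 3), round(avg_err, 3))
```

In the package this averaging is applied per campaign before interferogram formation, which is why coherence loss between discontinuous campaigns is reduced.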
Procedia PDF Downloads 163
3680 Biosensor: An Approach towards Sustainable Environment
Authors: Purnima Dhall, Rita Kumar
Abstract:
Introduction: River Yamuna flows through the national capital territory (NCT) and is the primary source of drinking water for the city. Delhi discharges about 3,684 MLD of sewage into the Yamuna through its 18 drains. Water quality monitoring is an important aspect of water management with respect to pollution control. Public concern and legislation now demand better environmental control. The conventional method for estimating BOD5 has various drawbacks: it is expensive, time-consuming, and requires the use of highly trained personnel. Stringent forthcoming regulations on wastewater have necessitated the development of analytical systems that contribute to greater process efficiency. Biosensors offer the possibility of real-time analysis. Methodology: In the present study, a novel rapid method for the determination of biochemical oxygen demand (BOD) has been developed. Using the developed method, the BOD of a sample can be determined within 2 hours, as compared to 3-5 days with the standard BOD3-5day assay. Moreover, the test is based on specified consortia instead of undefined seeding material; therefore, it minimizes the variability among results. The device is coupled to software which automatically calculates the dilution required, so prior dilution of the sample is not needed before BOD estimation. The developed BOD biosensor makes use of immobilized microorganisms to sense the biochemical oxygen demand of industrial wastewaters of low, moderate, or high biodegradability. The method is quick, robust, online, and less time-consuming. Findings: The results of extensive testing of the developed biosensor on drains demonstrate that the BOD values obtained by the device correlate with conventional BOD values; the observed R² value was 0.995. The reproducibility of the measurements with the BOD biosensor was within a percentage deviation of ±10%.
Advantages of the developed BOD biosensor:
• Determines water pollution quickly, within 2 hours;
• Determines the water pollution of all types of waste water;
• Has a prolonged shelf life of more than 400 days;
• Enhanced repeatability and reproducibility values;
• Eliminates the need for COD estimation.
Distinctiveness of the technology:
• Bio-component: can determine the BOD load of all types of waste water;
• Immobilization: increased shelf life > 400 days, extended stability and viability;
• Software: reduces manual errors and estimation time.
Conclusion: The BOD biosensor can be used to measure the BOD value of real wastewater samples and showed good reproducibility in the results. This technology is useful in deciding treatment strategies well ahead, thereby facilitating the discharge of properly treated water to common water bodies. The developed technology has been transferred to M/s Forbes Marshall Pvt Ltd, Pune.
Keywords: biosensor, biochemical oxygen demand, immobilized, monitoring, Yamuna
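The kind of validation reported above, correlating biosensor readings against conventional BOD values, can be sketched with a plain R² computation. The sample values below are invented to mimic a correlation near the reported 0.995; they are not the study's data.

```python
# Sketch of the validation step: compute R^2 between biosensor BOD
# readings and conventional-assay BOD values. Paired values are invented
# for illustration only.

def r_squared(x, y):
    # R^2 of the least-squares line through (x, y): r^2 = Sxy^2/(Sxx*Syy).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy * sxy / (sxx * syy)

conventional = [30.0, 55.0, 80.0, 120.0, 150.0, 210.0]   # mg/L, 3-5 day assay
biosensor    = [32.0, 53.0, 83.0, 118.0, 155.0, 206.0]   # mg/L, 2-hour device
print(round(r_squared(conventional, biosensor), 3))
```

An R² close to 1 over a wide BOD range is what justifies replacing the multi-day assay with the 2-hour device in routine monitoring.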
Procedia PDF Downloads 279
3679 An Ontology for Semantic Enrichment of RFID Systems
Authors: Haitham S. Hamza, Mohamed Maher, Shourok Alaa, Aya Khattab, Hadeal Ismail, Kamilia Hosny
Abstract:
Radio Frequency Identification (RFID) has become a key technology in the emerging concept of the Internet of Things (IoT). Naturally, business applications require the deployment of various RFID systems that are developed by different vendors and use various data formats. This heterogeneity poses a real challenge in developing large-scale IoT systems with RFID, as integration becomes very complex and challenging. Semantic integration is a key approach to dealing with this challenge. To do so, an ontology for RFID systems needs to be developed in order to semantically annotate RFID systems and hence facilitate their integration. Accordingly, in this paper, we propose an ontology for RFID systems. The proposed ontology can be used to semantically enrich RFID systems and hence improve their usage and reasoning. The usage of the proposed ontology is explained through a simple scenario in the healthcare domain.
Keywords: RFID, semantic technology, ontology, SPARQL query language, heterogeneity
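A toy sketch of what semantic annotation buys: readings from two hypothetical vendors with different field names are mapped onto one shared vocabulary, after which a single query works across both. The tag values, field names, and property names below are invented stand-ins; the paper's actual ontology terms are not reproduced here.

```python
# Toy illustration of semantic enrichment for heterogeneous RFID data:
# vendor-specific field names are rewritten into shared vocabulary terms,
# so one query covers readers from both vendors.

VENDOR_A = {"tagID": "E200-1", "readTime": "2024-01-05T10:00", "loc": "ward-3"}
VENDOR_B = {"epc": "E200-2", "timestamp": "2024-01-05T10:02",
            "antenna_zone": "ward-3"}

MAPPINGS = {
    "vendor_a": {"tagID": "hasTag", "readTime": "observedAt",
                 "loc": "atLocation"},
    "vendor_b": {"epc": "hasTag", "timestamp": "observedAt",
                 "antenna_zone": "atLocation"},
}

def annotate(reading, vendor):
    """Rewrite vendor-specific keys into the shared vocabulary terms."""
    return {MAPPINGS[vendor][k]: v for k, v in reading.items()}

events = [annotate(VENDOR_A, "vendor_a"), annotate(VENDOR_B, "vendor_b")]

# One uniform query over the heterogeneous sources:
in_ward_3 = [e["hasTag"] for e in events if e["atLocation"] == "ward-3"]
print(in_ward_3)
```

In the paper this role is played by an OWL-style ontology queried with SPARQL; the dictionary mapping above only mimics the integration effect in plain Python.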
Procedia PDF Downloads 471
3678 A Text Classification Approach Based on Natural Language Processing and Machine Learning Techniques
Authors: Rim Messaoudi, Nogaye-Gueye Gning, François Azelart
Abstract:
Automatic text classification mostly applies natural language processing (NLP) and other AI-guided techniques to classify text automatically in a faster and more accurate manner. This paper discusses the use of predictive maintenance to manage incident tickets within an organization. It focuses on proposing a tool that treats and analyses comments and notes written by administrators after resolving an incident ticket, with the goal of increasing the quality of these comments. Additionally, the tool is based on NLP and machine learning techniques to realize textual analytics of the extracted data. The approach was tested using real data taken from the French National Railways (SNCF) company and yielded high-quality results.
Keywords: machine learning, text classification, NLP techniques, semantic representation
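A minimal sketch of the kind of classifier involved, a bag-of-words Naive Bayes with add-one smoothing over ticket comments, is shown below. The training comments and labels are invented; the paper's actual pipeline and the SNCF data are not reproduced.

```python
# Minimal Naive Bayes text classifier over incident-ticket comments:
# bag-of-words counts per class, add-one (Laplace) smoothing, and a
# log-probability argmax at prediction time.
import math
from collections import Counter, defaultdict

train = [
    ("rebooted the server and service restored", "resolved"),
    ("issue fixed after patch deployment", "resolved"),
    ("ticket escalated to network team", "escalated"),
    ("forwarded to supplier still waiting", "escalated"),
]

# Count word frequencies per class.
class_docs = defaultdict(int)
word_counts = defaultdict(Counter)
vocab = set()
for text, label in train:
    class_docs[label] += 1
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def predict(text):
    best_label, best_score = None, -math.inf
    for label in class_docs:
        # log prior + sum of smoothed log likelihoods
        total = sum(word_counts[label].values())
        score = math.log(class_docs[label] / len(train))
        for w in text.split():
            score += math.log((word_counts[label][w] + 1)
                              / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(predict("service restored after reboot"))
```

A production system would add tokenisation, normalisation, and a richer model, but the log-probability structure above is the core of most bag-of-words classifiers.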
Procedia PDF Downloads 105
3677 Printed Electronics for Enhanced Monitoring of Organ-on-Chip Culture Media Parameters
Authors: Alejandra Ben-Aissa, Martina Moreno, Luciano Sappia, Paul Lacharmoise, Ana Moya
Abstract:
Organ-on-Chip (OoC) stands out as a highly promising approach for drug testing, presenting a cost-effective and ethically superior alternative to conventional in vivo experiments. These cutting-edge devices emerge from the integration of tissue engineering and microfluidic technology, faithfully replicating the physiological conditions of targeted organs. Consequently, they offer a more precise understanding of drug responses without the ethical concerns associated with animal testing. In addressing the limitations that conventional, time-consuming techniques impose on OoC, Lab-on-Chip (LoC) systems emerge as a disruptive technology capable of providing real-time monitoring without compromising sample integrity. This work develops LoC platforms that can be integrated with OoC platforms to monitor essential culture media parameters, including glucose, oxygen, and pH, facilitating the straightforward exchange of sensing units within a dynamic and controlled environment without disrupting cultures. This approach preserves the experimental setup, minimizes the impact on cells, and enables efficient, prolonged measurement. The LoC system is fabricated following the patented methodology protected by EU patent EP4317957A1. One of the key challenges, integrating sensors in a biocompatible, feasible, robust, and scalable manner, is addressed through fully printed sensors, ensuring a customized, cost-effective, and scalable solution. With this technique, sensor reliability is enhanced, providing high sensitivity and selectivity for accurate parameter monitoring. In the present study, the LoC is validated by measuring a complete culture medium. The oxygen sensor provided a measurement range from 0 mgO2/L to 6.3 mgO2/L. The pH sensor demonstrated a measurement range spanning 2 pH units to 9.5 pH units. Additionally, the glucose sensor achieved a measurement range from 0 mM to 11 mM. All measurements were performed with the sensors integrated in the LoC.
In conclusion, this study showcases the impactful synergy of OoC technology with LoC systems using fully printed sensors, marking a significant step forward in ethical and effective biomedical research, particularly in drug development. This innovation not only meets current demands but also lays the groundwork for future advancements in precision and customization within scientific exploration. Acknowledgments: This work was financially supported by the Catalan Government through the funding grant ACCIÓ-Eurecat (Project Traça-IMPULSENS).
Keywords: organ on chip, lab on chip, real time monitoring, biosensors
Procedia PDF Downloads 24
3676 Firm's Growth Leading Dimensions of Blockchain Empowered Information Management System: An Empirical Study
Authors: Umang Varshney, Amit Karamchandani, Rohit Kapoor
Abstract:
Practitioners and researchers have realized that blockchain is not limited to currency. Blockchain, as a distributed ledger, can ensure a transparent and traceable supply chain. Due to blockchain-enabled IoTs, a firm's information management system can now take inputs from other supply chain partners in real time. This study aims to provide empirical evidence of the dimensions responsible for the growth of firms that have implemented blockchain and to highlight how the sector (manufacturing or service), the state's regulatory environment, and the choice of blockchain network affect the blockchain's usefulness. This post-adoption study seeks to validate the findings of pre-adoption studies done on blockchain. Data will be collected through a survey of managers working in firms that have implemented blockchain and analyzed through PLS-SEM.
Keywords: blockchain, information management system, PLS-SEM, firm's growth
Procedia PDF Downloads 127
3675 Modern Imputation Technique for Missing Data in Linear Functional Relationship Model
Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, Rahmatullah Imon
Abstract:
The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, using three performance indicators: the mean absolute error (MAE), the root mean square error (RMSE), and the estimated bias (EB). In this study, we applied these methods to impute missing values in the LFRM. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.
Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators
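The flavour of EM imputation can be sketched on a toy line-fit problem: iterate between fitting on the currently filled-in data (M-step) and replacing missing responses with their expected values under the fitted line (E-step). This is a simplified stand-in for the paper's LFRM-specific EM and EMB procedures, and the data are invented.

```python
# Hedged sketch of EM-style imputation for y = a + b*x with missing y's:
# alternate (M-step) refit the line on the filled-in data and (E-step)
# re-impute the missing y's with their expected values on that line.

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 4.0, None, 7.9, None, 12.2]        # None marks missing values

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys)) / \
        sum((xi - mx) ** 2 for xi in xs)
    return my - b * mx, b                     # intercept, slope

# Start from mean imputation, then iterate E and M steps.
observed = [yi for yi in y if yi is not None]
filled = [yi if yi is not None else sum(observed) / len(observed) for yi in y]
for _ in range(50):
    a, b = fit_line(x, filled)                        # M-step: re-fit
    filled = [yi if yi is not None else a + b * xi    # E-step: re-impute
              for xi, yi in zip(x, y)]

print(round(filled[2], 2), round(filled[4], 2))
```

Indicators like MAE and RMSE would then compare these imputed values against the true ones in a simulation, which is how the paper ranks EM against EMB.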
Procedia PDF Downloads 401
3674 The Impact of Artificial Intelligence on Legislations and Laws
Authors: Keroles Akram Saed Ghatas
Abstract:
The near future will bring significant changes in modern organizations and management due to the growing role of intangible assets and knowledge workers. The area of copyright, intellectual property, digital (intangible) assets, and media redistribution appears to be one of the greatest challenges facing business and society in general, and management sciences and organizations in particular. The proposed article examines the views and perceptions of fairness in digital media sharing among Harvard Law School's LL.M. students, based on 50 qualitative interviews and 100 surveys. The researcher took an ethnographic approach and entered the Harvard LL.M. community in 2016 through a Facebook group that allows people to connect naturally and attend in-person and private events more easily. After listening to numerous students, the researcher conducted a quantitative survey among 100 respondents to assess their perceptions of fairness in digital file sharing in various contexts (based on the price of media, its availability, regional licenses, copyright holder status, etc.). Based on the survey results, the researcher conducted long-term, open-ended, and loosely structured ethnographic interviews (50 interviews) to further deepen the understanding of the results. The most important finding of the study is that Harvard lawyers generally support digital piracy in certain contexts, despite having the best possible legal and professional knowledge. Interestingly, they are also more accepting of working for the government than the private sector.
The results of this study provide a better understanding of how “fairness” is perceived by the younger generation of lawyers and pave the way for a more rational application of licensing laws.
Keywords: piracy, digital sharing, perception of fairness, legal profession
Procedia PDF Downloads 66
3673 The Role of Climate-Smart Agriculture in the Contribution of Small-Scale Farming towards Ensuring Food Security in South Africa
Authors: Victor O. Abegunde, Melusi Sibanda
Abstract:
There is a need for a great deal of attention to small-scale agriculture for livelihood and food security because of the expanding global population. Small-scale agriculture has been identified as a major driving force of agricultural and rural development. However, the sector's high dependence on natural and climatic resources has made small-scale farmers highly vulnerable to the adverse impacts of climate change, necessitating the embrace of practices and concepts that help absorb shocks from changes in climatic conditions. This study examines the strategic position of small-scale farming in South African agriculture and in ensuring food security in the country, the vulnerability of small-scale agriculture to climate change, and the potential of the concept of climate-smart agriculture to tackle the challenge of climate change. The study carried out a systematic review of peer-reviewed literature on small-scale agriculture, climate change, food security, and climate-smart agriculture, employing the realist review method. Findings revealed that increased productivity in the small-scale agricultural sector has great potential to improve the food security of households in South Africa and to reduce dependence on food purchases in a context of high food price inflation. Findings, however, also revealed that climate change affects small-scale subsistence farmers in terms of productivity, food security, and family income, with the impact on smallholder livelihoods falling into three major groups: biological processes, environmental and physical processes, and impacts on health. Analysis of the literature consistently showed that climate-smart agriculture integrates the benefits of adaptation and resilience to climate change, mitigation, and food security. As a result, farming households adopting climate-smart agriculture will be better off than their counterparts that do not. 
This study concludes that climate-smart agriculture could serve as a bridge linking the small-scale agricultural sector with agricultural productivity and development, and could thereby bring about the much-needed food security.
Keywords: climate change, climate-smart agriculture, food security, small-scale
Procedia PDF Downloads 242
3672 Trace Logo: A Notation for Representing Control-Flow of Operational Process
Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa
Abstract:
The process mining research discipline bridges the gap between data mining and business process modeling and analysis; it offers process-centric, end-to-end methods and techniques for analyzing information about real-world processes recorded in operational event logs. In this paper, we propose a notation called the trace logo for graphically representing the control-flow perspective (the order of execution of activities) of a process. A trace logo consists of a stack of activity names at each position: the size of each activity name indicates its frequency in the traces, and the total height of the stack depicts the information content of the position. A trace logo is created from a set of aligned traces generated using the Multiple Trace Alignment technique.
Keywords: consensus trace, process mining, multiple trace alignment, trace logo
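The per-position stacks can be computed as in sequence logos. The sketch below (with a made-up set of aligned traces where each character is an activity and '-' marks an alignment gap; these inputs are assumptions for illustration only) derives each position's activity frequencies and its information content as log2 of the alphabet size minus the column entropy.

```python
import math
from collections import Counter

# Hypothetical aligned traces, e.g. output of Multiple Trace Alignment;
# each character is one activity, '-' is an alignment gap.
traces = ["abcd", "abed", "abcd", "a-cd"]
alphabet = {c for t in traces for c in t if c != "-"}

def logo_column(traces, pos):
    """Activity frequencies and information content at one position."""
    counts = Counter(t[pos] for t in traces if t[pos] != "-")
    total = sum(counts.values())
    freqs = {a: c / total for a, c in counts.items()}
    entropy = -sum(f * math.log2(f) for f in freqs.values())
    info = math.log2(len(alphabet)) - entropy  # total stack height
    return freqs, info

freqs, info = logo_column(traces, 2)  # position 3: 'c' x3, 'e' x1
```

Each activity's letter would then be drawn with height proportional to its frequency times the column's information content, exactly as in biological sequence logos.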
Procedia PDF Downloads 351
3671 Arithmetic Operations in Deterministic P Systems Based on the Weak Rule Priority
Authors: Chinedu Peter, Dashrath Singh
Abstract:
Membrane computing is a computability model that abstracts its structures and functions from the biological cell. The main ingredient of membrane computing is the notion of a membrane structure, which consists of several cell-like membranes recurrently placed inside a unique skin membrane. The emergence of several variants of membrane computing gives rise to the notion of a P system. The paper presents a variant of P systems for arithmetic operations on non-negative integers based on weak priorities for rule application. Consequently, we obtain deterministic P systems. Two membranes suffice. There are at most four objects for multiplication and five objects for division throughout the computation processes. The model is simple and can potentially be extended to negative integers and real numbers in general.
Keywords: P system, binary operation, determinism, weak rule priority
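To make the weak-priority mechanism concrete, here is a toy simulation (an illustrative assumption, not the paper's system): a single membrane holds a multiset of objects, rules are listed in priority order, and within one step a lower-priority rule fires only on objects left over after every higher-priority rule has been applied as many times as possible. The two rules below compute max(m − n, 0).

```python
from collections import Counter

def step(ms, rules):
    """One maximally parallel step under weak rule priority.
    Rules are (lhs, rhs) multisets, listed from highest priority."""
    applied = False
    for lhs, rhs in rules:
        # Exhaust this rule before any lower-priority rule may fire.
        while all(ms[o] >= k for o, k in lhs.items()):
            for o, k in lhs.items():
                ms[o] -= k
            for o, k in rhs.items():
                ms[o] += k
            applied = True
    return applied

def run(ms, rules):
    """Iterate steps until the system halts (no rule applicable)."""
    while step(ms, rules):
        pass
    return ms

# r1 (higher priority) cancels an 'a' against a 'b'; only leftover
# 'a's reach r2, which emits the output object 'c': max(m - n, 0).
rules = [({"a": 1, "b": 1}, {}),   # r1
         ({"a": 1}, {"c": 1})]     # r2
result = run(Counter(a=5, b=3), rules)
```

Under strong priority, by contrast, r2 would be blocked entirely in any step where r1 is applicable; the weak semantics sketched here only forces r1 to be used maximally first.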
Procedia PDF Downloads 447
3670 Forensics Linguistics and Phonetics: The Analysis of Language to Support Investigations
Authors: Andreas Aceranti, Simonetta Vernocchi, Marco Colorato, Kaoutar Filahi
Abstract:
This study was inspired by the need to give forensic linguistics and phonetics ever greater importance, and by the intention to explore these topics in an attempt to understand what the role of these disciplines really is in investigations of any nature. The goal is to analyze the achievements these subjects have been able to reach and the contribution they have made to the legal world; the analysis and study of these topics are supported by the recounting of real cases that have involved forensic linguistics and phonetics. One of the most relevant cases is that of the Unabomber, an investigation that highlighted the importance this discipline can have in difficult and time-consuming cases such as this one. We also focus on the areas of expertise of these new branches of applied linguistics, examining how this new discipline is used in Italy and abroad and showing possible improvements the Italian state could adopt in order to catch up with countries like Great Britain.
Keywords: forensic linguistics, forensic phonetics, investigation, criminalistics
Procedia PDF Downloads 95
3669 Quantum Mechanics as a Limiting Case of Relativistic Mechanics
Authors: Ahmad Almajid
Abstract:
The idea of unifying quantum mechanics with general relativity is still a dream for many researchers, as physics has only two paths, no more: Einstein's path, which is mainly based on particle mechanics, and the path of Paul Dirac and others, which is based on wave mechanics. The incompatibility of the two approaches is due to the radical difference in the initial assumptions and the mathematical nature of each approach. Logical thinking in modern physics leads us to two problems. First, in quantum mechanics, despite its success, the measurement problem and the problem of interpreting the wave function remain obscure. Second, in special relativity, despite the success of the equivalence of rest mass and energy, the fact that the energy becomes infinite at the speed of light is contrary to logic, because the speed of light is not infinite and the mass of the particle is not infinite either. These contradictions arise from the overlap of relativistic and quantum mechanics in the neighborhood of the speed of light, and in order to solve these problems, one must understand well how to move from relativistic mechanics to quantum mechanics, or rather how to unify them in a way different from Dirac's method, in order to go along with God or Nature since, as Einstein said, "God doesn't play dice." From De Broglie's hypothesis about wave-particle duality, Léon Brillouin's definition of the new proper time was deduced, and thus the quantum Lorentz factor was obtained. Finally, using the Euler-Lagrange equation, we arrive at new equations in quantum mechanics. In this paper, the two problems in modern physics mentioned above are solved; it can be said that this new approach to quantum mechanics will enable us to unify it with general relativity quite simply. If experiments prove the validity of the results of this research, we will be able in the future to transport matter at speeds close to the speed of light. 
Finally, this research yielded three important results: 1) the Lorentz quantum factor; 2) Planck energy as a limiting case of Einstein energy; 3) real quantum mechanics, in which new equations for quantum mechanics match and exceed Dirac's equations; these equations were reached in a way completely different from Dirac's method. These equations show that quantum mechanics is a limiting case of relativistic mechanics. At the Solvay Conference in 1927, the debate about quantum mechanics between Bohr, Einstein, and others reached its climax; while Bohr suggested that if particles are not observed they are in a probabilistic state, Einstein made his famous claim that "God does not play dice." Thus, Einstein was right, especially when he did not accept the principle of indeterminacy in quantum theory, although experiments support quantum mechanics. However, the results of our research indicate that God really does not play dice: when the electron disappears, it turns into amicable particles or an elastic medium, according to the above equations. Likewise, Bohr was also right when he indicated that there must be a science like quantum mechanics to monitor and study the motion of subatomic particles, but the picture in front of him was blurry and unclear, so he resorted to the probabilistic interpretation.
Keywords: Lorentz quantum factor, Planck's energy as a limiting case of Einstein's energy, real quantum mechanics, new equations for quantum mechanics
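For reference, the divergence at the speed of light that the abstract invokes comes from the standard relativistic energy of a massive particle:

```latex
E = \gamma m c^2, \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}},
```

so for any rest mass $m > 0$, $\gamma \to \infty$ and hence $E \to \infty$ as $v \to c$. The "quantum Lorentz factor" the abstract names is the author's proposed modification of $\gamma$ in this neighborhood; its form is not given in the abstract.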
Procedia PDF Downloads 79
3668 Nearly Zero Energy Building: Analysis on How End-Users Affect Energy Savings Targets
Authors: Margarida Plana
Abstract:
One of the most important energy challenges of European policy is the transition to the Net Zero Energy Building (NZEB) model. An NZEB is a new concept of building that aims to reduce both energy consumption and carbon emissions to nearly zero over the course of a year. To achieve this nearly zero consumption, apart from the building having high efficiency levels, the energy consumed by the building has to be produced on-site. This paper presents the results of an analysis, developed on the basis of real projects' data, that quantifies the impact of end-user behavior. The analysis focuses on how the behavior of a building's occupants can affect the achievement of the energy savings targets and how this effect can be limited. The results obtained show that in this kind of project, with very high energy performance, end-user interaction with the system operation must be limited in order to reach the targets set.
Keywords: end-user impacts, energy efficiency, energy savings, NZEB model
Procedia PDF Downloads 375
3667 The Impact of an Interactive E-Book on Mathematics Reading and Spatial Ability in Middle School Students
Authors: Abebayehu Yohannes, Hsiu-Ling Chen, Chiu-Chen Chang
Abstract:
Mathematics reading and spatial ability are important learning components in mathematics education. However, many students struggle to understand real-world problems and lack the spatial ability to form internal imagery. To address this problem, an interactive e-book was developed in this study. The results indicated that both groups showed a significant increase on the mathematics reading ability test, and a significant difference was observed in the overall mathematics reading score in favor of the experimental group. In addition, the interactive e-book learning mode had a significant impact on students' spatial ability. It was also found that the richness of content, with the visual and interactive elements provided in the interactive e-book, enhanced students' satisfaction with the teaching material.
Keywords: interactive e-books, spatial ability, mathematics reading, satisfaction, three view
Procedia PDF Downloads 194
3666 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems
Authors: Yong-Kyu Jung
Abstract:
The fast growth in information technology has led to growing demands to access and process data. Cyber-physical systems (CPSs) depend heavily on the timing of hardware/software operations and of communication over the network, i.e., on real-time and parallel operations in CPSs such as autonomous vehicles. Data processing is an important means to overcome the issues confronting data management by reducing the gap between technological growth on the one hand and data complexity and channel bandwidth on the other. An adaptive, perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart connected devices, adaptively rescales digital content (by 62.8% on average) and reduces data processing/access time and energy, as well as encryption/decryption overheads, in AI/ML applications such as facial identification and recognition.
Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity
Procedia PDF Downloads 81
3665 The Role of Predictive Modeling and Optimization in Enhancing Smart Factory Efficiency
Authors: Slawomir Lasota, Tomasz Kajdanowicz
Abstract:
This research examines the application of predictive modeling and optimization algorithms to improve production efficiency in smart factories. Utilizing gradient boosting and neural networks, the study builds robust KPI estimators to predict production outcomes based on real-time data. Optimization methods, including Bayesian optimization and gradient-based algorithms, identify optimal process configurations that maximize availability, efficiency, and quality KPIs. The paper highlights the modular architecture of a recommender system that integrates predictive models, data visualization, and adaptive automation. Comparative analysis across multiple production processes reveals significant improvements in operational performance, laying the foundation for scalable, self-regulating manufacturing systems.
Keywords: predictive modeling, optimization, smart factory, efficiency
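The predict-then-optimize loop described above can be sketched minimally as follows. A hand-written quadratic stands in for a trained gradient-boosting or neural-network KPI estimator, and plain random search stands in for Bayesian optimization; all names and parameter ranges are illustrative assumptions, not the paper's setup.

```python
import random

# Surrogate KPI estimator: in the paper this would be a model trained
# on real-time production data; here a known quadratic stands in so
# the optimum (temp=180, speed=40) is verifiable.
def kpi_estimator(temp, speed):
    return 100.0 - 0.05 * (temp - 180.0) ** 2 - 0.1 * (speed - 40.0) ** 2

def optimize(n_trials=20000, seed=0):
    """Random search for the configuration maximizing the predicted KPI."""
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(n_trials):
        cfg = (rng.uniform(150, 210), rng.uniform(20, 60))
        score = kpi_estimator(*cfg)
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_score, best_cfg

score, (temp, speed) = optimize()
```

A Bayesian optimizer would replace the blind sampling with a surrogate-guided acquisition function, reaching a comparable configuration with far fewer KPI evaluations.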
Procedia PDF Downloads 9
3664 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace and a cornerstone of the economy. It determines whether companies are thriving or in a downward spiral. A thorough understanding of it is important; many companies have whole divisions dedicated to analyzing both their own stock and that of rival companies. Linking the world of finance with artificial intelligence (AI), especially in the stock market, has been a relatively recent development. Predicting how stocks will perform, considering all external factors and previous data, has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions of financial trends. Looking at the stock market specifically, predicting the open, closing, high, and low prices for the next day is very hard to do; machine learning makes this task a lot easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. When used effectively, new doors can be opened in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep learning and neural-network-based approaches, among other methods. It provides a detailed analysis of these techniques and also explores the challenges in predictive analysis. Comparing the accuracy on the testing set of four different models (linear regression, neural network, decision tree, and naïve Bayes) on different stocks (Apple, Google, Tesla, Amazon, United Healthcare, Exxon Mobil, J.P. Morgan & Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. For the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model. 
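A minimal sketch of the simplest of these four models, linear regression, fitted here to predict the next day's close from the previous day's close. The price series is synthetic and the single lagged feature is an assumption for illustration; the paper's data and feature set are richer.

```python
# Ordinary least squares on one lagged feature: previous close -> next close.
def fit_ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope  # intercept, slope

closes = [100, 101, 103, 102, 105, 107, 106, 109]  # synthetic closes
prev, nxt = closes[:-1], closes[1:]
a, b = fit_ols(prev, nxt)
forecast = a + b * closes[-1]  # predicted close for the following day
```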
The training set had similar results, except that the decision tree model was perfect, with complete accuracy in its predictions, which makes sense. This means that the decision tree model likely overfitted the training set, which hurt its accuracy on the testing set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
Procedia PDF Downloads 97
3663 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments
Procedia PDF Downloads 973663 Preparation of Papers - Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments
Authors: Skyler Kim
Abstract:
An early diagnosis of leukemia has always been a challenge for doctors and hematologists. On a worldwide basis, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia was time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnostic tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these methods is the AI approach, which has become a major trend in recent years, and several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger sets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We decided to select acute lymphocytic leukemia to develop our diagnostic system, since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia. The results from this development work can be applied to all other types of leukemia. To develop our model, the Kaggle dataset was used, which consists of 15135 images in total, 8491 of which are images of abnormal cells and 5398 of which are normal. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger sets. The proposed diagnostic system can detect and classify leukemia. Unlike other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. 
Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture that employs transfer learning techniques to extract features from each input image. In our approach, the features fused from specific abstraction layers can be deemed auxiliary features and lead to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimension feature maps to help improve the discriminative capability of intermediate features and also to overcome the problem of network gradients vanishing or exploding. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model had a significant advantage in accuracy. The detailed results of each model's performance, and their pros and cons, will be presented at the conference.
Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning
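The fusion step at the heart of such a hybrid architecture can be sketched as concatenating the two backbones' pooled feature vectors before a shared classifier head. The dimensions below (512 for VGG19, 2048 for ResNet50 global-average-pooled outputs) are the usual ones for these backbones, and the zero vectors standing in for real extracted features are an assumption for illustration; the abstract does not specify which layers are fused.

```python
# Stand-ins for transfer-learned feature vectors: in the real system
# these come from frozen VGG19 and ResNet50 backbones per image.
batch = 4
vgg_features = [[0.0] * 512 for _ in range(batch)]     # 512-d per image
resnet_features = [[0.0] * 2048 for _ in range(batch)] # 2048-d per image

# Fusion: concatenate per image along the feature axis; the joint
# 2560-d vector would feed a shared fully connected classifier head.
fused = [v + r for v, r in zip(vgg_features, resnet_features)]
```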
Procedia PDF Downloads 188