Search results for: data integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26024

25664 Financial Inclusion for Inclusive Growth in an Emerging Economy

Authors: Godwin Chigozie Okpara, William Chimee Nwaoha

Abstract:

This paper sets out to show how a financial inclusion index can be calculated and investigates the impact of inclusive finance on inclusive growth in an emerging economy. In light of these objectives, the chi-wins method was used to calculate indexes of financial inclusion, while co-integration and an error correction model were used to evaluate the impact of financial inclusion on inclusive growth. The analysis revealed that financial inclusion, while having a long-run relationship with GDP growth, is an insignificant function of the growth of the economy. The speed of adjustment is correctly signed and significant. On the basis of these results, the researchers call for sustained efforts by government and the banking sector to promote financial inclusion in developing countries.
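
The co-integration and error correction steps can be illustrated with a short Python sketch using statsmodels; the data file and column names below are hypothetical, and the abstract does not state which software the authors used.

import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

df = pd.read_csv("inclusion_growth.csv")       # hypothetical dataset
gdp, fii = df["gdp_growth"], df["fi_index"]    # hypothetical column names

# Step 1: Engle-Granger test for a long-run (co-integrating) relationship
t_stat, p_value, _ = coint(gdp, fii)
print(f"Engle-Granger p-value: {p_value:.3f}")

# Step 2: long-run regression; its lagged residual is the error-correction term
long_run = sm.OLS(gdp, sm.add_constant(fii)).fit()
ect = long_run.resid.shift(1)

# Step 3: ECM - the coefficient on ect is the speed of adjustment
X = pd.concat([fii.diff(), ect], axis=1, keys=["d_fii", "ect"]).dropna()
ecm = sm.OLS(gdp.diff().loc[X.index], sm.add_constant(X)).fit()
print(ecm.summary())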

Keywords: chi-wins index, co-integration, error correction model, financial inclusion

Procedia PDF Downloads 635
25663 Multi-omics Integrative Analysis with Genome-Scale Metabolic Model Simulation Reveals Reaction Essentiality Data in Human Astrocytes Under the Lipotoxic Effect of Palmitic Acid

Authors: Janneth Gonzalez, Andres Pinzon Velasco, Maria Angarita, Nicolas Mendoza

Abstract:

Astrocytes play an important role in various processes in the brain, including pathological conditions such as neurodegenerative diseases. Recent studies have shown that an increase in saturated fatty acids such as palmitic acid (PA) triggers pro-inflammatory pathways in the brain. The use of synthetic neurosteroids such as tibolone has demonstrated neuro-protective mechanisms. However, there are few studies on the neuro-protective mechanisms of tibolone, especially at the systemic (omic) level. In this study, we integrated multi-omic data (transcriptome and proteome) into a genome-scale metabolic model of the human astrocyte to study the astrocytic response during palmitate treatment. We evaluated metabolic fluxes in three scenarios (healthy, inflammation induced by PA, and tibolone treatment under PA inflammation). We also used control theory to identify the reactions that control the astrocytic system. Our results suggest that PA modulates central and secondary metabolism, showing a change in energy source use through inhibition of the folate cycle and fatty acid β-oxidation and upregulation of ketone body formation. We found 25 metabolic switches under PA-mediated cellular regulation, 9 of which were critical only in the inflammatory scenario but not in the protective tibolone one. Within these reactions, inhibitory, total, and directional coupling profiles were key findings, playing a fundamental role in the (de)regulation of metabolic pathways that increase neurotoxicity and representing potential treatment targets. Finally, this study framework facilitates the understanding of metabolic regulation strategies, and it can be used to explore in silico the mechanisms of astrocytic cell regulation, directing more complex future experimental work in neurodegenerative diseases.
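
Reaction essentiality in a genome-scale metabolic model of the kind used here is commonly probed with single-reaction deletions; below is a minimal Python sketch using the COBRApy library, with a hypothetical model file standing in for the astrocyte model (the authors' own pipeline is not reproduced).

import cobra
from cobra.flux_analysis import single_reaction_deletion

# Load a genome-scale metabolic model (SBML); the file name is hypothetical
model = cobra.io.read_sbml_model("astrocyte_model.xml")

# Baseline objective flux under the current constraints
baseline = model.optimize().objective_value

# Knock out each reaction in turn; reactions whose removal collapses the
# objective flux are candidate essential reactions in that scenario
deletions = single_reaction_deletion(model)
essential = deletions[deletions["growth"] < 0.01 * baseline]
print(essential.head())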

Keywords: astrocytes, data integration, palmitic acid, computational model, multi-omics, control theory

Procedia PDF Downloads 106
25662 Partnering with Stakeholders to Secure Digitization of Water

Authors: Sindhu Govardhan, Kenneth G. Crowther

Abstract:

Modernisation of the water sector is leading to increased connectivity and integration of emerging technologies with traditional ones, creating new security risks. The convergence of Information Technology (IT) with Operational Technology (OT) results in solutions that are spread across larger geographic areas, increasingly consist of interconnected Industrial Internet of Things (IIOT) devices and software, rely on the integration of legacy with modern technologies, and use complex supply chain components, leading to complex architectures and communication paths. The result is that multiple parties collectively own and operate these emergent technologies, threat actors find new paths to exploit, and traditional cybersecurity controls are inadequate. Our approach is to explicitly identify and draw data flows that cross trust boundaries between owners and operators of various aspects of these emerging and interconnected technologies. On these data flows, we layer potential attack vectors to create a frame of reference for evaluating possible risks against connected technologies. Finally, we identify where existing controls, mitigations, and other remediations exist across industry partners (e.g., suppliers, product vendors, integrators, water utilities, and regulators). From these, we are able to understand potential gaps in security, the roles in the supply chain that are most likely to effectively remediate those security gaps, and test cases to evaluate and strengthen security across these partners. This informs a "shared responsibility" solution that recognises that security is multi-layered and requires collaboration to be successful. This shared responsibility security framework improves visibility, understanding, and control across the entire supply chain, particularly for those water utilities that are accountable for safe and continuous operations.
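
The boundary-crossing analysis described above can be prototyped with plain data structures. The sketch below uses invented components, zones, and flows to show the core idea: flag every data flow whose endpoints sit in different trust zones, then layer attack vectors onto those flows.

# Minimal trust-boundary model: each component belongs to a trust zone,
# and any flow between different zones crosses a boundary worth examining.
zones = {
    "plc": "utility_ot",           # hypothetical components and zones
    "scada": "utility_ot",
    "historian": "utility_it",
    "vendor_portal": "supplier",
}

flows = [
    ("plc", "scada", "sensor telemetry"),
    ("scada", "historian", "process data"),
    ("historian", "vendor_portal", "maintenance export"),
]

for src, dst, data in flows:
    if zones[src] != zones[dst]:
        print(f"CROSSES BOUNDARY {zones[src]} -> {zones[dst]}: "
              f"{src} sends '{data}' to {dst}; enumerate attack vectors here")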

Keywords: cyber security, shared responsibility, IIOT, threat modelling

Procedia PDF Downloads 61
25661 Collaborative Planning and Forecasting

Authors: Neha Asthana, Vishal Krishna Prasad

Abstract:

Collaborative planning and forecasting are innovative, systematic approaches to integrating data and distilling it into usable information. Changing and variable market dynamics have persuaded global business chains to adopt collaborative planning and forecasting as an imperative tool. Supply chains must therefore constantly improve, update, and mould themselves to the changing global environment.

Keywords: information transfer, forecasting, optimization, supply chain management

Procedia PDF Downloads 416
25660 Ensuring Quality in DevOps Culture

Authors: Sagar Jitendra Mahendrakar

Abstract:

Integrating quality assurance (QA) practices into DevOps culture has become increasingly important in modern software development environments. Collaboration, automation, and continuous feedback characterize the seamless integration of development and operations teams in DevOps to achieve rapid and reliable software delivery. In this context, quality assurance plays a key role in ensuring that software products meet the highest quality, performance, and reliability standards throughout the development life cycle. This work explores key principles, challenges, and best practices related to quality assurance in a DevOps culture. It emphasizes the importance of building quality in from the start of the development process, with quality control integrated into every step of the DevOps pipeline. Automation is the cornerstone of DevOps quality assurance, enabling continuous testing, integration, and deployment and providing rapid feedback for early problem identification and resolution. The work also addresses the cultural and organizational challenges of implementing quality assurance in DevOps, emphasizing the need to foster collaboration, break down silos, and promote a culture of continuous improvement, as well as the importance of toolchain integration and skills development to support effective QA practices in DevOps environments. Overall, this work sits at the intersection of QA and DevOps culture, providing insights into how organizations can use DevOps principles to improve software quality, accelerate delivery, and meet the changing demands of today's dynamic software landscape.

Keywords: quality engineering, DevOps, automation, tools

Procedia PDF Downloads 35
25659 Response of First-Year Bachelor of Medicine, Bachelor of Surgery (MBBS) Students to an Integrated Learning Program

Authors: Raveendranath Veeramani, Parkash Chand, H. Y. Suma, A. Umamageswari

Abstract:

Background and Aims: The aim of this study was to evaluate students' perception of an Integrated Learning Program (ILP). Settings and Design: A questionnaire was used to survey and evaluate the perceptions of first-year MBBS students at the Department of Anatomy at our medical college in India. Materials and Methods: The first-year MBBS students of Anatomy were involved in an ILP on the liver and extrahepatic biliary apparatus, integrating the Departments of Anatomy, Biochemistry and Hepato-biliary Surgery. The evaluation of the ILP was done through two sets of short questionnaires that had ten items using the Likert five-point grading scale. The data involved both the students' responses and their grading. Results: A majority of students felt that the ILP was better than the traditional lecture method of teaching. The integrated teaching method was better at fulfilling learning objectives (128 students, 83%), enabled better understanding (94%), was more interesting (140 students, 90%), ensured that they could score better in exams (115 students, 77%) and involved greater interaction (100 students, 66%), as compared to traditional teaching methods. Most of the students (142 students, 95%) opined that more such sessions should be organized in the future. Conclusions: Responses from students show that integrated learning sessions should be incorporated even in the first phase of MBBS for selected topics, so as to create interest in the medical sciences at the entry level and to make students understand the importance of basic science.

Keywords: integrated learning, students' response, vertical integration, horizontal integration

Procedia PDF Downloads 184
25658 Language Rights and the Challenge of National Integration: The Nigerian Experience

Authors: Odewumi Olatunde, Adegun Sunday

Abstract:

Linguistic diversity is seen to complicate attempts to build a stable and cohesive political community. Hence, the challenge of integration is enormous in a multi-ethno-lingual country like Nigeria. In the same vein, justifications for minority language rights claims in relation to broader political theories of justice, freedom and democracy cannot be ignored. It is in light of the foregoing that this paper explores Nigeria's experiments in language policy and planning (LPP) and the long-drawn agitations for self-determination and linguistic freedom by the minority ethnic groups in the polity, which have been exacerbated by the language provisions of the National Policy on Education. The paper succinctly reviews Nigeria's LPP efforts and their attendant theatre of conflicts, explores international attempts at evolving normative principles of freedom and equality for language policy, and finally evaluates the position of Nigerian LPP in light of evolving international conventions. On this premise, it is concluded that, given a conscientious and honest implementation of the Nigerian language provisions as assessed from their face validity, the nation's efforts could be exonerated from running afoul of any known civilized values and best practices. It is, therefore, recommended that an effectual and consistent commitment to implementation, driven by a renewed political will, is what is required for the nation to succeed in this direction.

Keywords: integration, rights, challenge, conventions, policy

Procedia PDF Downloads 398
25657 Integration of an Innovative Complementary Approach Inspired by Clinical Hypnosis into Oncology Care: Nurses’ Perception of Comfort Talk

Authors: Danny Hjeij, Karine Bilodeau, Caroline Arbour

Abstract:

Background: Chemotherapy infusions often lead to a cluster of co-occurring and difficult-to-treat symptoms (nausea, tingling, etc.), which may negatively impact the treatment experience at the outpatient clinic. Although several complementary approaches have shown beneficial effects for chemotherapy-induced symptom management, they are not easily implementable during chemotherapy infusion. In response to this limitation, comfort talk (CT), a simple, fast conversational method inspired by the language principles of clinical hypnosis, is known to optimize the management of symptoms related to antineoplastic treatments. However, the perception of nurses who have had to integrate this practice into their care has never been documented. Study design: A qualitative descriptive study with iterative content analysis was conducted among oncology nurses working in a chemotherapy outpatient clinic who had previous experience with CT. Semi-structured interviews were conducted by phone, using a pre-tested interview guide and a sociodemographic survey, to document their perception of CT. Results: A total of six nurses (4 women, 2 men) took part in the interviews (N=6). The average age of participants was 49 years (36-61 years). Participants had an average of 24 years of experience (10-38 years) as a nurse, including 14.5 years in oncology (5-32 years). Data saturation (i.e., redundancy in responses) was observed around the fifth interview; a sixth interview was conducted as confirmation. Six themes emerged, addressing contextual and organizational obstacles at the chemotherapy outpatient clinic as well as the added value of CT for oncology nursing care: 1) the outpatient oncology clinic, a saturated care setting, 2) the keystones that support the integration of CT into care, 3) added value for patients, 4) a positive and rewarding experience for nurses, 5) collateral benefits, and 6) CT as an approach to consider during the COVID-19 pandemic. Conclusion: For the first time, this study describes nurses' perception of the integration of CT into the care surrounding the administration of chemotherapy at the outpatient oncology clinic. In summary, contextual and organizational difficulties, as well as a lack of training, are among the main obstacles that could hinder the integration of CT in oncology. Still, the experience was reported as mostly positive. Nurses saw CT as an added value to patient care that met patients' need for holistic care. CT also appears to be beneficial for patients on several levels (for pain management in particular). Results will be used to inform future knowledge transfer activities related to CT in oncology nursing.

Keywords: cancer, chemotherapy, comfort talk, oncology nursing role

Procedia PDF Downloads 69
25656 Development of Automatic Laser Scanning Measurement Instrument

Authors: Chien-Hung Liu, Yu-Fen Chen

Abstract:

This study used a triangulation laser probe and a three-axis mobile platform for surface measurement, programmed the system, and applied it to real-time statistical analysis of the measured data. This structure was used to design an integrated system: the triangulation laser probe performs non-contact measurement by scattering or reflection, the captured signals are transferred to the computer through RS-232, and RS-485 is used to control the three-axis platform for wide-range measurement. The data captured by the laser probe are formed into a 3D surface. This study constructed an optical measurement application program based on a visual programming language. First, the signals are transmitted to the computer through RS-232/RS-485, and then they are stored and displayed in the graphical interface in real time. The program analyzes the various messages and produces appropriate presentation graphs and data processing, providing users with friendly graphical interfaces and monitoring of the data processing state, and indicating graphically whether the current data are normal. The major functions of the measurement system developed by this study are thickness measurement, statistical process control (SPC), surface smoothness analysis, and trend-line calculation. A result report can be generated and printed promptly. This study measured different heights and surfaces successfully, performed on-line data analysis and processing effectively, and developed a man-machine interface for users to operate.
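
As a rough illustration of the acquisition path (probe to computer over a serial link), here is a minimal Python sketch using pyserial; the port name, baud rate, and one-reading-per-line framing are assumptions, since the original instrument was programmed in LabVIEW.

import serial  # pyserial

# Port and framing are hypothetical; the paper's RS-232 settings are not
# given in the abstract.
with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1.0) as port:
    samples = []
    for _ in range(1000):
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line:                       # one distance reading per line (assumed)
            samples.append(float(line))

if samples:
    mean = sum(samples) / len(samples)
    print(f"collected {len(samples)} samples, mean height {mean:.3f} mm")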

Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW

Procedia PDF Downloads 352
25655 Human Resource Management in the Innovation Activity in the Republic of Kazakhstan

Authors: A. T. Omarova, G. N. Nakipova

Abstract:

This article discusses the principles of object-oriented human capital development using program technologies, as well as the priorities of Kazakhstan's strategy of industrial-innovative development under conditions of integration into the world community. The article sets out the tasks of human resource management in the implementation of industrial and innovative development, the particularities of Kazakhstan's theory of personnel management, and the specific character of the Kazakhstan authorities. We also consider the factors affecting people in organizations and the mechanisms of HRM within organizations under the conditions of innovative development in Kazakhstan.

Keywords: programming, management of human resources, innovation, investment, innovation process, HRD model, innovative development, integration, management, transformation, economic potential, competitiveness

Procedia PDF Downloads 385
25654 An Econometric Analysis of the Impacts of Inflation on the Economic Growth of South Africa

Authors: Gisele Mah, Paul Saah

Abstract:

Rising rates of inflation are hindering economic growth in developing nations. Hence, this study investigated the effects of inflation rates on the economic growth of South Africa using secondary time series data from 1987 to 2022. The main objectives of this study were to investigate the long-run relationship between inflation and economic growth and to determine the direction of causality between these two variables. The study utilized the Autoregressive Distributed Lag (ARDL) bounds test of co-integration to investigate whether there is a long-run relationship between inflation and economic growth. The pairwise Granger causality approach was employed for the second objective, the direction of causality. The study discovered only one co-integrating relationship among our variables, between inflation and economic growth. The results showed a negative and significant relationship between inflation and economic growth, and a positive and significant relationship between economic growth and the exchange rate. Interest rates proved negative and insignificant in explaining economic growth. The study also established that inflation Granger-causes economic growth (measured as GDP) and, similarly, that inflation Granger-causes the exchange rate. Therefore, the study recommends that inflation be reduced in South Africa in order for economic growth to increase. Conversely, it recommends that South Africa increase its exchange rate, in order for economic growth to also increase.
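
A minimal Python sketch of the ARDL bounds test and a pairwise Granger causality check with statsmodels follows; the dataset and column names are hypothetical, and the paper's exact specification (lag orders, deterministic terms) is not given in the abstract.

import pandas as pd
from statsmodels.tsa.ardl import UECM
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("sa_macro.csv", index_col="year")    # hypothetical dataset
y = df["gdp_growth"]                                  # hypothetical columns
X = df[["inflation", "exchange_rate", "interest_rate"]]

# Pesaran-Shin-Smith bounds test via an unrestricted error correction model
uecm = UECM(y, lags=1, exog=X, order=1).fit()
print(uecm.bounds_test(case=3))   # F-stat above upper bound => co-integration

# Pairwise Granger causality: does inflation help predict GDP growth?
grangercausalitytests(df[["gdp_growth", "inflation"]].dropna(), maxlag=2)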

Keywords: inflation rate, economic growth, South Africa, autoregressive distributed lag model

Procedia PDF Downloads 30
25653 Integration of FMEA and Human Factor in the Food Chain Risk Assessment

Authors: Mohsen Shirani, Micaela Demichela

Abstract:

During the last decades, a number of food crises, such as Bovine Spongiform Encephalopathy (BSE, "mad-cow disease"), dioxin in chicken feed, and Foot-and-Mouth Disease (FMD), have seriously damaged the reliability of the food industry. Consequently, the trend of applying scientific methods of risk assessment in food safety has attracted more attention in academia and practice. However, the academic literature tangibly lacks a practical approach that considers the entire food supply chain. In this regard, this paper applies a risk assessment tool (FMEA) integrated with human factors along the entire food production supply chain, tests the method in a case study of dairy production, and analyzes its results.
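
FMEA ranks each failure mode by the product of severity, occurrence, and detection scores, the risk priority number (RPN). The sketch below uses invented dairy-chain failure modes and a simple multiplier as a stand-in for the human-factor integration; the paper's actual weighting scheme is not described in the abstract.

# (severity, occurrence, detection) on 1-10 scales, plus a human-factor
# multiplier >= 1.0 for failure modes aggravated by human error (assumed scheme)
failure_modes = {
    "raw milk contamination":  (9, 4, 3, 1.2),    # hypothetical entries
    "cold-chain interruption": (8, 3, 4, 1.5),
    "mislabelled allergen":    (10, 2, 2, 1.3),
}

# Rank failure modes by human-factor-adjusted RPN, highest risk first
for mode, (sev, occ, det, human) in sorted(
        failure_modes.items(),
        key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2] * kv[1][3],
        reverse=True):
    rpn = sev * occ * det * human
    print(f"{mode:28s} RPN = {rpn:6.1f}")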

Keywords: FMEA, food supply chain, risk assessment, human factor

Procedia PDF Downloads 421
25652 Is Electricity Consumption Stationary in Turkey?

Authors: Eyup Dogan

Abstract:

The number of research articles analyzing the integration properties of energy variables has increased rapidly in the energy literature for about a decade. The stochastic behaviors of energy variables are worth knowing for several reasons. For instance, national policies to conserve or promote energy consumption, which should be taken as shocks to energy consumption, will have transitory effects if energy consumption is found to be stationary in a country. Furthermore, it is also important to know the order of integration in order to employ an appropriate econometric model. Despite being an important subject for applied energy (economics) and having a huge volume of studies, the existing literature still has several known limitations. For example, many of the studies use aggregate energy consumption and national-level data, and a large part of the literature consists of either multi-country studies or studies focusing solely on the U.S. This is the first study in the literature that considers a form of energy consumption by sector at the sub-national level. This research investigates the unit root properties of electricity consumption for 12 regions of Turkey by four sectors, in addition to total electricity consumption, to fill the above-mentioned gaps in the literature. In this regard, we analyze the stationarity properties of 60 cases. Because the use of multiple unit root tests makes the results robust and consistent, we apply the Dickey-Fuller unit root test based on Generalized Least Squares regression (DFGLS), the Phillips-Perron unit root test (PP), and the Zivot-Andrews unit root test with one endogenous structural break (ZA). The main finding of this study is that electricity consumption is trend stationary in 7 cases according to DFGLS and PP, whereas it is a stationary process in 12 cases when we take the structural change into account by applying ZA. Thus, shocks to electricity consumption have transitory effects in those cases; namely, agriculture in region 1, region 4 and region 7; industrial in region 5, region 8, region 9, region 10 and region 11; business in region 4, region 7 and region 9; and total electricity consumption in region 11. Regarding policy implications, policies to decrease or stimulate the use of electricity have a long-run impact on electricity consumption in 80% of cases in Turkey, given that 48 cases are non-stationary processes. On the other hand, the past behavior of electricity consumption can be used to predict its future behavior in the 12 stationary cases only.
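
All three tests are available in the Python arch package; a minimal sketch for one region-sector series follows, with a hypothetical data file. Rejecting the unit-root null implies that shocks to that series are transitory.

import pandas as pd
from arch.unitroot import DFGLS, PhillipsPerron, ZivotAndrews

# Hypothetical series: electricity consumption for one region-sector case
series = pd.read_csv("electricity_region1_agri.csv",
                     index_col="year")["consumption"]

for test in (DFGLS(series), PhillipsPerron(series), ZivotAndrews(series)):
    # Rejecting the unit-root null means shocks have only transitory effects
    print(f"{test.__class__.__name__:15s} stat={test.stat:7.3f} "
          f"p-value={test.pvalue:.3f}")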

Keywords: unit root, electricity consumption, sectoral data, subnational data

Procedia PDF Downloads 398
25651 Design and Implementation of a Hardened Cryptographic Coprocessor with 128-bit RISC-V Core

Authors: Yashas Bedre Raghavendra, Pim Vullers

Abstract:

This study presents the design and implementation of an abstract cryptographic coprocessor, leveraging AMBA (Advanced Microcontroller Bus Architecture) protocols - APB (Advanced Peripheral Bus) and AHB (Advanced High-performance Bus) - to enable seamless integration with the main CPU (central processing unit) and enhance the coprocessor's algorithm flexibility. The primary objective is to create a versatile coprocessor that can execute various cryptographic algorithms, including ECC (elliptic-curve cryptography), RSA (Rivest-Shamir-Adleman), and AES (Advanced Encryption Standard), while providing a robust and secure solution for modern secure embedded systems. To achieve this goal, the coprocessor is equipped with a tightly coupled memory (TCM) for rapid data access during cryptographic operations. The TCM is placed within the coprocessor, ensuring quick retrieval of critical data and optimizing overall performance. Additionally, the program memory is positioned outside the coprocessor, allowing for easy updates and reconfiguration, which enhances adaptability to future algorithm implementations. Direct links are employed instead of DMA (direct memory access) for data transfer, ensuring faster communication and reducing complexity. The AMBA-based communication architecture facilitates seamless interaction between the coprocessor and the main CPU, streamlining data flow and ensuring efficient utilization of system resources. The abstract nature of the coprocessor allows for easy integration of new cryptographic algorithms in the future. As the security landscape continues to evolve, the coprocessor can adapt and incorporate emerging algorithms, making it a future-proof solution for cryptographic processing. Furthermore, this study explores the addition of custom instructions to the RISC-V ISE (Instruction Set Extension) to enhance cryptographic operations. By incorporating custom instructions specifically tailored for cryptographic algorithms, the coprocessor achieves higher efficiency and reduced cycles per instruction (CPI) compared to traditional instruction sets. The adoption of the RISC-V 128-bit architecture significantly reduces the total number of instructions required for complex cryptographic tasks, leading to faster execution times and improved overall performance. Comparisons are made with 32-bit and 64-bit architectures, highlighting the advantages of the 128-bit architecture in terms of reduced instruction count and CPI. In conclusion, the abstract cryptographic coprocessor presented in this study offers significant advantages in terms of algorithm flexibility, security, and integration with the main CPU. By leveraging AMBA protocols and employing direct links for data transfer, the coprocessor achieves high-performance cryptographic operations without compromising system efficiency. With its TCM and external program memory, the coprocessor is capable of securely executing a wide range of cryptographic algorithms. This versatility and adaptability, coupled with the benefits of custom instructions and the 128-bit architecture, make it an invaluable asset for secure embedded systems, meeting the demands of modern cryptographic applications.

Keywords: abstract cryptographic coprocessor, AMBA protocols, ECC, RSA, AES, tightly coupled memory, secure embedded systems, RISC-V ISE, custom instructions, instruction count, cycles per instruction

Procedia PDF Downloads 55
25650 An Exploratory Study on the Integration of Neurodiverse University Students into Mainstream Learning and Their Performance: The Case of the Jones Learning Center

Authors: George Kassar, Phillip A. Cartwright

Abstract:

Based on data collected from The Jones Learning Center (JLC), University of the Ozarks, Arkansas, U.S., this study explores the impact of inclusive classroom practices on neuro-diverse college students' academic performance after participation in integrative therapies designed to support students who are intellectually capable of obtaining a college degree but who require support for learning challenges owing to disabilities, AD/HD, or ASD. The purpose of this study is two-fold. The first objective is to explore the general process, special techniques, and practices of the JLC inclusive program. The second objective is to identify and analyze the effectiveness of these processes, techniques, and practices in supporting the academic performance of enrolled college students with learning disabilities following integration into mainstream university learning. Integrity, transparency, and confidentiality are vital in the research. All questions were shared in advance and confirmed by the concerned management at the JLC. While administering the questionnaire and conducting the interviews, the purpose of the study and its scope, aims, and objectives were clearly explained to all participants. Confidentiality of all participants was assured and guaranteed by using encrypted identification of individuals, thus limiting access to the data to only the researcher, and by storing the data in a secure location. Respondents were also informed that their participation in this research was voluntary and that they could withdraw from it at any time prior to submission if they wished. Ethical consent was obtained from the participants before proceeding with video recording of the interviews. This research uses a mixed methods approach. The research design involves collecting, analyzing, and "mixing" quantitative and qualitative methods and data to enable the research inquiry. The research process is organized around a five-pillar approach. The first three pillars focus on testing the first hypothesis (H1), directed toward determining the extent to which the academic performance of JLC students improved after involvement with the comprehensive JLC special program. The other two pillars relate to the second hypothesis (H2), directed toward determining the extent to which the collective and applied knowledge at the JLC is distinctive from typical practices in the field. The data collected for the research were obtained from three sources: 1) a set of secondary data in the form of Grade Point Averages (GPA) received from the registrar, 2) a set of primary data collected through a structured questionnaire administered to students and alumni at the JLC, and 3) another set of primary data collected through interviews conducted with staff and educators at the JLC. The significance of this study is two-fold. First, it validates the effectiveness of the special program at the JLC for college-level students who learn differently. Second, it identifies the distinctiveness of the mix of techniques, methods, and practices, including the special individualized and personalized one-on-one approach, at the JLC.

Keywords: education, neuro-diverse students, program effectiveness, Jones Learning Center

Procedia PDF Downloads 58
25649 Integration of Technology for Enhanced Learning among Generation Y and Z Nursing Students

Authors: Tarandeep Kaur

Abstract:

Generation Y and Z nursing students have a much higher need for technology-based stimulation than previous generations, as they may find traditional methods of education boring and disengaging. These generations prefer experiential learning and the use of advanced technology for enhanced learning. Therefore, nursing educators must acquire the knowledge to make better use of technology and technological tools for instruction. Millennials and Generation Z are digital natives: optimistic, assertive, and wanting engagement, instant feedback, and a collaborative approach. The integration of technology, and the efficacy of its use, can be challenging for nursing educators. The SAMR (substitution, augmentation, modification, and redefinition) model, designed and developed by Dr. Ruben Puentedura, can help nursing educators engage their students at different levels of technology integration for effective learning. Nursing educators should understand that technology use in the classroom must be purposeful. The influx of technology in nursing education is ever-changing; therefore, nursing educators have to constantly enhance and develop their technical skills to keep up with the emerging technology in schools as well as hospitals. In the Saskatchewan Collaborative Bachelor of Science in Nursing (SCBScN) program at Saskatchewan Polytechnic, we use technology at various levels of the SAMR model in our program, including low- and high-fidelity simulation labs. We are also exploring futuristic options of using virtual reality and gaming in our classrooms as innovative ways to motivate, increase critical thinking, create active learning, provide immediate feedback, improve student retention and create collaboration.

Keywords: generations, nursing, SAMR, technology

Procedia PDF Downloads 93
25648 Against the Idea of Public Power as Free Will

Authors: Donato Vese

Abstract:

According to the common interpretation, in a legal system, public powers are established by law, with exceptions admitted in emergencies or in particular relationships with public power. However, it is currently agreed that the law allows public administration a margin of decision, even in the case of non-discretionary acts. Hence, the administrative decision not exclusively established by law becomes the rule in the ordinary state of things, not only in the state of exception. This paper aims to analyze and discuss different ideas on discretionary power under the Rule of Law and the Rechtsstaat. Observing the legal literature in Europe and in North and South America, discretionary power can be described as follows: it can be considered a margin that law accords to the executive power for political decisions, or a choice between different interpretations of vague legal provisions. In essence, this explanation admits for the executive a decision not established by law, or at any rate not exclusively established by law. This means that the discretionary power of public administration integrates the law. However, integrating the law does not mean deciding according to the law; it means supplementing the law with a decision involving public power. Consequently, discretionary power is essentially free will. In this perspective, the Rule of Law and the Rechtsstaat are also explained differently. Recently, we can observe how the European notion of the Rechtsstaat is founded on the formal validity of the law; for this notion, therefore, decisions of public authority not regulated by law represent a problem. Thus, different systems of law integration have been proposed in the legal literature, based on values, democracy, reasonableness, and so on. This paper aims to show that, looking at those integration clauses from a logical viewpoint, integration based on recourse to the legal system itself does not resolve the problem. The aforementioned integration clauses are legal rules that require hard interpretive work to explain the correct meaning of the law; in particular, they introduce dangerous criteria in favor of the political majority. A different notion of public power can be proposed, with two main features: (a) sovereignty belongs to persons and not to the state, and (b) fundamental rights are not granted but recognized by constitutions. Hence, public power is a system based on fundamental rights. Under this approach, the public interest can be defined as the concrete maximization of the enjoyment of fundamental rights. Accordingly, integration of law that is vague or subject to several interpretations must be done by referring to the system of fundamental individual rights. We can think, for instance, of rights that are fundamental in an objective sense but not legal because they are not established by law.

Keywords: administrative discretion, free will, fundamental rights, public power, sovereignty

Procedia PDF Downloads 94
25647 Applications of Big Data in Education

Authors: Faisal Kalota

Abstract:

Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in education. Additionally, it discusses some of the concerns related to Big Data and current research trends. While Big Data can provide big benefits, it is important that institutions understand their own needs, infrastructure, resources, and limitations before jumping on the Big Data bandwagon.

Keywords: big data, learning analytics, analytics, big data in education, Hadoop

Procedia PDF Downloads 392
25646 Effectiveness of Adopting Software Quality Frameworks in Software Organizations: A Qualitative Review

Authors: Sarah K. Amer, Nagwa Badr, Osman Ibrahim, Ahmed Hamad

Abstract:

This paper surveys the effectiveness of software process quality assurance frameworks, with some focus on Capability Maturity Model Integration (CMMI) - a framework that has become widely adopted in software organizations. The importance of quality improvement in software development, and the differences in the outcomes of quality framework implementation between Middle Eastern and North African (MENA-region) countries and non-MENA-region countries are discussed. The greatest challenges met in the MENA region are identified, with particular focus on Egypt and its rising software development industry.

Keywords: software quality, software process improvement, software development methodologies, capability maturity model integration

Procedia PDF Downloads 328
25645 Integrating Data Mining with Case-Based Reasoning for Diagnosing Sorghum Anthracnose

Authors: Mariamawit T. Belete

Abstract:

Cereal production and marketing are the means of livelihood for millions of households in Ethiopia. However, cereal production is constrained by technical and socio-economic factors, and among the technical factors, cereal crop diseases are major contributors to low yields. The aim of this research is to develop an integrated data mining and knowledge-based system for sorghum anthracnose disease diagnosis that assists agricultural experts and development agents in making timely decisions. The diagnosis system gathers information from the Melkassa Agricultural Research Center and scores anthracnose on a severity scale. Empirical research was designed for data exploration, modeling, and confirmatory procedures for hypothesis testing and prediction to draw sound conclusions. WEKA (Waikato Environment for Knowledge Analysis) was employed for the modeling. Knowledge-based systems use a variety of approaches depending on the knowledge representation method; case-based reasoning (CBR) is one of the most popular. CBR is a problem-solving strategy that uses previous cases to solve new problems. The system utilizes hidden knowledge extracted by employing clustering algorithms, specifically K-means clustering, from a sampled anthracnose dataset. Clustered cases with centroid values are mapped to jCOLIBRI, and the integrator application is created using NetBeans with JDK 8.0.2. The main stages of a case-based reasoning model are retrieval (the similarity-measuring stage), reuse (which allows the domain expert to adapt the retrieved case's solution to the current case), revision (testing the solution), and retention (storing the confirmed solution in the case base for future use). The system was evaluated for both performance and user acceptance. Seven test cases were used to test the prototype. Experimental results show that the system achieves average precision and recall of 70% and 83%, respectively. User acceptance testing was also performed with five domain experts, achieving an average acceptance of 83%. Although the results of this study are promising, further investigation of hybrid approaches, such as rule-based reasoning and pictorial retrieval, is recommended.
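
The mining-plus-retrieval pipeline can be sketched in a few lines of Python with scikit-learn: cluster the case base with K-means, then retrieve by nearest centroid for a new case. The feature matrix here is synthetic; the paper's WEKA features and jCOLIBRI integration are not reproduced.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature matrix: rows are past anthracnose cases, columns are
# symptom/severity features; the paper's actual feature set is not listed.
rng = np.random.default_rng(0)
case_base = rng.random((200, 6))

# Mine the case base into clusters; centroids become prototype cases
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(case_base)

# CBR "retrieve" step: match a new case to the nearest prototype
new_case = rng.random((1, 6))
cluster = km.predict(new_case)[0]
print(f"new case retrieved from cluster {cluster}; "
      f"reuse that cluster's stored solution, then revise and retain")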

Keywords: sorghum anthracnose, data mining, case based reasoning, integration

Procedia PDF Downloads 68
25644 A Continuous Boundary Value Method of Order 8 for Solving the General Second Order Multipoint Boundary Value Problems

Authors: T. A. Biala

Abstract:

This paper deals with the numerical integration of the general second order multipoint boundary value problems. This has been achieved by the development of a continuous linear multistep method (LMM). The continuous LMM is used to construct a main discrete method to be used with some initial and final methods (also obtained from the continuous LMM) so that they form a discrete analogue of the continuous second order boundary value problems. These methods are used as boundary value methods and adapted to cope with the integration of the general second order multipoint boundary value problems. The convergence, the use and the region of absolute stability of the methods are discussed. Several numerical examples are implemented to elucidate our solution process.
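
For orientation, a linear multistep method for the second order problem y'' = f(x, y, y') has the generic form below; this is a template only, not the order-8 coefficients derived in the paper.

\[
\sum_{j=0}^{k} \alpha_j \, y_{n+j} = h^{2} \sum_{j=0}^{k} \beta_j \, f_{n+j},
\qquad f_{n+j} = f\!\left(x_{n+j},\, y_{n+j},\, y'_{n+j}\right),
\]

where h is the step size and the coefficients \(\alpha_j, \beta_j\) determine the method's order and its region of absolute stability.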

Keywords: linear multistep methods, boundary value methods, second order multipoint boundary value problems, convergence

Procedia PDF Downloads 361
25643 Energy Management System and Interactive Functions of Smart Plug for Smart Home

Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya

Abstract:

Intelligent electronic equipment and automation networks are the brain of high-tech energy management systems and play a critical role in the spread of smart homes. A smart home integrates technology for greater comfort, autonomy, reduced cost, and energy saving. These services allow homeowners to manage their appliances locally or remotely and consequently to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described, and one of them is tested on typical household appliances. This article proposes to collect data over the wireless link and to extract smart data for the energy management system. The smart data quantify three kinds of load: intermittent load, phantom load, and continuous load. Phantom load is wasted power drawn unnoticed by an appliance while it remains plugged into the mains, whether in use or not. Intermittent load and continuous load take into consideration the power and usage time of home appliances. By classifying loads in this way, the smart data can reduce the communication load on the wireless sensor network for the energy management system.
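
The three load categories suggest a simple rule-based classifier over a window of power samples; the thresholds below are illustrative assumptions, not values measured in the study.

# Toy classifier for the three load types described above, applied to a
# window of wattage samples from a smart plug.
PHANTOM_MAX_W = 5.0        # small constant draw while "off" (assumed)
DUTY_CONTINUOUS = 0.9      # fraction of time drawing real power (assumed)

def classify_load(samples_w):
    active = [w for w in samples_w if w > PHANTOM_MAX_W]
    if not active:
        return "phantom"   # never exceeds standby draw
    duty = len(active) / len(samples_w)
    return "continuous" if duty >= DUTY_CONTINUOUS else "intermittent"

print(classify_load([2.1] * 100))                # phantom
print(classify_load([2.0] * 60 + [800.0] * 40))  # intermittent (e.g., kettle)
print(classify_load([120.0] * 100))              # continuous (e.g., freezer)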

Keywords: energy management, load profile, smart plug, wireless sensor network

Procedia PDF Downloads 259
25642 MLOps Scaling Machine Learning Lifecycle in an Industrial Setting

Authors: Yizhen Zhao, Adam S. Z. Belloum, Goncalo Maia Da Costa, Zhiming Zhao

Abstract:

Machine learning has evolved from an area of academic research to a real-world applied field. This change comes with challenges: gaps and differences exist between common practices in academic environments and those in production environments. Following the continuous integration, development, and delivery practices of software engineering, similar trends have emerged in machine learning (ML) systems under the name MLOps. In this paper, we propose a framework that helps streamline and introduce best practices that facilitate the ML lifecycle in an industrial setting. This framework can be used as a template that can be customized to implement various machine learning experiments. The proposed framework is modular and can be recomposed to adapt to various use cases (e.g., data versioning, remote training on cloud). The framework inherits practices from DevOps and introduces other practices that are unique to machine learning systems (e.g., data versioning). Our MLOps practices automate the entire machine learning lifecycle, bridging the gap between development and operations.

Keywords: cloud computing, continuous development, data versioning, DevOps, industrial setting, MLOps

Procedia PDF Downloads 249
25641 Advanced Palliative Aquatics Care Multi-Device AuBento for Symptom and Pain Management by Sensorial Integration and Electromagnetic Fields: A Preliminary Design Study

Authors: J. F. Pollo Gaspary, F. Peron Gaspary, E. M. Simão, R. Concatto Beltrame, G. Orengo de Oliveira, M. S. Ristow Ferreira, J.C. Mairesse Siluk, I. F. Minello, F. dos Santos de Oliveira

Abstract:

Background: Although palliative care policies and services have been developed, research in this area continues to lag. An integrated model of palliative care is suggested, which includes complementary and alternative services aimed at improving the well-being of patients and their families. The palliative aquatics care multi-device (AuBento) uses several electromagnetic techniques to decrease pain and promote well-being through relaxation and interaction among patients, specialists, and family members. Aim: The scope of this paper is to present a preliminary design study of a device capable of exploring the various existing theories on the biomedical application of magnetic fields. This is achieved by standardizing clinical data collection with sensory integration and adding new therapeutic options, developing advanced palliative aquatics care that innovates in symptom and pain management. Methods: The research methodology was based on the Work Package methodology for project development, separating the activities into seven different Work Packages. The theoretical basis was established through an integrative literature review according to the specific objectives of each Work Package, providing a broad analysis which, together with the multiplicity of proposals and the interdisciplinarity of the research team, generated consistent and understandable concepts in the biomedical application of magnetic fields for palliative care. Results: The AuBento ambience was designed with restricted electromagnetic exposure (avoiding data collection bias) and sensory integration (allowing relaxation associated with hydrotherapy, music therapy, and chromotherapy, akin to a flotation tank). The device has a multipurpose configuration enabling classic or exploratory options in the biomedical application of magnetic fields at the researcher's discretion. Conclusions: Several patients in diverse therapeutic contexts may benefit from the use of magnetic fields or fluids, thus validating the stimulus for clinical research in this area. A device offering controlled and multipurpose environments may help standardize research and explore new theories. Future research may demonstrate the possible benefits of the aquatics care multi-device AuBento in improving well-being and symptom control in palliative care patients and their families.

Keywords: advanced palliative aquatics care, magnetic field therapy, medical device, research design

Procedia PDF Downloads 114
25640 Improving Student Performance Prediction Using a Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in Educational Data Mining techniques to find new hidden information in students' learning behavior, particularly to uncover early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data were gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
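
The winning technique, majority vote over heterogeneous base learners, corresponds to a hard-voting ensemble in scikit-learn. The sketch below uses synthetic data as a stand-in for the merged e-learning and student-information features, which are not public.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the real 4413-student feature set
X, y = make_classification(n_samples=4413, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

vote = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("svm", SVC(random_state=0)),
        ("ann", MLPClassifier(max_iter=500, random_state=0)),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    voting="hard",  # majority vote over the base learners' predictions
)
vote.fit(X_tr, y_tr)
print(f"majority-vote accuracy: {vote.score(X_te, y_te):.3f}")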

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 91
25639 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
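
One common building block for such a pipeline is RANSAC plane segmentation, since walls and slabs appear as large planar patches in the cloud. A minimal sketch with the Open3D library follows; the scan file is hypothetical, and the authors' own algorithms are not reproduced here.

import open3d as o3d

# Hypothetical scan file; the case-study data from the paper is not public
pcd = o3d.io.read_point_cloud("building_scan.ply")
pcd = pcd.voxel_down_sample(voxel_size=0.02)   # clean/thin the raw scan

# RANSAC plane segmentation: large planar patches are wall/slab candidates
plane, inliers = pcd.segment_plane(distance_threshold=0.01,
                                   ransac_n=3,
                                   num_iterations=1000)
a, b, c, d = plane
print(f"candidate plane: {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0, "
      f"{len(inliers)} supporting points")

# A full pipeline would iterate: remove inliers, re-segment, then map each
# plane to a parametric BIM element (wall, slab) in an authoring tool.
wall_patch = pcd.select_by_index(inliers)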

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 36
25638 Long-Run Relationship between the Tehran Stock Exchange and GCC Countries' Financial Markets, Before and After the 2007/2008 Financial Crisis

Authors: Mohammad Hossein Ranjbar, Mahdi Bagheri, B. Shivaraj

Abstract:

This study investigates the relationship between the stock market of Iran and the stock exchanges of the GCC countries. Eight markets were included: the stock markets of Iran, Kuwait, Bahrain, Qatar, Saudi Arabia, Dubai, Abu Dhabi, and Oman. Daily country market indices were collected from January 2005 to December 2010. The potentially time-varying behavior of the long-run relationships among the selected markets is tested by applying correlation tests, the Augmented Dickey-Fuller (ADF) test, bilateral and multilateral co-integration (Johansen), and the Granger causality test. The findings suggest that the stock market of Iran is negatively correlated with most of the selected markets over the whole period, but contemporaneous correlations among the eight selected markets increased in the period of the financial crisis. Bilateral co-integration tests suggest that there is no integration between the Tehran Stock Exchange and the other selected markets. The other markets, except the Dubai and Abu Dhabi pair, are not co-integrated overall, but during the financial crisis, integration between the selected markets increased. Finally, investigation of the causal relationships among the eight financial markets suggests that there are unidirectional and bidirectional causal relationships among some of the stock market indices.
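
A Johansen co-integration test of the kind applied here is available in statsmodels; the sketch below runs it on two hypothetical index series (the study's daily data for the eight markets are not reproduced).

import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Hypothetical daily index levels for two of the eight markets
df = pd.read_csv("indices.csv", index_col="date",
                 parse_dates=True)[["tehran", "dubai"]]

# Johansen test: det_order=0 (constant term), k_ar_diff=1 lagged difference
res = coint_johansen(df, det_order=0, k_ar_diff=1)

# Trace statistic above the 5% critical value => reject "no co-integration"
print("trace statistics:   ", res.lr1)
print("5% critical values: ", res.cvt[:, 1])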

Keywords: financial crises, Middle East, stock market integration, Granger Causality test, ARDL test

Procedia PDF Downloads 377
25637 Structural Challenges of Social Integration of Immigrants in Iran: Investigating the Status of Providing Citizenship and Social Services

Authors: Iman Shabanzadeh

Abstract:

In terms of its geopolitical position, Iran has been one of the main centers of migration movements in the world in recent decades. However, policy makers' failure to complete the cycle of social integration of these immigrants, especially the second and third generations, has left these people perennially prone to leaving the country and emigrating to developed, industrialized countries. In this research, the integration of immigrants in Iran is analyzed from the perspective of four indicators: "Identity Documents", "Access to Banking Services", "Access to Health and Treatment Services" and "Obtaining a Driver's License". The research method is descriptive-analytical. To collect information, library and documentary sources on the laws and regulations related to immigrants' rights in Iran were combined with semi-structured interviews with experts. The investigations of this study show that none of the residence documents available to immigrants in Iran guarantee the full enjoyment of basic citizenship rights. In fact, the function of many of these identity documents, such as the census card, the educational support card, etc., is only to prevent crossing the border, and none of them guarantee basic citizenship rights. Therefore, for many immigrants, the difference between legality and illegality lies only in the risk of crossing the border, and this has normalized illegal presence for them. Despite this, there seems to be no clear and coherent policy framework around the issue of foreign immigrants in the country. This policy incoherence can be clearly seen in the diversity and plurality of the identity and legal documents of foreign citizens present in the country and in the policy makers' lack of planning to integrate and organize the identity of this huge group. Examining the socioeconomic differences and inequalities between immigrants and the native Iranian population shows that immigrants have been poorly integrated into the structures of Iranian society from an economic and social point of view.

Keywords: immigrants, social integration, citizen services, structural inequality

Procedia PDF Downloads 32
25636 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition (DAQ) systems in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In this paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
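
The signal-handling mechanism at the heart of the DAQ Debugger can be illustrated in a few lines of Python; the real tool is built into the iFDAQ's C++/Qt processes, so this sketch only shows the principle of generating a report from a handler instead of attaching a conventional debugger.

import signal
import sys
import traceback

# Minimal signal-based crash reporter in the spirit of the DAQ Debugger:
# on a fatal signal, write where the process was, then exit.
def write_report(signum, frame):
    with open("daq_debug_report.txt", "w") as report:
        report.write(f"caught signal {signum}\n")
        traceback.print_stack(frame, file=report)   # stack at the failure point
    sys.exit(1)

# Register handlers for fatal signals instead of attaching a debugger
for sig in (signal.SIGSEGV, signal.SIGABRT, signal.SIGTERM):
    signal.signal(sig, write_report)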

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 268
25635 Development of Sustainable Building Environmental Model (SBEM) in Hong Kong

Authors: Kwok W. Mui, Ling T. Wong, F. Xiao, Chin T. Cheung, Ho C. Yu

Abstract:

This study addresses the concept of the Sustainable Building Environmental Model (SBEM), developed to optimize energy consumption in air conditioning and ventilation (ACV) systems without any deterioration of indoor environmental quality (IEQ). The SBEM incorporates two main components: an adaptive comfort temperature control module (ACT) and a new carbon dioxide demand control module (nDCV). These two modules take an innovative approach to maintaining IEQ satisfaction with optimum energy consumption and provide a rational basis for effective control. A total of 2133 sets of measurements of indoor air temperature (Ta), relative humidity (Rh), and carbon dioxide concentration (CO2) were collected in Hong Kong offices to investigate the potential of integrating the SBEM. A simulation was used to evaluate the dynamic performance of the energy and air conditioning system with the SBEM integrated in an air-conditioned building. The simulation gives a clear picture of the control strategies and allows the controllers to be pre-tuned before use in real systems. With the integration of the SBEM, savings of up to 12.3% in simulation and 15% in field measurements of overall electricity consumption were achieved, while the average carbon dioxide concentration was kept within 1000 ppm and occupant dissatisfaction within 20%.
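
The nDCV idea, modulating fresh air against a CO2 target, can be shown with a toy proportional rule; the setpoint, gain, and damper limits below are illustrative assumptions rather than the tuned values from the study.

# Toy demand-controlled ventilation rule in the spirit of the nDCV module:
# modulate the fresh-air damper against a CO2 setpoint.
CO2_SETPOINT_PPM = 1000.0
GAIN = 0.002            # proportional gain: damper fraction per ppm of error

def ndcv_damper_position(co2_ppm, minimum=0.2):
    """Return fresh-air damper fraction in [minimum, 1.0]."""
    error = co2_ppm - CO2_SETPOINT_PPM
    position = minimum + GAIN * max(error, 0.0)   # open only above setpoint
    return min(position, 1.0)

for co2 in (600, 950, 1100, 1400):
    print(f"CO2 {co2:5d} ppm -> damper {ndcv_damper_position(co2):.2f}")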

Keywords: sustainable building environmental model (SBEM), adaptive comfort temperature (ACT), new demand control ventilation (nDCV), energy saving

Procedia PDF Downloads 626