Search results for: knowledge complexity
1173 The Neglected Elements of Implementing Strategic Succession Management in Public Organizations
Authors: François Chiocchio, Mahshid Gharibpour
Abstract:
Regardless of the extent to which succession management is implemented in the private sector, it is still overlooked in the public sector. Traditional succession management is evolving, providing better alignment between business strategies and HR strategies. Succession management brings sustainable effectiveness to succession programs through career path development, knowledge and skill transfer, job retention, as well as the empowerment of high-potential candidates for upcoming vacancies. By way of a systematic literature review, we bring strategic succession management in public organizations into focus and discuss the best ways of implementing it.
Keywords: Succession management, strategic succession management, public organization, succession management model.
1172 Hybrid Knowledge Approach for Determining Health Care Provider Specialty from Patient Diagnoses
Authors: Erin Lynne Plettenberg, Jeremy Vickery
Abstract:
In an access-control situation, the role of a user determines whether a data request is appropriate. This paper combines vetted web mining and logic modeling to build a lightweight system for determining the role of a health care provider based only on their prior authorized requests. The model identifies provider roles with 100% recall from very little data. This shows the value of vetted web mining in AI systems, and suggests the impact of the ICD classification on medical practice.
Keywords: Ontology, logic modeling, electronic medical records, information extraction, vetted web mining.
1171 A Model to Determine Atmospheric Stability and its Correlation with CO Concentration
Authors: Kh. Ashrafi, Gh. A. Hoshyaripour
Abstract:
Atmospheric stability plays the most important role in the transport and dispersion of air pollutants. Different methods are used for stability determination, with varying degrees of complexity. Most of these methods are based on the relative magnitude of convective and mechanical turbulence in atmospheric motions. The Richardson number, the Monin-Obukhov length, the Pasquill-Gifford stability classification and the Pasquill–Turner stability classification are the most common parameters and methods. The Pasquill–Turner Method (PTM), which is employed in this study, makes use of observations of wind speed, insolation and the time of day to classify atmospheric stability with distinguishable indices. In this study, a model is presented for determining atmospheric stability conditions using PTM. As a case study, meteorological data from the Mehrabad station in Tehran from 2000 to 2005 are applied to the model. Three different categories are considered to deduce the pattern of stability conditions. First, the overall pattern of stability classification is obtained; the results show that the atmosphere is at stable, neutral and unstable conditions 38.77%, 27.26% and 33.97% of the time, respectively. It is also observed that days are mostly unstable (66.50%) while nights are mostly stable (72.55%). Second, monthly and seasonal patterns are derived; the results indicate that the relative frequency of stable conditions decreases from January to June and increases from June to December, while the results for unstable conditions behave in exactly the opposite manner. Autumn is the most stable season, with a relative frequency of stable conditions of 50.69%, compared with 42.79%, 34.38% and 27.08% for winter, summer and spring, respectively. The hourly stability pattern is the third category; it shows that unstable conditions are dominant from approximately 03-15 GMT and 04-12 GMT for the warm and cold seasons, respectively. Finally, the correlation between atmospheric stability and CO concentration is obtained.
Keywords: Atmospheric stability, Pasquill-Turner classification, convective turbulence, mechanical turbulence, Tehran.
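The PTM described above maps routine observations to stability classes. As an illustration only, the sketch below implements a simplified Pasquill–Gifford-style lookup in Python; the thresholds follow commonly published daytime/night-time dispersion tables and are not the exact Pasquill–Turner indices used by the authors.

```python
# Simplified Pasquill-Gifford-style stability lookup (illustrative sketch).
# Thresholds follow commonly cited dispersion tables, not the exact
# Pasquill-Turner indices used in the study.

def stability_class(wind_speed_ms, insolation=None, night_cloud_octas=None):
    """Return a Pasquill stability class (A = very unstable ... F = very stable).

    insolation: 'strong', 'moderate' or 'slight' (daytime only).
    night_cloud_octas: cloud cover in octas (night-time only).
    """
    day_table = {  # columns correspond to the wind-speed bins below
        'strong':   ['A', 'A', 'B', 'C', 'C'],
        'moderate': ['A', 'B', 'B', 'C', 'D'],
        'slight':   ['B', 'C', 'C', 'D', 'D'],
    }
    bins = [2.0, 3.0, 5.0, 6.0]            # m/s class boundaries
    idx = sum(wind_speed_ms >= b for b in bins)

    if insolation is not None:             # daytime
        return day_table[insolation][idx]
    # night-time: mostly stable at low wind, tending to neutral as wind rises
    if night_cloud_octas is not None and night_cloud_octas >= 4:
        return ['F', 'E', 'D', 'D', 'D'][idx]
    return ['F', 'F', 'E', 'D', 'D'][idx]

print(stability_class(2.5, insolation='strong'))     # 'A'
print(stability_class(4.0, night_cloud_octas=2))     # 'E'
```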
1170 Mass Transfer Modeling in a Packed Bed of Palm Kernels under Supercritical Conditions
Authors: I. Norhuda, A. K. Mohd Omar
Abstract:
Gas-solid mass transfer using supercritical CO2 (SC-CO2) in a packed bed of palm kernels was investigated at operating temperatures of 50 °C and 70 °C and at pressures of 27.6 MPa, 34.5 MPa, 41.4 MPa and 48.3 MPa. The development of mass transfer models requires knowledge of three properties: the diffusion coefficient of the solute, and the viscosity and density of the supercritical fluid (SCF). A mathematical model expressed in terms of the dimensionless Sherwood (Sh), Schmidt (Sc) and Reynolds (Re) numbers was developed. The model developed was found to be in good agreement with the experimental data within the system studied.
Keywords: Mass Transfer, Palm Kernel, Supercritical fluid.
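For context on the Sh–Sc–Re correlation form mentioned in the abstract, the sketch below computes the dimensionless groups from fluid properties and plugs them into a generic power-law correlation; the correlation coefficients and the SC-CO2 property values are placeholders, not values fitted or measured in this study.

```python
# Illustrative computation of the dimensionless groups used in packed-bed
# mass-transfer correlations. Sh = a * Re**b * Sc**(1/3) is a generic
# power-law form; a, b and the fluid properties are placeholders.

def reynolds(rho, u, d_p, mu):
    """Particle Reynolds number from density, superficial velocity,
    particle diameter and dynamic viscosity (SI units)."""
    return rho * u * d_p / mu

def schmidt(mu, rho, D_ab):
    """Schmidt number from viscosity, density and binary diffusion coefficient."""
    return mu / (rho * D_ab)

def sherwood(Re, Sc, a=0.38, b=0.83):
    """Generic packed-bed correlation Sh = a * Re**b * Sc**(1/3) (placeholder a, b)."""
    return a * Re**b * Sc**(1.0 / 3.0)

def mass_transfer_coefficient(Sh, D_ab, d_p):
    """Recover the film coefficient k_f from Sh = k_f * d_p / D_ab."""
    return Sh * D_ab / d_p

# Rough, representative SC-CO2 properties (illustrative only).
rho, mu, D_ab = 870.0, 8.0e-5, 6.0e-9   # kg/m3, Pa.s, m2/s
u, d_p = 1.0e-3, 5.0e-3                 # m/s superficial velocity, m particle size
Re, Sc = reynolds(rho, u, d_p, mu), schmidt(mu, rho, D_ab)
Sh = sherwood(Re, Sc)
print(Re, Sc, Sh, mass_transfer_coefficient(Sh, D_ab, d_p))
```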
1169 The Two Layers of Food Safety and GMOs in the Hungarian Agricultural Law
Authors: Gergely Horváth
Abstract:
The study presents the complexity of food safety by dividing it into two layers. Beyond the basic layer of requirements, there is a more demanding higher level linked with quality and purity aspects. It would be important to give special prominence to both layers, given that massive illnesses are caused by foods even though they are officially licensed. The study then discusses an exciting safety challenge stemming from the risks of genetically modified organisms (GMOs). Furthermore, it features legal case examples that illustrate how certain liability questions are solved, or not yet decided, in connection with the production of genetically modified crops. In addition, a special kind of land grabbing, more precisely land grabbing from non-GMO farming systems, can also be noticed, as well as a new phenomenon eroding food sovereignty. Coexistence, the state in which organic, conventional, and GM farming systems stand alongside each other, is an unsuitable experiment that cannot be successful for biophysical reasons (such as cross-pollination). Agricultural and environmental lawyers both try to find the optimal solution. Agri-environmental measures are introduced as a special subfield of law that also maintains food safety. The important steps of agri-environmental legislation aim at the protection of natural values and the environmental media, and at strengthening food safety as well, practically the quality of agricultural products intended for human consumption. The major findings of the study focus on searching for the appropriate approach capable of solving the security and safety problems of food production. The most interesting concepts of Hungarian national and EU food law legislation are analyzed in more detail with descriptive, analytic and comparative methods.
Keywords: Food law, food safety, food security, GMO, agri-environmental measures.
1168 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game
Authors: Steven W. Carruthers
Abstract:
The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the scores on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college-level art history course (n=134) and a counterbalancing design to distribute both forms on the pre- and post-tests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between-group differences in post-test scores on test Form Q and Form R by full-factorial two-way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between-group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%. Mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resulting skewness and kurtosis worsened compared to the raw score parameters. Form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to pre-test score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to post-test. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
Keywords: Effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating.
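For readers unfamiliar with the raw-score methods compared above, the sketch below shows minimal linear and equipercentile equating of one form onto another; the score arrays are invented for illustration and the equipercentile version uses simple empirical quantiles with no smoothing.

```python
# Minimal linear and equipercentile equating sketch (illustrative data only).
import numpy as np

def linear_equate(new_scores, ref_scores):
    """Map new-form scores onto the reference-form scale so that the
    equated scores share the reference mean and standard deviation."""
    slope = np.std(ref_scores, ddof=1) / np.std(new_scores, ddof=1)
    intercept = np.mean(ref_scores) - slope * np.mean(new_scores)
    return slope * np.asarray(new_scores) + intercept

def equipercentile_equate(new_scores, ref_scores):
    """Map each new-form score to the reference-form score with the same
    percentile rank (simple empirical-quantile version, no smoothing)."""
    new_scores = np.asarray(new_scores, dtype=float)
    ranks = (np.argsort(np.argsort(new_scores)) + 0.5) / len(new_scores)
    return np.quantile(ref_scores, ranks)

# Hypothetical raw scores on Form R (new) and Form Q (reference).
form_r = np.array([12, 15, 18, 20, 22, 25, 27, 30])
form_q = np.array([14, 16, 19, 23, 24, 26, 29, 33])
print(linear_equate(form_r, form_q))
print(equipercentile_equate(form_r, form_q))
```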
1167 Method for Concept Labeling Based on Mapping between Ontology and Thesaurus
Authors: Kazuki Sonoda, Masahiro Hori
Abstract:
When designing information systems that deal with a large amount of domain knowledge, system designers need to consider ambiguities of labeling terms in the domain vocabulary for navigating users in the information space. The goal of this study is to develop a methodology for system designers to label navigation items, taking into account ambiguities that stem from synonyms or polysemes of labeling terms. In this paper, we propose a method for concept labeling based on mappings between a domain ontology and a thesaurus, and report the results of an empirical evaluation.
Keywords: Concept Labeling, Ontology, Thesaurus, Vocabulary Problem.
1166 Chaotic Dynamics of Cost Overruns in Oil and Gas Megaprojects: A Review
Authors: O. J. Olaniran, P. E. D. Love, D. J. Edwards, O. Olatunji, J. Matthews
Abstract:
Cost overruns are a persistent problem in oil and gas megaprojects. Whilst the extant literature is filled with studies on incidents and causes of cost overruns, underlying theories to explain their emergence in oil and gas megaprojects are few. Yet, a way to contain the syndrome of cost overruns is to understand the bases of ‘how and why’ they occur. Such knowledge will also help to develop pragmatic techniques for better overall management of oil and gas megaprojects. The aim of this paper is to explain the development of cost overruns in hydrocarbon megaprojects through the perspective of chaos theory. The underlying principles of chaos theory and its implications for cost overruns are examined and practical recommendations proposed. In addition, directions for future research in this fertile area are provided.
Keywords: Chaos theory, oil and gas, cost overruns, megaprojects.
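As a compact illustration of the sensitivity to initial conditions on which chaos theory rests, the sketch below iterates the textbook logistic map from two nearly identical starting points; the map and its parameter are standard teaching examples, not a cost-overrun model from the paper.

```python
# Logistic map: a textbook illustration of sensitive dependence on initial
# conditions, not a cost-overrun model from the paper.
def logistic_trajectory(x0, r=3.9, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.500000)
b = logistic_trajectory(0.500001)   # differs by 1e-6 at the start
for t in (0, 10, 20, 30):
    print(t, abs(a[t] - b[t]))      # the gap grows rapidly with t
```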
1165 Aromatic and Medicinal Plants in Morocco: Diversity and Socio-Economic Role
Authors: Mohammed Sghir Taleb
Abstract:
Morocco is characterized by great richness and diversity in aromatic and medicinal plants, and it has ancestral knowledge of the use of plants for medicinal and cosmetic purposes. In effect, the poverty of riparian and especially mountain populations has greatly contributed to the development of the traditional pharmacopoeia in Morocco. The analysis of the bibliographic data showed that a large number of plants in Morocco are exploited for aromatic and medicinal purposes and several of them are commercialized internationally. However, these aromatic and medicinal plant potentialities are currently subject to climate change and strong human pressures: fruit collecting, agricultural development, plant harvesting, urbanization, overgrazing, etc.
Keywords: Aromatic, medicinal, plants, socioeconomy, Morocco.
1164 The Characterisation of TLC NAND Flash Memory, Leading to a Definable Endurance/Retention Trade-Off
Authors: Sorcha Bennett, Joe Sullivan
Abstract:
Triple-Level Cell (TLC) NAND Flash memory at, and below, 20 nm is still largely unexplored by researchers, and with the ever more commonplace presence of Flash in consumer and enterprise applications there is a need for such gaps in knowledge to be filled. At the time of writing, there was little published data or literature on TLC, and more specifically on reliability testing, with a further emphasis on both endurance and retention. This paper gives an introduction to NAND Flash memory, followed by an overview of the relevant current research on the reliability of Flash memory, along with the planned future work which will provide results to help characterise the reliability of TLC memory.
Keywords: TLC NAND flash memory, reliability, endurance, retention, trade-off, raw flash, patterns.
1163 Estimation of Natural Frequency of the Bearing System under Periodic Force Based on Principal of Hydrodynamic Mass of Fluid
Authors: M. H. Pol, A. Bidi, A. V. Hoseini
Abstract:
Estimating the natural frequency of structures is very important; it usually cannot be calculated simply and is sometimes complicated. A lack of knowledge about it has caused severe damage and hazardous effects. In this paper, using two different models in the FEM method and based on the hydrodynamic mass of fluids, the natural frequency of a particular bearing (Fig. 1) in an electric field (or under a periodic force) is calculated for different stiffnesses and different geometries. Finally, the results of the two models and the analytical solution are compared.
Keywords: Natural frequency of the bearing, Hydrodynamic mass of fluid method.
1162 Versioning OWL Ontologies using Temporal Tags
Authors: Punam Bedi, Sudeep Marwaha
Abstract:
Ontologies play an important role in semantic web applications; they are often developed by different groups and continue to evolve over time. The knowledge in ontologies changes very rapidly, which makes applications outdated if they continue to use old versions, or unstable if they jump to new versions. Temporal frames using frame versioning and slot versioning are used to take care of the dynamic nature of ontologies. The paper proposes new tags and a restructured OWL format enabling applications to work with either the old or the new version of an ontology. Gene Ontology, a very dynamic ontology, has been used as a case study to explain the OWL Ontology with Temporal Tags.
Keywords: Frame and slot versioning, OWL, Ontology versioning, Semantic Web.
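To make the idea of attaching temporal validity to ontology terms concrete, the sketch below annotates an OWL class with hypothetical validFrom/validUntil tags using rdflib; this tag vocabulary is invented for illustration and is not the restructured OWL format proposed by the authors.

```python
# Hypothetical temporal tagging of an OWL class with rdflib. The ex:validFrom /
# ex:validUntil properties are invented for illustration; they are not the
# authors' proposed OWL extension.
from rdflib import Graph, Namespace, Literal, RDF, RDFS, OWL, XSD

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

term = EX.MitochondrialInheritance          # example term, Gene Ontology style
g.add((term, RDF.type, OWL.Class))
g.add((term, RDFS.label, Literal("mitochondrial inheritance")))
# temporal tags marking the version interval in which this definition is valid
g.add((term, EX.validFrom, Literal("2017-01-01", datatype=XSD.date)))
g.add((term, EX.validUntil, Literal("2019-06-30", datatype=XSD.date)))

print(g.serialize(format="turtle"))
```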
1161 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand
Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo
Abstract:
An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project’s likely construction costs (initial and final), and subsequent cost control activities should prevent unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, the observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and FTS of construction projects, with a view to identifying the risk factors responsible for the observed variance. Data were sourced through interviews, and risk factors were identified by using thematic analysis. Access was obtained to project files from the records of study participants (consultant quantity surveyors), and document analysis was employed to complement the responses from the interviews. Study findings revealed discrepancies between ECPs and FTS in the range of -14% to +16%. It is opined in this study that the identified risk factors were responsible for the variability observed. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by quantity surveyors. Further, whilst inherent risks in construction project developments are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and successful delivery of construction projects. The findings contribute significantly by providing quantitative confirmation of the theoretical conclusions generated in the literature from around the world. This therefore adds to and consolidates existing knowledge.
Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.
1160 A Robust Method for Finding Nearest-Neighbor using Hexagon Cells
Authors: Ahmad Attiq Al-Ogaibi, Ahmad Sharieh, Moh’d Belal Al-Zoubi, R. Bremananth
Abstract:
In pattern clustering, nearest-neighbor point computation is a challenging issue for many applications in areas of research such as remote sensing, computer vision, pattern recognition and statistical imaging. Nearest-neighbor computation is essential for providing sufficient classification among the volume of pixels (voxels) in order to localize the active region of interest (AROI). Furthermore, it is needed to compute spatial metric relationships of diverse imaging areas based on pattern-recognition applications. In this paper, we propose a new methodology for finding the nearest-neighbor point, based on building a virtual grid of hexagon cells and then locating every point beneath them. An algorithm is suggested for minimizing the computation and improving the turnaround time of the process. The nearest-neighbor query points Φ are fetched by searching the hexagons holistically. Searching is repeated until an AROI Φ is expected. If any point Υ is located, then searching starts in the nearest hexagons in a circular way. The first hexagon is considered to be level 0 (L0) and the surrounding hexagons are level 1 (L1). If Υ is located in L1, then the search continues in the next level (L2) to ensure that Υ is the nearest neighbor of Φ. Based on the experimental results, we found that the proposed method has an advantage over traditional methods in terms of minimizing the time complexity required for searching for neighbors; in turn, the efficiency of classification is improved sufficiently.
Keywords: Hexagon cells, k-nearest neighbors, Nearest neighbor, Pattern recognition, Query pattern, Virtual grid.
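As an illustration of the hexagon-bucketing idea, the sketch below assigns points to hexagonal cells in axial coordinates and expands the search level by level from the query cell; the coordinate scheme, cell size and level-expansion policy are simplified illustrative choices, not the authors' algorithm.

```python
# Sketch of hexagon-cell bucketing for nearest-neighbour search, a simplified
# version of the idea in the abstract (axial hex coordinates; illustrative only).
import math
from collections import defaultdict

def to_hex(x, y, size):
    """Map a point to the axial (q, r) coordinates of its hexagon cell."""
    q = (math.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    cx, cz = q, r
    cy = -cx - cz
    rx, ry, rz = round(cx), round(cy), round(cz)        # cube rounding
    dx, dy, dz = abs(rx - cx), abs(ry - cy), abs(rz - cz)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return int(rx), int(rz)

class HexIndex:
    def __init__(self, points, size=1.0):
        self.size = size
        self.cells = defaultdict(list)                  # (q, r) -> points
        for p in points:
            self.cells[to_hex(p[0], p[1], size)].append(p)

    def _points_within(self, q0, r0, k):
        """All stored points whose cell lies within hex distance k of (q0, r0)."""
        for dq in range(-k, k + 1):
            for dr in range(max(-k, -dq - k), min(k, -dq + k) + 1):
                yield from self.cells.get((q0 + dq, r0 + dr), ())

    def nearest(self, query):
        if not self.cells:
            return None
        q0, r0 = to_hex(query[0], query[1], self.size)
        k = 0
        while not any(True for _ in self._points_within(q0, r0, k)):
            k += 1                                      # expand levels L0, L1, L2, ...
        # a hit at level k may not be the closest point, so also scan level k+1
        # before taking the minimum (cf. the L0/L1/L2 check in the abstract)
        cand = list(self._points_within(q0, r0, k + 1))
        return min(cand, key=lambda p: (p[0] - query[0]) ** 2 + (p[1] - query[1]) ** 2)

index = HexIndex([(0.2, 0.1), (3.5, 2.0), (-1.0, 4.2)], size=1.0)
print(index.nearest((3.0, 2.2)))    # -> (3.5, 2.0)
```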
1159 Linguistic Phenomena in Men and Women - TOT, FOK, Verbal Fluency
Authors: Ewa Szepietowska, Barbara Gawda, Agnieszka Gawda
Abstract:
The aim of this study is to describe the differences between women and men in the phenomena of feeling of knowing (FOK), tip of the tongue (TOT), and verbal fluency. Two studies are presented. The first included a group of 60 participants and focused on the analysis of FOK and TOT in men and women. The second study described the performance of 302 participants in verbal fluency tasks. Both studies showed that sex is not a significant predictor of linguistic abilities. Rather, the main factors influencing one’s linguistic ability were vocabulary and education. This study enriches the knowledge on mechanisms of memory and verbal production.
Keywords: Feeling of knowing, Tip of the tongue, Verbal fluency, Sex differences.
1158 Mining Educational Data to Analyze the Student Motivation Behavior
Authors: Kunyanuth Kularbphettong, Cholticha Tongsiri
Abstract:
This research aims to discover knowledge for analyzing student motivation behavior in e-Learning based on data mining techniques, in the case of the Information Technology for Communication and Learning course at Suan Sunandha Rajabhat University. The data mining techniques applied in this research include association rules and classification techniques. The results showed that using data mining techniques can indicate the important variables that influence student motivation behavior in e-Learning.
Keywords: Association rule mining, classification techniques, e-Learning, Moodle log, motivation behavior.
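Since the abstract names association rules as one of the mining techniques, the sketch below computes support and confidence for simple rules over made-up Moodle-style activity logs; the event names, sessions and thresholds are illustrative only, not the study's data or tooling.

```python
# Minimal association-rule mining over made-up activity logs (illustrative
# event names and thresholds; not the study's data or tooling).
from itertools import combinations

sessions = [  # one set of logged activities per student session
    {"view_lecture", "post_forum", "submit_quiz"},
    {"view_lecture", "submit_quiz"},
    {"view_lecture", "post_forum"},
    {"post_forum", "submit_quiz"},
    {"view_lecture", "post_forum", "submit_quiz"},
]

def support(itemset):
    return sum(itemset <= s for s in sessions) / len(sessions)

items = set().union(*sessions)
for a, b in combinations(sorted(items), 2):
    for lhs, rhs in ((a, b), (b, a)):
        sup = support({lhs, rhs})
        conf = sup / support({lhs})
        if sup >= 0.4 and conf >= 0.7:               # illustrative thresholds
            print(f"{lhs} -> {rhs}  support={sup:.2f} confidence={conf:.2f}")
```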
1157 On the Constructivist Teaching of Extensive Reading for English Majors
Authors: Haiyan Wang
Abstract:
Constructivism, the latest teaching and learning theory in western countries, is based on the premise that cognition (learning) is the result of "mental construction" and lays emphasis on the learner's active learning. Guided by constructivism, this thesis discusses a teaching plan and its application in an extensive reading course. In the extensive reading classroom, emphasis should be laid on the activation of students' prior knowledge, on grasping the skills of fast reading, and on the combination of reading and writing to check extracurricular reading. With these three factors supplementing each other, students' English reading ability can be improved effectively.
Keywords: Constructivism, extensive reading, constructivist teaching.
1156 Perceived Quality of Regional Products in MS Region
Authors: M. Stoklasa, H. Starzyczna, K. Matusinska
Abstract:
This article deals with the perceived quality of regional products in the Moravian-Silesian (MS) region of the Czech Republic. The research focused on finding out what consumers perceive as a quality product and what characteristics make a quality product. The data were obtained by a questionnaire survey and analysed with IBM SPSS. From thousands of respondents, a representative sample of 719 for the MS region was created based on the demographic factors of gender, age, education and income. The research analysis disclosed that consumers in the MS region are still price oriented and that the preference of quality over price does not depend on regional brand knowledge.
Keywords: Regional brands, quality products, characteristics of quality, quality over price.
1155 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model
Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li
Abstract:
Compared with terrestrial networks, the traffic of a spatial information network has both self-similarity and short-correlation characteristics. By studying its traffic prediction method, the resource utilization of a spatial information network can be improved, and the method can provide an important basis for traffic planning of a spatial information network. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximate component with long correlation and a detail component with short correlation, and a time series hybrid prediction model based on wavelet decomposition is proposed to predict the spatial network traffic. First, the original traffic data are decomposed into approximate components and detail components by using a wavelet decomposition algorithm. According to the tailing-off and cut-off characteristics of the autocorrelation and partial autocorrelation functions of each component, the corresponding model (AR/MA/ARMA) of each detail component can be established directly, while the approximate component can be modeled with an ARIMA model after smoothing. Finally, the prediction results of the multiple models are combined to obtain the prediction results for the original data. The method not only considers the self-similarity of a spatial information network, but also takes into account the short correlation caused by bursty network information, which is verified by using the measured data of a backbone network released by the MAWI working group in 2018. Compared with the typical time series model, the predicted data of the hybrid model are closer to the real traffic data and have a smaller relative root mean square error, which makes the hybrid model more suitable for a spatial information network.
Keywords: Spatial Information Network, Traffic prediction, Wavelet decomposition, Time series model.
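A minimal sketch of the wavelet-plus-time-series hybrid idea is given below, assuming PyWavelets and statsmodels are available; the wavelet, decomposition level, ARIMA orders and the toy traffic trace are illustrative choices, not the configuration fitted in the paper.

```python
# Hybrid wavelet + time-series forecasting sketch (illustrative wavelet,
# level and model orders; not the configuration fitted in the paper).
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

def wavelet_split(series, wavelet="db4", level=3):
    """Split a series into a long-correlation approximation component and a
    short-correlation detail component (detail = original - approximation)."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    approx_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    approx = pywt.waverec(approx_coeffs, wavelet)[: len(series)]
    return approx, series - approx

def hybrid_forecast(series, steps=10):
    approx, detail = wavelet_split(np.asarray(series, dtype=float))
    # approximation: differenced ARIMA; detail: short-memory ARMA (d = 0)
    f_approx = ARIMA(approx, order=(2, 1, 1)).fit().forecast(steps)
    f_detail = ARIMA(detail, order=(1, 0, 1)).fit().forecast(steps)
    return f_approx + f_detail       # recombine the component forecasts

# Toy traffic trace: trend + daily cycle + bursty noise (illustrative only).
t = np.arange(512)
traffic = 100 + 0.05 * t + 10 * np.sin(2 * np.pi * t / 96) + np.random.gamma(2.0, 2.0, 512)
print(hybrid_forecast(traffic, steps=5))
```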
1154 Cultivating a Successful Academic Career in Higher Education Institutes: The 10 X C Model
Authors: S. Zamir
Abstract:
The modern era has brought with it significant organizational changes. These changes have not bypassed the academic world, and along with the old academic bonds that include a world of knowledge and ethics, academic faculty members are required more than ever not only to survive in the academic world, but also to thrive and flourish and to position themselves as modern and opinionated academicians. Based upon the writings of organizational consultants, the article suggests a 10 X C model for cultivating an academic backbone, and emphasizes its contribution to the professional growth of university and college academics: Competence, Calculations of pain & gain, Character, Commitment, Communication, Curiosity, Coping, Courage, Collaboration and Celebration.
Keywords: Academic career, academicians, higher education, the 10xC Model.
1153 Data Preprocessing for Supervised Learning
Authors: S. B. Kotsiantis, D. Kanellopoulos, P. E. Pintelas
Abstract:
Many factors affect the success of Machine Learning (ML) on a given task. The representation and quality of the instance data are first and foremost. If there is much irrelevant and redundant information present, or the data are noisy and unreliable, then knowledge discovery during the training phase is more difficult. It is well known that data preparation and filtering steps take a considerable amount of processing time in ML problems. Data pre-processing includes data cleaning, normalization, transformation, feature extraction and selection, etc. The product of data pre-processing is the final training set. It would be nice if a single sequence of data pre-processing algorithms had the best performance on every data set, but this does not happen. Thus, we present the most well-known algorithms for each step of data pre-processing so that one can achieve the best performance for their data set.
Keywords: Data mining, feature selection, data cleaning.
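To make the pre-processing steps named above concrete, the sketch below chains cleaning (imputation), normalization (scaling) and feature selection in a scikit-learn pipeline; the specific estimators and the toy data are illustrative choices, not the survey's recommendations.

```python
# Illustrative pre-processing pipeline: imputation, scaling, feature selection,
# then a classifier. Estimator choices and toy data are for demonstration only.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression

X = np.array([[1.0, 200.0, np.nan, 3.0],
              [2.0, 180.0, 0.5, np.nan],
              [1.5, np.nan, 0.4, 2.5],
              [3.0, 260.0, 0.9, 4.0],
              [2.5, 240.0, 0.8, 3.5],
              [1.2, 190.0, 0.3, 2.8]])
y = np.array([0, 0, 0, 1, 1, 0])

pipeline = Pipeline([
    ("clean", SimpleImputer(strategy="median")),    # data cleaning
    ("scale", StandardScaler()),                    # normalization
    ("select", SelectKBest(f_classif, k=2)),        # feature selection
    ("model", LogisticRegression()),
])
pipeline.fit(X, y)
print(pipeline.predict([[2.8, 250.0, 0.85, 3.8]]))
```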
1152 Project Management Success for Contractors
Authors: Hamimah Adnan, Norfashiha Hashim, Mohd Arif Marhani, Mohd Asri Yeop Johari
Abstract:
The aim of this paper is to provide a better understanding of the implementation of project management practices by UiTM contractors to ensure project success. A questionnaire survey was administered to 120 UiTM contractors in Malaysia. The purpose of this method was to gather information on the contractors' project background and project management skills. It was found that all of the contractors had basic knowledge and understanding of project management skills. It is suggested that a reasonable project plan and an appropriate organizational structure are influential factors for project success. It is recommended that contractors have an effective programme of work, and the need for an up-to-date information system is emphasized.
Keywords: Project management, success, contractors.
1151 Retrieval Augmented Generation against the Machine: Merging Human Cyber Security Expertise with Generative AI
Authors: Brennan Lodge
Abstract:
Amidst a complex regulatory landscape, Retrieval Augmented Generation (RAG) emerges as a transformative tool for Governance Risk and Compliance (GRC) officers. This paper details the application of RAG in synthesizing Large Language Models (LLMs) with external knowledge bases, offering GRC professionals an advanced means to adapt to rapid changes in compliance requirements. While the development of standalone LLMs is exciting, such models do have their downsides. LLMs cannot easily expand or revise their memory, they cannot straightforwardly provide insight into their predictions, and they may produce “hallucinations.” Leveraging a pre-trained seq2seq transformer and a dense vector index of domain-specific data, this approach integrates real-time data retrieval into the generative process, enabling gap analysis and the dynamic generation of compliance and risk management content. We delve into the mechanics of RAG, focusing on its dual structure that pairs parametric knowledge contained within the transformer model with non-parametric data extracted from an updatable corpus. This hybrid model enhances decision-making through context-rich insights, drawing from the most current and relevant information, thereby enabling GRC officers to maintain a proactive compliance stance. Our methodology aligns with the latest advances in neural network fine-tuning, providing a granular, token-level application of retrieved information to inform and generate compliance narratives. By employing RAG, we exhibit a scalable solution that can adapt to novel regulatory challenges and cybersecurity threats, offering GRC officers a robust, predictive tool that augments their expertise. The granular application of RAG’s dual structure not only improves compliance and risk management protocols but also informs the development of compliance narratives with pinpoint accuracy. It underscores AI’s emerging role in strategic risk mitigation and proactive policy formation, positioning GRC officers to anticipate and navigate the complexities of regulatory evolution confidently.
Keywords: Retrieval Augmented Generation, Governance Risk and Compliance, Cybersecurity, AI-driven Compliance, Risk Management, Generative AI.
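As a minimal sketch of the retrieval step that RAG places in front of generation, the code below embeds a small compliance corpus, retrieves the most similar passages for a query by cosine similarity, and assembles them into a prompt; embed() is a placeholder for any dense sentence encoder, and the corpus and query are invented examples rather than the paper's system.

```python
# Minimal retrieval-augmented prompt assembly. embed() is a placeholder for a
# real sentence-embedding model; the corpus and query are invented examples.
import numpy as np

def embed(texts):
    """Placeholder embedding: hashed bag-of-words vectors. Swap in a real
    dense encoder (e.g. a sentence-transformer) in practice."""
    vecs = np.zeros((len(texts), 256))
    for i, t in enumerate(texts):
        for tok in t.lower().split():
            vecs[i, hash(tok) % 256] += 1.0
    return vecs / np.maximum(np.linalg.norm(vecs, axis=1, keepdims=True), 1e-9)

corpus = [
    "Access reviews must be completed quarterly for privileged accounts.",
    "Encryption at rest is required for all customer data stores.",
    "Incident reports must be filed within 72 hours of detection.",
]
doc_vecs = embed(corpus)

def retrieve(query, k=2):
    scores = doc_vecs @ embed([query])[0]          # cosine similarity (unit vectors)
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

query = "How quickly do we have to report a security incident?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
print(prompt)   # this prompt would then be passed to the generative model
```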
1150 Descriptive Study of Libyan Steles of Grande Kabylia, Algeria
Authors: Samia Ait Ali Yahia
Abstract:
The Libyan steles contain a good number of inscriptions. We find them on blocks of sandstone in the northern part of Grande Kabylia, Algeria. Three recently discovered Libyan steles are added to the currently known and published documents, which enriches the Libyan heritage of this region. The aim of this article is to make a descriptive study of the Libyan inscriptions of these steles in order to better understand the characteristics of each stele by comparing them with the different steles already known in the region. It is certain that if other similar specimens were to be added to those we already possess, knowledge of Libyan would gradually become clearer. The Kabylia region is certainly full of these remains that have not yet been brought to light.
Keywords: Libyan stele, Libyan inscriptions, Paintings, Engraving, Kabylia.
1149 Achieving Implementable Nature-Based Solutions While Reshaping Architectural Education: A Case Study of URBiNAT and BUILD Solutions
Authors: C. Farinea, A. Conserva, F. Demeur
Abstract:
Nature has often been something humans have fought against. However, with the changing climate and urban challenges such as air pollution and food shortages, to name but a few, it has never been more crucial to work with nature to find solutions that can help us to adapt to the current planetary situation and mitigate the challenges that we will continue to face in the future. Nature-based solutions (NBS) have been gaining ground as one strategy that can help to create more sustainable solutions for our planet and, simultaneously, provide several ecosystem services. As designers, we can extract and gain many insights from nature. However, nature is a complex and sometimes difficult-to-predict system, and its implementation in cities requires multidisciplinary knowledge. To keep up with these solutions and prepare future generations of architects and designers with the skills to implement NBS, educational systems also have to adapt to the times. Architecture is no longer solely about drawing buildings with beautiful forms. It is no longer discipline-bound. With input from different disciplines, the implementation of NBS can be significantly more successful. Transdisciplinary strategies can encourage architects and designers to think beyond their discipline and ensure the success and realization of the NBS. The paper will demonstrate how transdisciplinary teaching methodologies, including taking part in participatory processes with experts as a way of gathering local knowledge, can be implemented with architectural master's students to achieve implementable NBS. Through two projects co-funded by the European Union, strategies such as participatory co-design and transdisciplinary start-ups were implemented in seminars that focused on the development of NBS with a transdisciplinary approach. Within the “Design with Living Systems” seminar, students took part in participatory co-design strategies with experts to design solutions that will be implemented in Porto as part of a healthy corridor, and that respond to the needs of the users and site. On the other hand, within the “Design for Living Systems” seminar, the transdisciplinary start-up approach created start-ups with students of architecture, business and biology focusing on identifying a problem and designing an NBS as a product. Both seminars proved to be successful in achieving implementable NBS through strategies of transdisciplinary education and gave the students the skill sets to be able to work with nature in their future careers.
Keywords: Architectural higher education, digital fabrication, nature-based solutions, transdisciplinary approaches.
1148 Exploring Unexplored Horizons: Advanced Fluid Mechanics Solutions for Sustainable Energy Technologies
Authors: Elvira S. Castillo, Surupa Shaw
Abstract:
This paper explores advanced applications of fluid mechanics in the context of sustainable energy. By examining the integration of fluid dynamics with renewable energy technologies, the research uncovers previously underutilized strategies for improving efficiency. Through theoretical analyses, the study demonstrates how fluid mechanics can be harnessed to optimize renewable energy systems. The findings contribute to expanding knowledge in sustainable energy by offering practical insights and methodologies for future research and technological advancements to address global energy challenges.
Keywords: Fluid mechanics, sustainable energy, energy efficiency, green energy.
1147 Multipath Routing Protocol Using Basic Reconstruction Routing (BRR) Algorithm in Wireless Sensor Network
Authors: K. Rajasekaran, Kannan Balasubramanian
Abstract:
A sensor network consists of multiple detection locations called sensor nodes, each of which is tiny, lightweight and portable. Single-path routing protocols in wireless sensor networks can lead to holes in the network, since only the nodes present in the single path are used for data transmission. Apart from advantages like reduced computation, complexity and resource utilization, there are drawbacks like lower throughput, increased traffic load and delay in data delivery. Therefore, multipath routing protocols are preferred for WSNs. Distributing the traffic among multiple paths increases the network lifetime. We propose a scheme in which the data are transmitted through a dominant path to save energy. In order to obtain a high delivery ratio, a basic route reconstruction protocol is utilized to reconstruct the path whenever a failure is detected. A basic reconstruction routing (BRR) algorithm is proposed, in which a node can leap over a path failure by using the already existing routing information from its neighbourhood while the collected data are transmitted from the source to the sink. In order to save energy and attain a high data delivery ratio, data are transmitted along multiple paths, which is achieved by the BRR algorithm whenever a failure is detected. Further, an analysis of how the proposed protocol overcomes the drawbacks of the existing protocols is presented. The performance of our protocol is compared to AOMDV and the energy efficient node-disjoint multipath routing protocol (EENDMRP). The system is implemented using NS-2.34. The simulation results show that the proposed protocol has a high delivery ratio with low energy consumption.
Keywords: Multipath routing, WSN, energy efficient routing, alternate route, assured data delivery.
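As an illustration of the local route-reconstruction idea, the sketch below keeps a next-hop table per node and, when the preferred next hop has failed, falls back to an alternative neighbour already known to reach the sink; the topology and table contents are invented, and this is not the authors' NS-2 implementation.

```python
# Toy local route reconstruction: when the preferred next hop has failed,
# fall back to another neighbour that is already known to reach the sink.
# Topology and tables are invented; this is not the authors' NS-2 code.

# next_hops[node] lists candidate next hops toward the sink, best first.
next_hops = {
    "S": ["A", "B"],
    "A": ["C"],
    "B": ["C", "D"],
    "C": ["SINK"],
    "D": ["SINK"],
}
failed = {"A"}          # nodes currently detected as failed

def forward(packet, node="S", path=None):
    path = (path or []) + [node]
    if node == "SINK":
        return path
    for nxt in next_hops.get(node, []):
        if nxt not in failed and nxt not in path:   # leap over the failure
            result = forward(packet, nxt, path)
            if result:
                return result
    return None                                     # no route from this node

print(forward({"seq": 1}))   # e.g. ['S', 'B', 'C', 'SINK']
```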
1146 Heuristics Analysis for Distributed Scheduling using MONARC Simulation Tool
Authors: Florin Pop
Abstract:
Simulation is a very powerful method used for high-performance and high-quality design in distributed systems, and currently perhaps the only one, considering the heterogeneity, complexity and cost of distributed systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time consuming. Scalability, reliability and fault-tolerance become important requirements for distributed systems in order to support distributed computation. A distributed system with such characteristics is called dependable. Large environments, like Cloud, offer unique advantages, such as low cost and dependability, and satisfy QoS for all users. Resource management in large environments addresses performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while considering at the same time the independent capabilities of resources. This analysis acts like a profiling step for algorithm calibration. The performance evaluation is based on simulation. The simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and sustain empirically based decisions. The results could be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various actual situations.
Keywords: Scheduling, Simulation, Performance Evaluation, QoS, Distributed Systems, MONARC.
1145 Integrated Reasoning Approach for Car Faulty Diagnosis
Authors: Diana M.L. Wong
Abstract:
This paper presents an integrated case-based and rule-based reasoning method for car fault diagnosis. The reasoning is done by extracting past cases from the Proton Service Center and comparing them with preset rules to deduce a diagnosis/solution for a car service case. New cases are stored in the knowledge base. The test case examples illustrate the effectiveness of the proposed integrated reasoning. It has proven accuracy similar to reasoning carried out by a service advisor from the service center.
Keywords: Case based reasoning (CBR), rule based reasoning (RBR), decision support systems, diagnosis tool.
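A minimal sketch of combining case retrieval with rules is shown below: preset rules are checked first, and otherwise the closest past case (by symptom overlap) proposes the diagnosis; the cases, symptoms and rules are invented for illustration and do not come from the Proton Service Center data.

```python
# Toy integrated CBR + RBR diagnosis: apply preset rules first, otherwise
# retrieve the most similar past case. All cases, symptoms and rules are
# invented for illustration.

past_cases = [
    ({"engine_cranks", "no_start", "fuel_smell"}, "flooded engine"),
    ({"no_crank", "dash_lights_dim"}, "weak battery"),
    ({"overheating", "coolant_low"}, "coolant leak"),
]

rules = [  # (required symptoms, diagnosis) checked before case retrieval
    ({"no_crank", "dash_lights_off"}, "dead battery or broken ground strap"),
]

def diagnose(symptoms):
    symptoms = set(symptoms)
    # rule-based reasoning: exact preset rules take priority
    for required, diagnosis in rules:
        if required <= symptoms:
            return diagnosis, "rule"
    # case-based reasoning: most similar past case by Jaccard overlap
    def similarity(case):
        return len(case[0] & symptoms) / len(case[0] | symptoms)
    best = max(past_cases, key=similarity)
    past_cases.append((symptoms, best[1]))      # retain the new case
    return best[1], "case"

print(diagnose({"no_crank", "dash_lights_dim", "clicking_noise"}))
```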
1144 A Brief Study about Nonparametric Adherence Tests
Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim
Abstract:
Statistical studies have become indispensable in various fields of knowledge. Geotechnics is no different: the study of probabilistic and statistical methods has gained ground, considering their use in characterizing the uncertainties inherent in soil properties. One of the situations engineers constantly face is the definition of a probability distribution that represents the sampled data significantly. To be able to discard bad distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set in order to test its goodness of fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
Keywords: Kolmogorov-Smirnov, Anderson-Darling, Cramér-von Mises, Nonparametric adherence tests.
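The three tests named in the keywords are available in SciPy; the sketch below applies them to a synthetic sample against a fitted normal candidate distribution (the synthetic data and the choice of candidate are illustrative, not the paper's data set).

```python
# Applying the three nonparametric adherence (goodness-of-fit) tests from the
# keywords to a synthetic sample against a fitted normal distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=2.0, sigma=0.4, size=200)   # e.g. a soil property

mu, sigma = sample.mean(), sample.std(ddof=1)           # fit a normal candidate

ks = stats.kstest(sample, "norm", args=(mu, sigma))           # Kolmogorov-Smirnov
cvm = stats.cramervonmises(sample, "norm", args=(mu, sigma))  # Cramer-von Mises
ad = stats.anderson(sample, dist="norm")                      # Anderson-Darling

# Note: KS and CvM p-values are approximate here because the normal
# parameters were estimated from the same sample.
print("KS  D = %.3f, p = %.3f" % (ks.statistic, ks.pvalue))
print("CvM W = %.3f, p = %.3f" % (cvm.statistic, cvm.pvalue))
print("AD  A2 = %.3f, 5%% critical value = %.3f" % (ad.statistic, ad.critical_values[2]))
```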