Search results for: innovation maturity models
7745 Improving the Quantification Model of Internal Control Impact on Banking Risks
Authors: M. Ndaw, G. Mendy, S. Ouya
Abstract:
Risk management in the banking sector is a key issue linked to financial system stability, and its importance has been elevated by technological developments and the emergence of new financial instruments. In this paper, we improve the model previously defined for quantifying the impact of internal control on banking risks by automating the residual criticality estimation step of FMECA. To this end, we defined three equations and a maturity coefficient to obtain a mathematical model, which is tested on all banking processes and types of risks. The new model allows an optimal assessment of residual criticality and improves the correlation rate, which has reached 98%.
Keywords: risk, control, banking, FMECA, criticality
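The abstract does not reproduce the three equations, so as a rough illustration only: residual criticality in an FMECA can be sketched as the initial criticality (severity × occurrence × detection) scaled down by a control-maturity coefficient. The formula and the [0, 1] maturity scale below are assumptions for illustration, not the authors' model.

```python
def residual_criticality(severity, occurrence, detection, maturity):
    """Residual criticality after controls: the classic FMECA risk priority
    number scaled by a maturity coefficient in [0, 1].
    Illustrative formula only, not the paper's fitted model."""
    initial = severity * occurrence * detection  # initial criticality (RPN)
    return initial * (1.0 - maturity)  # mature controls reduce residual risk

# A fully immature control (0.0) leaves criticality unchanged;
# a highly mature one (0.75) removes three quarters of it.
print(residual_criticality(8, 5, 3, 0.0))   # 120.0
print(residual_criticality(8, 5, 3, 0.75))  # 30.0
```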
Procedia PDF Downloads 333
7744 Modeling Pan Evaporation Using Intelligent Methods of ANN, LSSVM and Tree Model M5 (Case Study: Shahroud and Mayamey Stations)
Authors: Hamidreza Ghazvinian, Khosro Ghazvinian, Touba Khodaiean
Abstract:
The importance of evaporation estimation in water resources and agricultural studies is undeniable. Pan evaporation is used worldwide as an indicator of the evaporation of lakes and reservoirs because its data are easy to interpret. In this research, intelligent models were investigated for estimating pan evaporation on a daily basis. Shahroud and Mayamey, two cities located in Semnan province, Iran, were considered as the study sites; both have dry climates susceptible to high evaporation potential. Eleven years of meteorological data from the synoptic stations of Shahroud and Mayamey were used. The intelligent models applied in this study are Artificial Neural Network (ANN), Least Squares Support Vector Machine (LSSVM), and M5 tree models. The meteorological parameters of minimum and maximum air temperature (Tmin, Tmax), wind speed (WS), sunshine hours (SH), air pressure (PA), and relative humidity (RH) were selected as input data, and pan evaporation (EP) was taken as the output. 70% of the data was used for training and 30% for testing. Models were evaluated using the coefficient of determination (R2), root mean square error (RMSE), and mean absolute error (MAE). The results for the Shahroud and Mayamey stations showed that all three models performed reasonably well.
Keywords: pan evaporation, intelligent methods, Shahroud, Mayamey
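The three evaluation metrics named above (R2, RMSE, MAE) can be computed from paired observed and predicted values with plain Python; the function below is a generic sketch, not code from the study, and the sample numbers are invented.

```python
import math

def evaluate(observed, predicted):
    """Return (R2, RMSE, MAE) for paired observed/predicted values."""
    n = len(observed)
    errors = [o - p for o, p in zip(observed, predicted)]
    mae = sum(abs(e) for e in errors) / n
    rmse = math.sqrt(sum(e ** 2 for e in errors) / n)
    mean_obs = sum(observed) / n
    ss_res = sum(e ** 2 for e in errors)            # residual sum of squares
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return r2, rmse, mae

r2, rmse, mae = evaluate([2.0, 4.0, 6.0], [2.5, 4.0, 5.5])
print(round(r2, 3), round(rmse, 3), round(mae, 3))  # 0.938 0.408 0.333
```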
Procedia PDF Downloads 74
7743 Leadership in the Emergence Paradigm: A Literature Review on the Medusa Principles
Authors: Everard van Kemenade
Abstract:
Many quality improvement activities are planned. Leaders are strongly involved in missions, visions and strategic planning. They use, consciously or unconsciously, the PDCA cycle, also known as the Deming cycle. After planning, the plans are carried out and the results or effects are measured. If the results show that the goals in the plan have not been achieved, adjustments are made in the next plan or in the execution of the processes. Then the cycle is run through again. Traditionally, the PDCA cycle is advocated as a means to an end. However, PDCA is especially fit for planned, ordered, certain contexts; it fits the empirical and referential quality paradigms. For uncertain, unordered, unplanned processes, something else might be needed instead of Plan-Do-Check-Act. Due to the complexity of our society, the influence of context, and the uncertainty in today's world, not every activity can be planned anymore. At the same time, organisations need to be more innovative than ever. This confronts leaders with 'wicked tendencies' and raises the question of how one can innovate without being able to plan. Complexity science studies the interactions of a diverse group of agents that bring about change in times of uncertainty, e.g. when radical innovation is co-created; this process is called emergence. This research study explores the role of leadership in the emergence paradigm. The aim of the article is to study the way leadership can support the emergence of innovation in a complex context. First, clarity is given on the concepts used in the research question: complexity, emergence, innovation and leadership. Thereafter, a literature search is conducted to answer the research question. The topics 'emergent leadership' and 'complexity leadership' are chosen for an exploratory search in Google and Google Scholar using the berry-picking method.
The exclusion criterion is emergence in disciplines other than organizational development, or in the meaning of 'arising'. The literature search yielded 45 hits. Twenty-seven articles were excluded after reading the title and abstract because they did not research the topic of emergent leadership and complexity. After reading the remaining articles in full, one more was excluded because it used 'emergent' in the limited meaning of 'arising', bringing the total of the search to 17 articles; eight more were then excluded because the topic did not match the research question of this article, leaving nine. The useful conclusions from the articles are merged and grouped under overarching topics using thematic analysis. The findings are that six topics prevail when looking at possibilities for leadership to facilitate innovation: enabling, sharing values, dreaming, interacting, context sensitivity and adaptivity. Together they form, in Dutch, the acronym Medusa.
Keywords: complexity science, emergence, leadership in the emergence paradigm, innovation, the Medusa principles
Procedia PDF Downloads 29
7742 Multilevel Modeling of the Progression of HIV/AIDS Disease among Patients under HAART Treatment
Authors: Awol Seid Ebrie
Abstract:
HIV results in an incurable disease, AIDS. After a person is infected with the virus, it gradually destroys the infection-fighting cells called CD4 cells and makes the individual susceptible to opportunistic infections, which cause severe or fatal health problems. Several studies show that the CD4 cell count is the most determinant indicator of the effectiveness of treatment and of the progression of the disease. The objective of this paper is to investigate the progression of the disease over time among patients under HAART treatment. Two main approaches of the generalized multilevel ordinal models, namely the proportional odds model and the nonproportional odds model, have been applied to the HAART data. The multilevel part of both models includes random intercepts and random coefficients. In total, four models are explored in the analysis and then compared using the deviance information criterion (DIC). Of these, the random coefficients nonproportional odds model is selected as the best model for the HAART data, as it has the smallest DIC value. The selected model shows that the progression of the disease increases as time under treatment increases. In addition, it reveals that gender, baseline clinical stage and functional status of the patient have a significant association with the progression of the disease.
Keywords: nonproportional odds model, proportional odds model, random coefficients model, random intercepts model
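For readers unfamiliar with the proportional odds formulation used above, category probabilities for an ordered outcome can be recovered from cumulative logits by differencing, as sketched below. The thresholds and linear predictor value are invented for illustration, not estimates from the HAART data.

```python
import math

def ordinal_probs(thresholds, eta):
    """Category probabilities under a proportional odds model:
    P(Y <= k) = logistic(theta_k - eta), differenced to give P(Y = k)."""
    logistic = lambda x: 1.0 / (1.0 + math.exp(-x))
    cum = [logistic(t - eta) for t in thresholds] + [1.0]  # cumulative probs
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Four ordered disease stages; a larger eta shifts mass toward higher stages.
probs = ordinal_probs([-1.0, 0.5, 2.0], eta=0.8)
print(round(sum(probs), 6))  # 1.0 -- probabilities sum to one by construction
```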
Procedia PDF Downloads 421
7741 Population Dynamics of Cyprinid Fish Species (Mahseer: Tor Species) and Its Conservation in Yamuna River of Garhwal Region, India
Authors: Davendra Singh Malik
Abstract:
India is one of the mega-biodiversity countries in the world, contributing about 11.72% of global fish diversity. The Yamuna river is the longest tributary of the Ganga river ecosystem, providing a natural habitat for the fish diversity of the Himalayan region of the Indian subcontinent. Several hydropower dams and barrages have been constructed at different locations on the major rivers of the Garhwal region. These dams pose a major ecological threat to existing freshwater ecosystems by altering water flows, interrupting ecological connectivity, and fragmenting the habitats of native riverine fish species. Mahseer fishes (Indian carp) of the genus Tor are large cyprinids endemic to continental Asia, popularly known as 'game or sport fishes', which continue to be decimated by the fragmentation of natural habitats caused by damming of riverine water flows, and are categorized as threatened fishes of India. A total of 24 freshwater fish species were recorded from the Yamuna river. The present fish catch data reveal that mahseer fishes (Tor tor and Tor putitora) contributed about 32.5%, 25.6% and 18.2% of the catch in the upper, middle and lower riverine stretches of the Yamuna river, respectively. The length range of 360-450 mm was the dominant mahseer size in the catch composition. The CPUE (catch per unit effort) of mahseer fishes also indicated a sharp decline in fish biomass and changes in growth pattern, sex ratio and maturity stages. Only 12.5-14.8% of female mahseer brooders showed mature phases in the breeding months. The fecundity of mature female brooders ranged from 2,500 to 4,500 ova during the breeding months. The present status of the mahseer fishery is attributed to overexploitation in the Yamuna river. The mahseer population is shrinking continuously in the downstream reaches of the Yamuna river due to the cumulative effects of various ecological stresses.
A mahseer conservation programme has been implemented as 'in situ fish conservation' to enhance the viable population size of mahseer species and to restore the lost genetic diversity of mahseer germplasm in the Yamuna river of the Garhwal Himalayan region.
Keywords: conservation practice, population dynamics, Tor fish species, Yamuna River
Procedia PDF Downloads 255
7740 Industry 4.0 Platforms as 'Cluster' Ecosystems for Small and Medium Enterprises (SMEs)
Authors: Vivek Anand, Rainer Naegele
Abstract:
Industry 4.0 is a global mega-trend revolutionizing the world of advanced manufacturing, but it also brings challenges for SMEs. In response, many regional as well as digital Industry 4.0 Platforms have been set up to boost the competencies of established enterprises as well as SMEs. The concept of 'Clusters' is a policy tool intended as a starting point for establishing sustainable and self-supporting structures in the industries of a region, by identifying competencies and supporting cluster actors with services that match their growth needs. This paper is motivated by the idea that Clusters have the potential to enable firms, particularly SMEs, to accelerate the innovation process and the transition to digital technologies. In this research, the efficacy of Industry 4.0 platforms as Cluster ecosystems is evaluated, especially for SMEs. Focusing on the Baden-Württemberg region in Germany, an action research method is employed to study how SMEs leverage other actors on Industry 4.0 Platforms to further their Industry 4.0 journeys. The aim is to evaluate how such Industry 4.0 platforms stimulate innovation, cooperation and competitiveness. Additionally, the barriers that keep these platforms from fulfilling their promise as capacity-building cluster ecosystems for SMEs in a region are identified. The findings will be helpful for academicians and policymakers alike, who can leverage a 'cluster policy' to enable Industry 4.0 ecosystems in their regions. Furthermore, relevant management and policy implications stem from the analysis. This will also be of interest to the various players in a cluster ecosystem - like SMEs and service providers - who benefit from the cooperation and competition. The paper improves the understanding of how a dialogue orientation, a bottom-up approach and the active integration of all involved cluster actors enhance the potential of Industry 4.0 Platforms.
A strong collaborative culture is a key driver of digital transformation and technology adoption across sectors, value chains and supply chains, and will position Industry 4.0 Platforms at the forefront of the industrial renaissance. Motivated by this argument and based on the results of the qualitative research, a roadmap is proposed to position Industry 4.0 Platforms as effective cluster ecosystems supporting Industry 4.0 adoption in a region.
Keywords: cluster policy, digital transformation, industry 4.0, innovation clusters, innovation policy, SMEs and startups
Procedia PDF Downloads 222
7739 Impact of Data and Model Choices on Urban Flood Risk Assessments
Authors: Abhishek Saha, Serene Tay, Gerard Pijcke
Abstract:
The availability of high-resolution topography and rainfall information in urban areas has made it necessary to revise the modeling approaches used for flood risk assessments. Lidar-derived elevation models with 1 m or finer resolution are becoming widely accessible. Classical 1D-2D approaches, in which channel flow is simulated and coupled with a coarse-resolution 2D overland flow model, may not fully utilize the information provided by high-resolution data. In this context, a study was undertaken to compare three different modeling approaches to simulating flooding in an urban area. The first is the base model, Sobek, which uses a 1D formulation together with hydrologic boundary conditions and couples it with a 2D overland flow model. The second uses a full 2D model for the entire area, solving the shallow water equations at the resolution of the digital elevation model (DEM). These models are compared against another 2D shallow water equation solver that uses a subgrid method for grid refinement. The models are run for DEM horizontal resolutions varying between 1 m and 5 m. The results show significant differences in inundation extents and water levels across DEMs. They are also sensitive to the choice of numerical model, even with identical physical parameters such as friction. The study shows the importance of having reliable field observations of inundation extents and levels before a choice of model and data is made for spatial flood risk assessments.
Keywords: flooding, DEM, shallow water equations, subgrid
Procedia PDF Downloads 141
7738 Development of Prediction Models of Day-Ahead Hourly Building Electricity Consumption and Peak Power Demand Using the Machine Learning Method
Authors: Dalin Si, Azizan Aziz, Bertrand Lasternas
Abstract:
To encourage building owners to purchase electricity on the wholesale market and reduce building peak demand, this study aims to develop models that predict day-ahead hourly electricity consumption and demand using an artificial neural network (ANN) and a support vector machine (SVM). All prediction models are built in Python with the scikit-learn and PyBrain libraries. The input data for both consumption and demand prediction are time stamp, outdoor dry bulb temperature, relative humidity, air handling unit (AHU) supply air temperature, and solar radiation. Solar radiation, which is unavailable a day ahead, is predicted first, and this estimate is then used as an input to predict consumption and demand. Separate SVM and ANN models are trained for consumption and demand, depending on the cooling or heating season and on weekdays or weekends. The results show that ANN is the better option for both consumption and demand prediction. It achieves a coefficient of variation of the root mean square error (CVRMSE) of 15.50% to 20.03% for consumption prediction and 22.89% to 32.42% for demand prediction, respectively. To conclude, the presented models have the potential to help building owners purchase electricity on the wholesale market, but they are not robust when used in demand response control.
Keywords: building energy prediction, data mining, demand response, electricity market
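CVRMSE, the headline metric above, is simply the RMSE normalized by the mean of the observations and expressed as a percentage; a generic sketch with invented sample values:

```python
import math

def cvrmse(observed, predicted):
    """Coefficient of variation of the RMSE, as a percentage of the observed mean."""
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return 100.0 * rmse / (sum(observed) / n)

# Predictions off by 1 unit around a mean load of 10 give CVRMSE = 10%.
print(cvrmse([10.0, 10.0, 10.0, 10.0], [9.0, 11.0, 9.0, 11.0]))  # 10.0
```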
Procedia PDF Downloads 316
7737 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models
Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue
Abstract:
Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised, as knowledge of the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Such reactions are therefore often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ (Kappa) models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user visually. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated against the model, leading to the highlighting of the nodes causing unsatisfiability (i.e. error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, in this research we present a framework that enables a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. Such a tool enables an end-user to determine which empirical analyses to perform to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.
Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation
Procedia PDF Downloads 255
7736 Optical and Double Folding Analysis for 6Li+16O Elastic Scattering
Authors: Abd Elrahman Elgamala, N. Darwish, I. Bondouk, Sh. Hamada
Abstract:
The available experimental angular distributions for 6Li elastically scattered from the 16O nucleus in the energy range 13.0-50.0 MeV are reanalyzed using the optical model with a conventional phenomenological potential, and also using the double folding optical model with different interaction models: DDM3Y1, CDM3Y1, CDM3Y2, and CDM3Y3. All of the interaction models are based on the M3Y Paris interaction except DDM3Y1, which is based on M3Y Reid; the main difference between them lies in the parameter values of the incorporated density distribution function F(ρ). We have extracted the renormalization factor NR for the 6Li+16O nuclear system over the energy range 13.0-50.0 MeV using the aforementioned interaction models.
Keywords: elastic scattering, optical model, folding potential, density distribution
Procedia PDF Downloads 141
7735 An Interpretable Data-Driven Approach for the Stratification of the Cardiorespiratory Fitness
Authors: D. Mendes, J. Henriques, P. Carvalho, T. Rocha, S. Paredes, R. Cabiddu, R. Trimer, R. Mendes, A. Borghi-Silva, L. Kaminsky, E. Ashley, R. Arena, J. Myers
Abstract:
The exploration of clinically relevant predictive models continues to be an important pursuit. Cardiorespiratory fitness (CRF) carries vital clinical information, and as such its accurate prediction is of high importance. Therefore, the aim of the current study was to develop a data-driven model, based on computational intelligence techniques and, in particular, clustering approaches, to predict CRF. Two prediction models were implemented and compared: 1) the traditional Wasserman/Hansen equations; and 2) an interpretable clustering approach. Data for this analysis came from the 'FRIEND - Fitness Registry and the Importance of Exercise: The National Data Base'; in the present study a subset of 10,690 apparently healthy individuals was utilized. The accuracy of the models was assessed through the computation of sensitivity, specificity, and geometric mean values. The results show the superiority of the clustering approach in the accurate estimation of CRF (i.e., maximal oxygen consumption).
Keywords: cardiorespiratory fitness, data-driven models, knowledge extraction, machine learning
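The geometric mean criterion mentioned above combines sensitivity and specificity into a single balanced score; a minimal sketch with invented values:

```python
import math

def geometric_mean(sensitivity, specificity):
    """Geometric mean of sensitivity and specificity: a balanced accuracy
    measure that penalizes a model for neglecting either class."""
    return math.sqrt(sensitivity * specificity)

print(round(geometric_mean(0.9, 0.8), 4))  # 0.8485
```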
Procedia PDF Downloads 286
7734 Development of Cross Curricular Competences in University Classrooms: Public Speaking
Authors: M. T. Becerra, F. Martín, P. Gutiérrez, S. Cubo, E. Iglesias, A. A. Sáenz del Castillo, P. Cañamero
Abstract:
The consolidation of the European Higher Education Area (EHEA) in universities has led to significant changes in student training. This paper, part of a Teaching Innovation Project, starts from the new training requirements that fall within the Undergraduate Thesis Project, a subject that culminates student learning. The Undergraduate Thesis Project is the current assessment system that weighs the training students have acquired in university education. In it, students must develop a range of cross curricular competences, such as the public presentation of ideas, problems and solutions, both orally and in writing. Specifically, our innovation proposal intends to provide resources that enable university students of the Teacher Degree in the Education Faculty of the University of Extremadura (Spain) to develop the cross curricular competence of public speaking.
Keywords: interaction, public speaking, student, university
Procedia PDF Downloads 439
7733 Fiscal Stability Indicators and Public Debt Trajectory in Croatia
Authors: Hrvoje Simovic
Abstract:
This paper analyses the key problems of fiscal sustainability in Croatia. To point out its key challenges, public debt sustainability is analyzed using standard indicators of fiscal stability, accompanied by the identification of regime changes in the public debt trajectory using a switching regression approach. The analysis is conducted for the period from 2001 to 2016. Results show huge vulnerability in the recession period (2009-14), so the key challenges for current fiscal policy and public debt management are recognized as maturity prolongation, interest rate trends, and credit rating expectations.
Keywords: fiscal sustainability, public debt, Croatia, budget deficit
Procedia PDF Downloads 260
7732 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach
Authors: Elias K. Maragos, Petros E. Maravelakis
Abstract:
In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this setting, as most researchers accept, there are outputs produced by a DMU in one period to be used as inputs in a future period; these outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of input, output or intermediate data, assuming that the distribution of their virtual values does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs
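The piecewise linear idea can be illustrated outside the DEA formulation: a virtual input/output value is interpolated linearly within each segment rather than by a single global line. A minimal sketch, with segment breakpoints and values invented for illustration:

```python
def piecewise_linear(breakpoints, values, x):
    """Evaluate a piecewise linear function defined by (breakpoint, value) pairs."""
    for (x0, v0), (x1, v1) in zip(zip(breakpoints, values),
                                  zip(breakpoints[1:], values[1:])):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)  # relative position within the segment
            return v0 + t * (v1 - v0)
    raise ValueError("x lies outside the breakpoint range")

# A convex virtual-value curve approximated by two linear pieces:
# a single global line through (0, 0) and (2, 40) would give 30 at x = 1.5.
print(piecewise_linear([0.0, 1.0, 2.0], [0.0, 10.0, 40.0], 1.5))  # 25.0
```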
Procedia PDF Downloads 161
7731 Models of Copyrights System
Authors: A. G. Matveev
Abstract:
The copyright system is a combination of different elements whose number, content and correlation differ across legal orders. Models of the copyright system describe it in terms of the interaction of economic rights and the author's moral rights; monistic and dualistic models are the most popular ones. The article deals with different points of view on monism and dualism in the copyright system. A specific model of copyright in twentieth-century Switzerland is analyzed, and the evolution of the French dualistic model of copyright is shown. The author believes that one should talk not about a single form, but rather about a number of forms of dualism in the copyright system.
Keywords: copyright, exclusive copyright, economic rights, author's moral rights, rights of personality, monistic model, dualistic model
Procedia PDF Downloads 420
7730 Semantic Textual Similarity on Contracts: Exploring Multiple Negative Ranking Losses for Sentence Transformers
Authors: Yogendra Sisodia
Abstract:
Researchers are becoming more interested in extracting useful information from legal documents thanks to the development of large-scale language models in natural language processing (NLP), and deep learning has accelerated the creation of powerful text mining models. Legal fields like contracts benefit greatly from semantic text search, since it makes it quick and easy to find related clauses. After collecting sentence embeddings, it is relatively simple to locate sentences with a comparable meaning throughout the entire legal corpus. The author of this research investigated two pre-trained language models for this task, MiniLM and RoBERTa, and further fine-tuned them on legal contracts. The author used Multiple Negative Ranking Loss for the creation of sentence transformers. The fine-tuned language models and sentence transformers showed promising results.
Keywords: legal contracts, multiple negative ranking loss, natural language inference, sentence transformers, semantic textual similarity
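Once sentence embeddings exist, semantic clause search reduces to nearest-neighbour lookup under cosine similarity. The toy two-dimensional vectors below stand in for real transformer embeddings, which typically have hundreds of dimensions:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def most_similar(query_vec, clause_vecs):
    """Index of the stored clause embedding closest to the query embedding."""
    return max(range(len(clause_vecs)),
               key=lambda i: cosine(query_vec, clause_vecs[i]))

# The query is nearly parallel to clause 1, so that index is returned.
clauses = [[0.0, 1.0], [0.9, 0.1], [0.7, 0.7]]
print(most_similar([1.0, 0.0], clauses))  # 1
```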
Procedia PDF Downloads 107
7729 Pilot Induced Oscillations Adaptive Suppression in Fly-By-Wire Systems
Authors: Herlandson C. Moura, Jorge H. Bidinotto, Eduardo M. Belo
Abstract:
The present work proposes the development of an adaptive control system that enables the suppression of Pilot Induced Oscillations (PIO) in Digital Fly-By-Wire (DFBW) aircraft. The proposed system consists of a Modified Model Reference Adaptive Control (M-MRAC) integrated with the gain scheduling technique. PIO oscillations are detected using a Real Time Oscillation Verifier (ROVER) algorithm, which then enables the system to switch between two reference models: one for the PIO condition, with low proneness to the phenomenon, and another for the normal condition, with high (or medium) proneness. The reference models are defined in a closed-loop condition using the Linear Quadratic Regulator (LQR) control methodology for Multiple-Input-Multiple-Output (MIMO) systems. The implemented algorithms are simulated in software with state space models and commercial flight simulators as the controlled elements, together with pilot dynamics models. A sequence of pitch angles is used as the reference signal, named the Synthetic Task (Syntask), which must be tracked by the pilot models. The initial outcomes show that the proposed system can detect and suppress (or mitigate) PIO oscillations in real time before they reach high amplitudes.
Keywords: adaptive control, digital Fly-By-Wire, oscillations suppression, PIO
Procedia PDF Downloads 134
7728 The Use of AI to Measure Gross National Happiness
Authors: Riona Dighe
Abstract:
This research attempts to identify an alternative approach to the measurement of Gross National Happiness (GNH). It uses artificial intelligence (AI), incorporating natural language processing (NLP) and sentiment analysis, to measure GNH. We use off-the-shelf NLP models for sentence-level sentiment analysis as a building block for this research, and constructed an algorithm using these models to derive a sentiment analysis score for sentences. This was then tested against a sample of 20 respondents, and the scores generated resembled human responses. Using the MLP classifier, decision tree, linear model, and k-nearest neighbours, we obtained test accuracies of 89.97%, 54.63%, 52.13%, and 47.9%, respectively. This gave us the confidence to apply the NLP models to sentences from websites to measure the GNH of a country.
Keywords: artificial intelligence, NLP, sentiment analysis, gross national happiness
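The abstract does not specify how per-sentence sentiment scores are aggregated into a national index; one plausible sketch maps the mean sentence polarity onto a 0-100 scale. Both the [-1, 1] polarity range and the linear 0-100 scaling below are assumptions for illustration, not the paper's formula.

```python
def gnh_index(sentence_scores):
    """Map mean sentence sentiment polarity in [-1, 1] onto a 0-100 index.
    The linear scaling is an illustrative assumption, not the study's method."""
    mean = sum(sentence_scores) / len(sentence_scores)
    return 50.0 * (mean + 1.0)  # -1 -> 0, 0 -> 50, +1 -> 100

# Mildly positive sentences yield an index just above the neutral midpoint of 50.
print(round(gnh_index([0.6, 0.2, -0.1, 0.5]), 2))  # 65.0
```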
Procedia PDF Downloads 119
7727 Deep Learning for Renewable Power Forecasting: An Approach Using LSTM Neural Networks
Authors: Fazıl Gökgöz, Fahrettin Filiz
Abstract:
Load forecasting has become crucial in recent years and is now a popular research area. Many different power forecasting models have been tried out for this purpose. Electricity load forecasting is necessary for energy policies and for healthy, reliable grid systems. Effective forecasting of renewable energy load helps decision makers minimize the costs of electric utilities and power plants, so forecasting tools are required that can predict how much renewable energy can be utilized. The purpose of this study is to explore the effectiveness of LSTM-based neural networks for estimating renewable energy loads. We present models for predicting renewable energy loads based on deep neural networks, especially Long Short-Term Memory (LSTM) algorithms. Deep learning allows multiple layers of models to learn representations of data, and LSTM networks are able to store information over long periods of time. Deep learning models have recently been used to forecast renewable energy sources, for example predicting wind and solar power. Historical load and weather information represent the most important input variables for power forecasting models. The dataset contains power consumption measurements gathered between January 2016 and December 2017 at one-hour resolution, using publicly available data from the Turkish Renewable Energy Resources Support Mechanism. Forecasting studies were carried out on these data via a deep neural network approach, including the LSTM technique, for the Turkish electricity markets. 432 different models were created by varying the layer and cell counts and the dropout rate. The adaptive moment estimation (ADAM) algorithm was used for training as a gradient-based optimizer instead of stochastic gradient descent (SGD); ADAM performed better than SGD in terms of faster convergence and lower error rates. Model performance is compared according to MAE (mean absolute error) and MSE (mean squared error).
The best MAE results out of the 432 tested models are 0.66, 0.74, 0.85 and 1.09. The forecasting performance of the proposed LSTM models is successful compared to results reported in the literature.
Keywords: deep learning, long short term memory, energy, renewable energy load forecasting
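Ranking 432 candidate configurations by MAE (with MSE as a natural tie-breaker) is a one-liner; the configuration names and error values below are invented for illustration:

```python
def best_model(results):
    """Pick the configuration with the lowest MAE; ties broken by MSE.
    `results` maps a model name to an (MAE, MSE) tuple, which Python's
    tuple ordering compares element by element."""
    return min(results, key=lambda name: results[name])

candidates = {
    "lstm_2x64_d0.2": (0.74, 1.10),
    "lstm_3x128_d0.5": (0.66, 0.95),
    "lstm_1x32_d0.0": (0.85, 1.40),
}
print(best_model(candidates))  # lstm_3x128_d0.5
```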
Procedia PDF Downloads 266
7726 Technological Innovation and Efficiency of Production of the Greek Aquaculture Industry
Authors: C. Nathanailides, S. Anastasiou, A. Dimitroglou, P. Logothetis, G. Kanlis
Abstract:
In the present work we reviewed historical data of the Greek marine aquaculture industry, including the adoption of new methods and technological innovation. The results indicate that the industry exhibited a rapid rise in production efficiency, employment and adoption of new technologies, which reduced disease outbreaks, production risk and the price of farmed fish. The improvements from total quality practices and technological input in the Greek aquaculture industry include better survival, growth and body shape of farmed fish, resulting from the development of new aquaculture feeds and the genetic selection of broodstock. Improvements in the quality of the final product were also achieved via technological input in the methods and technology applied during harvesting, packaging, and transportation-preservation of farmed fish, ensuring high quality of the product from the fish farm to the plate of the consumer. These parameters (health management, nutrition, genetics, harvesting and post-harvesting methods and technology) changed significantly over the last twenty years, and the results of these improvements are reflected in the production efficiency of the aquaculture industry and the quality of the final product. It is concluded that the Greek aquaculture industry exhibited rapid growth and adoption of technologies, and that supply stabilized after the global financial crisis; nevertheless, the development of the industry is currently limited by international trade sanctions, the credit crunch, and increased taxation, not by limited technology or resources.
Keywords: innovation, aquaculture, total quality, management
Procedia PDF Downloads 372
7725 Predict Suspended Sediment Concentration Using Artificial Neural Networks Technique: Case Study Oued El Abiod Watershed, Algeria
Authors: Adel Bougamouza, Boualam Remini, Abd El Hadi Ammari, Feteh Sakhraoui
Abstract:
The assessment of the sediment load carried by a river is important for the planning and design of various water resources projects. In this study, artificial neural network techniques are used to estimate the daily suspended sediment concentration for the corresponding daily discharge flow upstream of the Foum El Gherza dam, Biskra, Algeria. FFNN, GRNN, and RBNN models are established for estimating current suspended sediment values. Statistics including RMSE and R² were used to evaluate the performance of the applied models. The comparison of the three AI models showed that the RBNN model performed better than the FFNN and GRNN models, with R² = 0.967 and RMSE = 5.313 mg/l. Therefore, the ANN approach has the capability to model the nonlinear relationship between discharge flow and suspended sediment with reasonable precision. Keywords: artificial neural network, Oued Abiod watershed, feedforward network, generalized regression network, radial basis network, sediment concentration
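The RMSE and R² statistics used above to rank the FFNN, GRNN, and RBNN models can be computed directly from measured and modelled series. A self-contained sketch with invented concentrations, not the Oued El Abiod data:

```python
import math

def rmse(obs, pred):
    """Root mean squared error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r2(obs, pred):
    """Coefficient of determination: 1 - SSE/SST."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

# Hypothetical suspended sediment concentrations (mg/l): measured vs. modelled
measured = [12.0, 45.0, 30.0, 80.0, 55.0]
modelled = [14.0, 42.0, 33.0, 78.0, 57.0]
print(round(rmse(measured, modelled), 3), round(r2(measured, modelled), 3))  # 2.449 0.989
```

A model with R² near 1 and RMSE small relative to the data range, as reported for the RBNN here, tracks the observations closely.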
Procedia PDF Downloads 418
7724 Kinetic Façade Design Using 3D Scanning to Convert Physical Models into Digital Models
Authors: Do-Jin Jang, Sung-Ah Kim
Abstract:
In designing a kinetic façade, it is hard for the designer to create digital models because of the façade's complex geometry in motion. This paper presents a methodology for converting a point cloud of a physical model into a single digital model with a certain topology and motion. The method uses a Microsoft Kinect sensor, and color markers were defined and applied to three paper-folding-inspired designs. Although the resulting digital model cannot represent the whole folding range of the physical model, the method supports the designer in conducting a performance-oriented design process with the rough physical model in the reduced folding range. Keywords: design media, kinetic facades, tangible user interface, 3D scanning
Procedia PDF Downloads 413
7723 Animal Models of Surgical or Other External Causes of Trauma Wound Infection
Authors: Ojoniyi Oluwafeyekikunmi Okiki
Abstract:
Notwithstanding advances in traumatic wound care and management, infections remain a leading cause of mortality, morbidity, and financial burden in tens of millions of wound patients around the world. Animal models have become popular tools for studying a wide variety of external traumatic wound infections and for testing new antimicrobial strategies. This review covers experimental infections in animal models of surgical wounds, skin abrasions, burns, lacerations, excisional wounds, and open fractures. The animal models of external traumatic wound infection reported by different investigators vary in the animal species used, bacterial strains, the number of microorganisms applied, the size of the wounds, and, for burn infections, the length of time the heated object or liquid is in contact with the skin. As antibiotic resistance continues to grow, new antimicrobial approaches are urgently needed. These should be tested using standardized protocols for infections in external traumatic wounds in animal models. Keywords: surgical wounds, animals, wound infections, burns, wound models, colony-forming units, lacerated wounds
Procedia PDF Downloads 8
7722 A Framework for Auditing Multilevel Models Using Explainability Methods
Authors: Debarati Bhaumik, Diptish Dey
Abstract:
Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for the technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions and statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic-light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also help businesses deploying multilevel models to be future-proof and aligned with the European Commission’s proposed Regulation on Artificial Intelligence. Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics
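A PoCE-style KPI can be illustrated as the fraction of features for which an explainer reproduces the model's intrinsic importance ordering. The sketch below uses made-up importance vectors and simple rank agreement; it is a stand-in for the paper's definition, not its exact formula:

```python
def rank(values):
    """Rank features by descending absolute importance (0 = most important)."""
    order = sorted(range(len(values)), key=lambda i: -abs(values[i]))
    ranks = [0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r
    return ranks

def poce(reference, explained):
    """Percentage of features whose rank matches the reference ordering.

    Illustrative stand-in for the PoCE KPI described in the abstract.
    """
    ref_ranks, exp_ranks = rank(reference), rank(explained)
    matches = sum(r == e for r, e in zip(ref_ranks, exp_ranks))
    return 100.0 * matches / len(reference)

# Hypothetical intrinsic model importances vs. an explainer's attributions
intrinsic = [0.9, -0.4, 0.2, 0.05]
explainer = [0.8, 0.1, -0.35, 0.02]
print(poce(intrinsic, explainer))  # 50.0
```

Here the explainer swaps the second and third features, so only half the ranks agree; the abstract's finding is that SHAP and LIME often score poorly on exactly this kind of check.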
Procedia PDF Downloads 94
7721 IT and Security Experts' Innovation and Investment Front for IT-Entrepreneurship in Pakistan
Authors: Ahmed Mateen, Zhu Qingsheng, Muhammad Awais, Muhammad Yahya Saeed
Abstract:
This paper targets the rising factor of entrepreneurial innovation, which Pakistan lacks compared to other countries and regions such as China, India, and Malaysia. This is an exploratory and explanatory study. Major aspects have been identified as directions for policymakers, while highlighting the issues in true spirit. IT needs to be considered not only as a technology but also as a growing community in itself. IT management processes are complex and broad, so they generally require extensive attention to the collective aspects of human variables, capital, and technology. In addition, projects tend to have a special set of critical success factors which, if addressed and given attention, will improve the chances of successful implementation. This is only possible with state-of-the-art intelligent decision support systems and by involving IT staff to some extent in decision processes. This paper explores this issue carefully and discusses six issues, observing implemented strengths and possible enhancements. Keywords: security and defense forces, IT-incentives, big IT-players, IT-entrepreneurial-culture
Procedia PDF Downloads 220
7720 Probabilistic Models to Evaluate Seismic Liquefaction in Gravelly Soil Using Dynamic Penetration Test and Shear Wave Velocity
Authors: Nima Pirhadi, Shao Yong Bo, Xusheng Wan, Jianguo Lu, Jilei Hu
Abstract:
Although gravels and gravelly soils are often assumed to be non-liquefiable because of their high conductivity and small modulus, the occurrence of this phenomenon in some historical earthquakes, especially the recent 2008 Wenchuan (Mw = 7.9), 2014 Cephalonia, Greece (Mw = 6.1), and 2016 Kaikoura, New Zealand (Mw = 7.8) events, has prompted essential consideration of risk assessment and hazard analysis of seismic gravelly soil liquefaction. Due to the limitations in sampling and laboratory testing of this type of soil, in situ tests and site exploration of case histories are the most accepted procedures. Of all in situ tests, the dynamic penetration test (DPT), well known as the Chinese dynamic penetration test, and the shear wave velocity (Vs) test have demonstrated high performance in evaluating seismic gravelly soil liquefaction. However, the lack of a sufficient number of case histories is an essential limitation on developing new models. This study first investigates recent earthquakes that caused liquefaction in gravelly soils in order to collect new data. It then adds these data to the dataset available in the literature to extend it, and finally develops new models to assess seismic gravelly soil liquefaction. To validate the presented models, their results are compared to those of other available models. The results show the reasonable performance of the proposed models and the critical effect of gravel content (GC%) on the assessment. Keywords: liquefaction, gravel, dynamic penetration test, shear wave velocity
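Probabilistic liquefaction models of this kind typically take a logistic form mapping in situ measurements to a probability of liquefaction. The sketch below shows only the generic form; the coefficients and feature values are placeholders, not the fitted parameters of the abstract's models:

```python
import math

def liquefaction_probability(theta, features):
    """Generic logistic form: P = 1 / (1 + exp(-(theta0 + sum(theta_i * x_i)))).

    theta and features below are purely illustrative. The paper's models use
    DPT blow counts, shear wave velocity, and gravel content (GC%) with
    coefficients estimated from earthquake case histories.
    """
    z = theta[0] + sum(t * x for t, x in zip(theta[1:], features))
    return 1.0 / (1.0 + math.exp(-z))

theta = [0.5, -0.04, 0.03]   # hypothetical intercept and weights
features = [120.0, 35.0]     # e.g. a velocity-related term and GC%
print(round(liquefaction_probability(theta, features), 3))  # 0.037
```

The logistic link keeps the output in (0, 1), which is what lets such models be read as liquefaction probabilities rather than binary flags.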
Procedia PDF Downloads 201
7719 Predictive Models for Compressive Strength of High Performance Fly Ash Cement Concrete for Pavements
Authors: S. M. Gupta, Vanita Aggarwal, Som Nath Sachdeva
Abstract:
The work reported in this paper is an experimental study conducted on High Performance Concrete (HPC) with superplasticizer, with the aim of developing models suitable for predicting the compressive strength of HPC mixes. In this study, the effect of varying proportions of fly ash (0% to 50%, at 10% increments) on the compressive strength of high performance concrete has been evaluated. The mix designs studied were M30, M40, and M50, to compare the effect of fly ash addition on the properties of these concrete mixes. In all, eighteen concrete mixes have been designed: three as conventional concretes for the three grades under discussion and fifteen as HPC with varying percentages of fly ash. The concrete mix design has been done in accordance with the Indian standard recommended guidelines, i.e., IS: 10262. All the concrete mixes have been studied in terms of compressive strength at 7 days, 28 days, 90 days, and 365 days. All the materials used have been kept the same throughout the study to allow a direct comparison of results. The models for compressive strength prediction have been developed using Linear Regression (LR), Artificial Neural Network (ANN), and Leave-One-Out Validation (LOOV) methods. Keywords: high performance concrete, fly ash, concrete mixes, compressive strength, strength prediction models, linear regression, ANN
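Leave-one-out validation, one of the methods named above, refits the model on all but one observation and predicts the held-out point, cycling through the dataset. A minimal sketch with a univariate linear regression and invented fly ash/strength pairs (not the study's measurements):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def loo_errors(xs, ys):
    """Leave-one-out validation: refit on n-1 points, predict the held-out one."""
    errs = []
    for i in range(len(xs)):
        tx, ty = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        a, b = fit_line(tx, ty)
        errs.append(abs(ys[i] - (a + b * xs[i])))
    return errs

# Hypothetical fly ash replacement (%) vs. 28-day strength (MPa) for one grade
fly_ash = [0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
strength = [42.0, 43.5, 44.0, 43.0, 40.5, 37.0]
print([round(e, 2) for e in loo_errors(fly_ash, strength)])
```

Each held-out error estimates out-of-sample accuracy without needing a separate test set, which is why LOOV suits small experimental datasets like eighteen mixes.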
Procedia PDF Downloads 444
7718 Continuous Improvement Model for Creative Industries Development
Authors: Rolandas Strazdas, Jurate Cerneviciute
Abstract:
Creative industries are defined as those industries which produce tangible or intangible artistic and creative output and have a potential for income generation by exploiting cultural assets and producing knowledge-based goods and services (both traditional and contemporary). With the emergence of an entire sector of creative industries triggered by the development of creative products, managing creativity-based business processes becomes a critical issue. Diverse managerial practices and models on effective management of creativity have been examined in scholarly literature. Even though these studies suggest how creativity in organisations can be nourished, they do not sufficiently relate the proposed practices to the underlying business processes. The article analyses a range of business process improvement methods such as PDCA, DMAIC, DMADV, and TOC. The strengths and weaknesses of these methods aimed to improve the innovation development process are identified. Based on the analysis of the existing improvement methods, a continuous improvement model was developed and presented in the article. Keywords: continuous improvement, creative industries, improvement model, process mapping
Procedia PDF Downloads 468
7717 Evaluating the Suitability and Performance of Dynamic Modulus Predictive Models for North Dakota’s Asphalt Mixtures
Authors: Duncan Oteki, Andebut Yeneneh, Daba Gedafa, Nabil Suleiman
Abstract:
Most agencies lack the equipment required to measure the dynamic modulus (|E*|) of asphalt mixtures, necessitating the use of predictive models. This study compared measured |E*| values for nine North Dakota asphalt mixes with predictions from the original Witczak, modified Witczak, and Hirsch models. The influence of temperature on the |E*| models was investigated, and Pavement ME simulations were conducted using measured |E*| and predictions from the most accurate |E*| model. The results revealed that the original Witczak model yielded the lowest Se/Sy and highest R² values, indicating the lowest bias and highest accuracy, while the poorest overall performance was exhibited by the Hirsch model. Using predicted |E*| as input in Pavement ME generated conservative distress predictions compared to using measured |E*|. The original Witczak model was recommended for predicting |E*| for low-reliability pavements in North Dakota. Keywords: asphalt mixture, binder, dynamic modulus, MEPDG, pavement ME, performance, prediction
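Se/Sy, the ratio of the standard error of estimate to the standard deviation of the measured values, is the standard goodness-of-fit statistic for |E*| model comparisons: the lower the ratio, the better the model. A sketch with made-up moduli (note that published criteria sometimes use n-1 or n-k denominators; a simple n is used here for brevity):

```python
import math

def se_sy(measured, predicted):
    """Standard error of estimate over standard deviation of measured data.

    Uses plain 1/n denominators for simplicity; MEPDG-style criteria may
    use degree-of-freedom corrections.
    """
    n = len(measured)
    mean_m = sum(measured) / n
    se = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    sy = math.sqrt(sum((m - mean_m) ** 2 for m in measured) / n)
    return se / sy

# Hypothetical measured vs. predicted |E*| values (illustrative units)
measured = [100.0, 500.0, 2000.0, 8000.0]
predicted = [120.0, 480.0, 2100.0, 7600.0]
print(round(se_sy(measured, predicted), 3))  # 0.065
```

An Se/Sy this small indicates the prediction errors are tiny relative to the spread of the measurements, the pattern the study reports for the original Witczak model.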
Procedia PDF Downloads 47
7716 Domain-Specific Ontology-Based Knowledge Extraction Using R-GNN and Large Language Models
Authors: Andrey Khalov
Abstract:
The rapid proliferation of unstructured data in IT infrastructure management demands innovative approaches for extracting actionable knowledge. This paper presents a framework for ontology-based knowledge extraction that combines relational graph neural networks (R-GNN) with large language models (LLMs). The proposed method leverages the DOLCE framework as the foundational ontology, extending it with concepts from ITSMO for domain-specific applications in IT service management and outsourcing. A key component of this research is the use of transformer-based models, such as DeBERTa-v3-large, for automatic entity and relationship extraction from unstructured texts. Furthermore, the paper explores how transfer learning techniques can be applied to fine-tune large language models (LLaMA) to generate synthetic datasets that improve precision in BERT-based entity recognition and ontology alignment. The resulting IT Ontology (ITO) serves as a comprehensive knowledge base that integrates domain-specific insights from ITIL processes, enabling more efficient decision-making. Experimental results demonstrate significant improvements in knowledge extraction and relationship mapping, offering a cutting-edge solution for enhancing cognitive computing in IT service environments. Keywords: ontology mapping, R-GNN, knowledge extraction, large language models, NER, knowledge graph
Procedia PDF Downloads 16