Search results for: classification framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7022

5222 A Theoretical Framework for Conceptualizing Integration of Environmental Sustainability into Supplier Selection

Authors: Tonny Ograh, Joshua Ayarkwa, Dickson Osei-Asibey, Alex Acheampong, Peter Amoah

Abstract:

Theories are used to improve the conceptualization of research ideas and provide valuable explanations that help us grasp the meaning of research findings. Nevertheless, the use of theories to support studies of green supplier selection in procurement decisions has attracted little attention. With the emergence of sustainable procurement, public procurement practitioners in Ghana have yet to acquire relevant knowledge on green supplier selection, owing to insufficient awareness and a lack of appropriate frameworks. The seriousness of the consequences of public procurers' failure to integrate environmental considerations into supplier selection explains the adoption of a multi-theory approach to understanding the dynamics of integrating environmental sustainability into supplier selection. In this paper, the practicality of three theories for improving the understanding of the influential factors enhancing the integration of environmental sustainability into supplier selection was reviewed. The three theories are Resource-Based Theory, Human Capital Theory and Absorptive Capacity Theory. This review uncovered knowledge management, top management commitment, and environmental management capabilities as important elements needed for the integration of environmental sustainability into supplier selection in public procurement. The theoretical review yielded a framework that conceptualizes the knowledge and capabilities of practitioners relevant to the incorporation of environmental sustainability into supplier selection in public procurement.

Keywords: environmental, sustainability, supplier selection, environmental procurement, sustainable procurement

Procedia PDF Downloads 173
5221 Cross Attention Fusion for Dual-Stream Speech Emotion Recognition

Authors: Shaode Yu, Jiajian Meng, Bing Zhu, Hang Yu, Qiurui Sun

Abstract:

Speech emotion recognition (SER) aims to recognize human subjective emotions through in-depth analysis of audio data. How to comprehensively extract emotional information from speech audio and how to effectively fuse the extracted features remain challenging. This paper presents a dual-stream SER framework that embraces both full training and transfer learning of different networks for thorough feature encoding. In addition, a plug-and-play cross-attention fusion (CAF) module is implemented for valid integration of the dual-stream encoder outputs. The effectiveness of the proposed CAF module is compared to three other fusion modules (feature summation, feature concatenation, and feature-wise linear modulation) on two databases (RAVDESS and IEMOCAP) using different dual-stream encoders (full-training networks, DPCNN or TextRCNN; transfer-learning networks, HuBERT or Wav2Vec2). Experimental results suggest that the CAF module can effectively reconcile conflicts between features from different encoders and outperforms the other three feature fusion modules on the SER task. In the future, the plug-and-play CAF module can be extended for multi-branch feature fusion, and the dual-stream SER framework can be widened to multi-stream data representation to improve recognition performance and generalization capacity.
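
A minimal sketch of what a plug-and-play cross-attention fusion module of this kind might look like is shown below, assuming two encoder streams with different feature dimensions. The class name, dimensions, and pooling choices are illustrative assumptions rather than the authors' implementation.

```python
# Sketch of a bidirectional cross-attention fusion (CAF) block for two encoder streams.
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    def __init__(self, dim_a: int, dim_b: int, dim_fused: int = 256, n_heads: int = 4):
        super().__init__()
        # Project both streams into a shared space before attending.
        self.proj_a = nn.Linear(dim_a, dim_fused)
        self.proj_b = nn.Linear(dim_b, dim_fused)
        # Each stream queries the other (bidirectional cross-attention).
        self.attn_ab = nn.MultiheadAttention(dim_fused, n_heads, batch_first=True)
        self.attn_ba = nn.MultiheadAttention(dim_fused, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim_fused)

    def forward(self, feats_a: torch.Tensor, feats_b: torch.Tensor) -> torch.Tensor:
        # feats_a: (batch, seq_a, dim_a); feats_b: (batch, seq_b, dim_b)
        a = self.proj_a(feats_a)
        b = self.proj_b(feats_b)
        a_att, _ = self.attn_ab(query=a, key=b, value=b)   # stream A attends to B
        b_att, _ = self.attn_ba(query=b, key=a, value=a)   # stream B attends to A
        # Pool over time and merge the two attended streams.
        return self.norm(a_att.mean(dim=1) + b_att.mean(dim=1))  # (batch, dim_fused)

# Toy usage: fuse a fully trained text-CNN-style stream with a Wav2Vec2-style stream.
caf = CrossAttentionFusion(dim_a=128, dim_b=768)
fused = caf(torch.randn(8, 50, 128), torch.randn(8, 120, 768))
print(fused.shape)  # torch.Size([8, 256])
```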

Keywords: speech emotion recognition, cross-attention fusion, dual-stream, pre-trained

Procedia PDF Downloads 70
5220 Abandoning 'One-Time' Optional Information Literacy Workshops for Year 1 Medical Students and Gearing towards an 'Embedded Librarianship' Approach

Authors: R. L. David, E. C. P. Tan, M. A. Ferenczi

Abstract:

This study aimed to investigate the effect of a 'one-time' optional Information Literacy (IL) workshop to enhance Year 1 medical students' literature search, writing, and citation management skills, as directed by a customized five-year IL framework developed for LKC Medicine students. At the end of the IL workshop, students overall rated finding, citing, and using information from sources as 'somewhat difficult'. The study is experimental, using a standardized IL test to examine the cohort effect of a 'one-time' optional IL workshop on Year 1 students (experimental group) in comparison with Year 2 students (control group). Test scores from both groups were compared and analyzed using mean scores and one-way analysis of variance (ANOVA). Unexpectedly, there were no statistically significant differences between group means as determined by one-way ANOVA (F₁,₁₉₃ = 3.37, p = 0.068, ηp² = 0.017). Challenges and shortfalls posed by 'one-time' interventions prompted a rich discussion on adopting an 'embedded librarianship' approach, which shifts the medical librarians' role into the curriculum and uses Team-Based Learning to teach IL skills to medical students. The customized five-year IL framework developed for LKC Medicine students becomes a useful librarian-faculty model for embedding and bringing IL into the classroom.
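
For readers unfamiliar with the between-group comparison described above, a minimal sketch of a one-way ANOVA on standardized test scores is shown below; the score arrays are random placeholders, not the study's data.

```python
# Sketch of a two-group one-way ANOVA on standardized IL test scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
year1_scores = rng.normal(62, 10, size=100)   # experimental group (attended workshop)
year2_scores = rng.normal(65, 10, size=95)    # control group (no workshop)

f_stat, p_value = stats.f_oneway(year1_scores, year2_scores)
df_within = len(year1_scores) + len(year2_scores) - 2
print(f"F(1, {df_within}) = {f_stat:.2f}, p = {p_value:.3f}")
```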

Keywords: information literacy, 'one-time' interventions, medical students, standardized tests, embedded librarianship, curriculum, medical librarians

Procedia PDF Downloads 113
5219 Constraints to Partnership Based Financing in Islamic Banks: A Systematic Review of Literature

Authors: Muhammad Nouman, Salim Gul, Karim Ullah

Abstract:

Partnership has been understood as the essence of Islamic banking. However, in practice, the non-partnership paradigm dominates the operations of Islamic banks. Islamic banks adopt partnership contracts for the scheme of deposits, especially for term deposit accounts. However, they do not adopt partnership contracts (i.e., Musharakah and Mudarabah) as the main financing scheme. In practice, non-partnership contracts, including Murabahah and Ijara, are widely used for financing. Many authors have provided different explanations for the low utilization of partnership contracts as a scheme of financing. However, a typology of constraints remains missing. The extant literature remains scattered, with diverse studies focused on different dimensions of the issue. Therefore, there is no unified understanding of the constraints on the application of partnership contracts. This paper aims to highlight the major factors hindering the application of partnership contracts and to produce a coherent view by synthesizing the different explanations provided in several studies conducted around the globe. The present study employs insights from the extant literature using a systematic review and provides academia, practitioners, and policy makers with a holistic framework to name and make sense of what is making partnership contracts a less attractive option for Islamic banks. A total of 84 relevant publications, including 11 books, 14 chapters of edited books, 48 journal articles, 8 conference papers and 3 IMF working papers, were selected using a systematic procedure. Analysis of these selected publications followed three steps: i) in the first step, the constraints explicitly appearing in the literature set of 84 publications were extracted; ii) in the second step, 27 factors hindering the application of partnership contracts were identified from the constraints extracted in the first step, with overlapping items either eliminated or combined; iii) in the last step, the factors identified in the second step were classified into three distinct categories. Our intention was to develop a typology of constraints by connecting the rather abstract concepts into broader sets of constraints for better conceptualization and policy implications. Our framework highlights that there are mainly three facets of the lower preference for partnership contracts of financing. First, there are several factors in the contemporary business settings, the prevailing social setting, and the bank's internal environment that underpin uncertainty in the success of partnership contracts of financing. Second, partnership contracts face lower demand, i.e., entrepreneurs prefer to use non-partnership contracts for financing their ventures due to the inherent restraining characteristics of the partnership contracts. Finally, there are certain factors in the regulatory framework that restrain the extensive utilization of partnership contracts of financing by Islamic banks. The present study contributes to the Islamic banking literature in many ways. It provides clarification of the heavily criticized operations of Islamic banks, integrates the scattered literature, and provides a holistic framework for better conceptualization of the key constraints in the application of partnership contracts and their policy implications. Moreover, it demonstrates an application of systematic review in Islamic banking research.

Keywords: Islamic banking, Islamic finance, Mudarabah, Musharakah, partnership, systematic review

Procedia PDF Downloads 272
5218 The Institutional Change Occurring in the Chinese Sport Sector: A Case Study on the Chinese Football Association Reform

Authors: Qi Peng

Abstract:

The Chinese sport sector is currently undergoing a dramatic institutional change. A sport system that was heavily dominated by the government is starting to shift towards one that is driven by the market. During the past sixty years, the Chinese Football Association (CFA), although ostensibly a 'non-governmental organization', has in fact operated under the close supervision and control of the government. The double identity of the CFA has taken most of the blame for the poor performance of the Chinese football team, especially the men's team. In 2015, a policy initiated by the Chinese government introduced a potentially radical change to the institutional structure of the CFA by delegating the power of the government agency, the General Administration of Sport of China, to the organization (CFA) itself. Against this background, an overarching research question was raised: will an organization that has remained institutionalized within the system change in response to an external (policy) jolt? To answer this question, three principal data collection methods were employed: document review, participant observation and semi-structured interviews. Document review provides a mapping of the structural and cultural framework in which the CFA functions during the change process. The author has had the chance to interact closely with the organization as a participant observer for a period of time, long enough to collect the data but not so long as to develop a biased view of the situation. This stage enables the author to gain an in-depth understanding of how the CFA managed to restructure its governance and legitimacy. Conducting semi-structured interviews with staff within the CFA and with staff from selected CFA stakeholders also provides a crucial step to gain insight into the factors for change as well as the implications of the change. Interviewees include CFA members (senior officials and staff), local football association members, senior Chinese Super League football club managers, CFA Super League Co., Ltd. (senior officials and staff), CSL broadcasters, and Chinese Olympic Committee members. The preliminary research data show that the CFA is currently undergoing two levels of change: although the setting of the CFA has been gradually restructured (organizational framework), the organizational values and beliefs remain almost the same as before the reform. This means that the plan of shifting from a governmental agency to an autonomous association is an ongoing process, and that organizational core beliefs and values are more difficult to change than the structural framework. This is due to the inertia of organizational history and the effect of institutionalization. The CFA is examined as a pioneering sport organization in China undertaking this 'decoupling' road. It is believed that many other sport organizations, especially sport governing bodies, will follow the steps of the CFA in the near future. Therefore, the CFA's experience of change is worth studying.

Keywords: Chinese Football Association, Organizational Change, Organizational Culture, Structural Framework

Procedia PDF Downloads 340
5217 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training data-set and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
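
As one illustration of the parameter-space outlier handling described above, the sketch below fits an envelope to the training feature space and flags prediction points that fall outside it. The OneClassSVM choice, the nu value, and the toy data are assumptions for illustration only, not FreqAI's actual implementation.

```python
# Sketch: flag prediction points lying outside the training-data parameter space.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

def fit_parameter_space(train_features: np.ndarray):
    """Fit a scaler and an envelope describing the training parameter space."""
    scaler = StandardScaler().fit(train_features)
    envelope = OneClassSVM(nu=0.05).fit(scaler.transform(train_features))
    return scaler, envelope

def filter_predictions(scaler, envelope, candidate_features: np.ndarray):
    # Points labelled -1 lie outside the training parameter space and should
    # not be trusted for trade entries or exits.
    inlier_mask = envelope.predict(scaler.transform(candidate_features)) == 1
    return candidate_features[inlier_mask], inlier_mask

# Toy usage with random matrices standing in for engineered indicator features.
rng = np.random.default_rng(42)
X_train = rng.normal(size=(500, 20))
X_live = np.vstack([rng.normal(size=(10, 20)), rng.normal(5.0, 1.0, size=(2, 20))])
scaler, envelope = fit_parameter_space(X_train)
kept, mask = filter_predictions(scaler, envelope, X_live)
print(f"kept {mask.sum()} of {len(mask)} candidate points")
```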

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 82
5216 Fuzzy Inference-Assisted Saliency-Aware Convolution Neural Networks for Multi-View Summarization

Authors: Tanveer Hussain, Khan Muhammad, Amin Ullah, Mi Young Lee, Sung Wook Baik

Abstract:

The big data generated by distributed vision sensors installed on a large scale in smart cities creates hurdles for its efficient and beneficial exploration through browsing, retrieval, and indexing. This paper presents a three-fold framework for effective video summarization of such data, providing a compact and representative format of big video data. In the first fold, the framework acquires input video data from the installed cameras and collects clues, such as the type and count of objects and the clarity of the view, from a chunk of a pre-defined number of frames from each view. In the second fold, the decision of representative view selection for a particular interval is based on a fuzzy inference system, yielding a precise and human-resembling decision reinforced by the known clues. In the third fold, the framework forwards the selected view frames to a summary generation mechanism that is supported by a saliency-aware convolutional neural network (CNN) model. The combination of fuzzy rules for view selection followed by a CNN architecture for saliency computation makes the multi-view video summarization (MVS) framework a suitable candidate for real-world practice in smart cities.
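
A hand-rolled sketch of the kind of fuzzy inference step used for representative-view selection is given below, assuming two normalized clues per view (object count and clarity). The membership functions, rules, and camera names are illustrative assumptions, not the paper's rule base.

```python
# Sketch: fuzzy scoring of candidate views from two clues, then picking the best view.
import numpy as np

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function."""
    return float(max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0))

def view_score(object_count: float, clarity: float) -> float:
    # Fuzzify the inputs (both assumed normalized to [0, 1]).
    count_high = tri(object_count, 0.4, 1.0, 1.6)
    count_low = tri(object_count, -0.6, 0.0, 0.6)
    clarity_good = tri(clarity, 0.4, 1.0, 1.6)
    clarity_poor = tri(clarity, -0.6, 0.0, 0.6)
    # Rules: high count AND good clarity -> important; low count OR poor clarity -> unimportant.
    importance = min(count_high, clarity_good)
    unimportance = max(count_low, clarity_poor)
    # Crisp score by weighted-average defuzzification.
    return (importance * 1.0 + unimportance * 0.0) / (importance + unimportance + 1e-9)

# Pick the representative view for one interval from per-view clues.
views = {"cam_1": (0.8, 0.9), "cam_2": (0.3, 0.7), "cam_3": (0.9, 0.4)}
best = max(views, key=lambda v: view_score(*views[v]))
print(best)  # cam_1 under these toy clues
```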

Keywords: big video data analysis, fuzzy logic, multi-view video summarization, saliency detection

Procedia PDF Downloads 185
5215 Intra and International Collaborations as Important Factors of Organisational Innovation of Government Agencies in STI Ecosystem in ASEAN

Authors: Salinthip Thipayang, Achara Chandrachai, Rath Pichyangkura, Sukree Sinthupinyo

Abstract:

Most of the well-known frameworks and tools to measure and compare the organisational innovation of public or government agencies have been designed and used in developed economies such as the EU, the Nordic region, Australia, and South Korea. This project is one of the very first attempts to develop a tool to adequately measure the organisational (administrative) innovation of government agencies in the developing economies of ASEAN. A new measurement framework is proposed, with components covering the intra and international collaborations of these government agencies with the private, public and academic sectors. Questionnaires and in-depth interviews with experts and with the middle to top executives of the participating public agencies in the ASEAN member states were conducted to determine the suitability of, and to develop, the indicators that should be included in the measurement model. The results showed that intra and international collaborations of these government organisations with other agencies in the public, private and academic sectors can lead to new changes and greatly impact the ways in which government agencies in the ASEAN STI ecosystem are operated and administered. Government organisations in less developed countries in ASEAN are ready and willing to learn from their counterparts in more advanced countries and to adjust their internal management to be more innovative and to better handle international collaborative projects and commitments.

Keywords: organisational innovation, administrative innovation, government agencies, public agencies, ASEAN science technology and innovation ecosystem, international collaborations

Procedia PDF Downloads 382
5214 A Corpus-Based Study of Evaluative Language in Leading Articles in British Broadsheet and Tabloid Newspapers

Authors: Fatimah AlSaiari

Abstract:

In recent years, newspapers in the United Kingdom have no longer been just a means of sharing news about what happens in the world; they are also used to influence target readers by making them more up-to-date, well-informed, entertained, exasperated, delighted, and infuriated. To achieve these objectives and maintain influence on public opinion, journalists use a particular kind of language through which they can convey emotions and opinions, organize their discourse, and establish solidarity with their audience. This type of language has been widely analyzed under different labels, such as evaluation, appraisal, and stance. There is a considerable amount of linguistic and non-linguistic research devoted to analyzing this type of interpersonal language in journalistic discourse, and most of these studies were carried out to challenge the traditional assumptions of the objectivity and impartiality of news reporting. However, very little research has been undertaken on evaluative language in newspapers' institutional editorials, and there is hardly any systematic or exhaustive analysis of this type of language in British tabloid and broadsheet newspapers. This study attempts to provide new insights into the nature of authorial and non-authorial evaluation in leading articles in popular and quality British newspapers, along with their targets, sources, and discourse functions. The study also attempts to develop a framework of evaluation that can be applied to evaluative lexical items in newspaper opinion texts. The framework is both theory-driven (i.e., it builds on and modifies previous frameworks of evaluation such as appraisal theory and the parameter-based approach) and data-driven (i.e., it elicits the evaluative categories from the analysis of the corpus, which helps in the development of the current framework). To achieve this aim, a corpus of 140 leading articles was selected. The findings revealed that the tabloids tended to express their stance through explicitness, dramatization, frequent reference to social actors' emotions and beliefs, and exaggeration in negativity, while the broadsheets preferred to express their stance through mitigation, ambiguity, and implicitness. Conceptual themes and propositions were the preferred targets for expressing stance in the broadsheets, while human behavior and character were the preferred targets in the tabloids.

Keywords: appraisal theory, evaluative language, British newspapers, broadsheets & tabloids, evaluative adjectives

Procedia PDF Downloads 288
5213 Framework for Aligning Supply Chain Strategies and Organizational Strategies in an SOE Environment

Authors: R. Setino, I. M. Ambe, J. A Badenhorst-Weiss

Abstract:

The South African government supply chain management system is not adequately implemented in State Owned Enterprises (SOEs). There are weaknesses in the SOEs' SCM enablers, strategies and policies. In addition, top management of SOEs still do not see SCM as strategic enough to deserve their attention, and therefore there is very little support from top management, making it difficult for SCM practitioners to execute their day-to-day functions, let alone deliver the letter and spirit of the relevant legislation. Supply chain strategies lack buy-in from the top, and as a result senior SCM practitioners have not been involved in corporate strategy. This has resulted in supply chain and corporate strategies being misaligned. Due to the service delivery backlog, high levels of corruption and continuous strikes across the country for better services, it is inevitable that government leaders be more strategic about how South Africa can use SCM as a tool to improve service delivery. Consequently, there is a need to close the gap between the strategic level dealt with by top management and the application of operational SCM concepts: the use of SCM concepts, and therefore supply chain strategies, should be aligned with the corporate and business strategies in order to ensure the achievement of top-level business objectives. This paper aims to explore supply chain practices in State Owned Enterprises (SOEs). Based on a conceptual review, the paper provides the status, trends and developments, and suggests a framework for aligning supply chain strategies and organizational strategies in an SOE environment.

Keywords: alignment, strategies, state owned enterprises, supply chain management, South Africa

Procedia PDF Downloads 418
5212 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but happening at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT, Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of a GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real time to perform forecasting and classification tasks, helping asset management engineers operate their machines more efficiently and reduce unplanned downtimes. A series of trials of this model are planned in other manufacturing industries in the future.
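
The entropy and spectral estimates mentioned in the method can be illustrated with the short sketch below, which computes the spectral entropy of sliding windows over a toy flow-sensor signal. The window sizes and the synthetic series are placeholders rather than the plant data used in the study.

```python
# Sketch: windowed spectral-entropy features from a sensor time series.
import numpy as np

def spectral_entropy(window: np.ndarray) -> float:
    """Shannon entropy of the normalized power spectrum of one sensor window."""
    spectrum = np.abs(np.fft.rfft(window - window.mean())) ** 2
    p = spectrum / (spectrum.sum() + 1e-12)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def windowed_features(series: np.ndarray, window: int = 64, step: int = 16) -> np.ndarray:
    feats = []
    for start in range(0, len(series) - window + 1, step):
        w = series[start:start + window]
        feats.append([spectral_entropy(w), w.std(), w.mean()])
    return np.asarray(feats)

# Toy flow-sensor signal: a slow oscillation whose behaviour changes halfway through.
t = np.arange(2048)
flow = np.sin(2 * np.pi * t / 96.0)
flow[1024:] += 0.5 * np.random.default_rng(1).normal(size=1024)  # behavioural change
features = windowed_features(flow)
print(features.shape)  # one (entropy, std, mean) row per window
```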

Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning

Procedia PDF Downloads 142
5211 Automatic Target Recognition in SAR Images Based on Sparse Representation Technique

Authors: Ahmet Karagoz, Irfan Karagoz

Abstract:

Synthetic Aperture Radar (SAR) is a radar mechanism that can be integrated into manned and unmanned aerial vehicles to create high-resolution images in all weather conditions, regardless of day or night. In this study, SAR images of military vehicles with different azimuth and descent angles are pre-processed in the first stage. The main purpose here is to reduce the high speckle noise found in SAR images. For this, the Wiener adaptive filter, the mean filter, and the median filter are used to reduce the amount of speckle noise in the images without causing loss of data. During the image segmentation phase, pixel values are ordered so that the target vehicle region is separated from other regions containing unnecessary information. The target image is thresholded so that the brightest 20% of pixels are set to 255 and the remaining pixels to 0. In addition, a segmentation comparison is performed using appropriate parameters of the statistical region merging algorithm. In the feature extraction step, the feature vectors belonging to the vehicles are obtained using Gabor filters with different orientation, frequency and angle values. A bank of Gabor filters is created by varying the orientation, frequency and angle parameters to extract the important features that form the distinctive parts of the images. Finally, the images are classified by the sparse representation method. In the study, the l₁-norm analysis of sparse representation is used. A joint database of the feature vectors generated from the target images of the military vehicle types is assembled and transformed into matrix form. To classify the vehicles, the test image of each vehicle is converted to vector form and the l₁-norm analysis of the sparse representation method is applied against the existing database matrix. As a result, correct recognition is achieved by matching the test images against the target images of military vehicles by means of the sparse representation method, and a classification accuracy of 97% is obtained on SAR images of different military vehicle types.
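
A minimal sketch of l₁-based sparse representation classification over a database of labelled feature vectors is shown below. The Lasso approximation of the l₁ minimization, the data shapes, and the toy Gabor-like features are illustrative assumptions rather than the exact procedure used in the study.

```python
# Sketch: sparse representation classification (SRC) with class-wise residuals.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import normalize

def src_predict(dictionary: np.ndarray, labels: np.ndarray, test_vec: np.ndarray) -> int:
    """dictionary: (n_features, n_train); labels: (n_train,); test_vec: (n_features,)."""
    D = normalize(dictionary, axis=0)                 # unit-norm columns (atoms)
    x = normalize(test_vec.reshape(1, -1)).ravel()
    # Approximate the l1-minimization step with Lasso regression.
    coef = Lasso(alpha=0.01, max_iter=10000, fit_intercept=False).fit(D, x).coef_
    # Assign the class whose atoms reconstruct the test vector with the lowest residual.
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        residuals[c] = np.linalg.norm(x - D[:, mask] @ coef[mask])
    return min(residuals, key=residuals.get)

# Toy database: 3 vehicle classes, 20 training vectors each, 200-dim Gabor-like features.
rng = np.random.default_rng(7)
centers = rng.normal(size=(3, 200))
D = np.hstack([(centers[c] + 0.1 * rng.normal(size=(20, 200))).T for c in range(3)])
y = np.repeat([0, 1, 2], 20)
test = centers[1] + 0.1 * rng.normal(size=200)
print(src_predict(D, y, test))  # expected: 1
```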

Keywords: automatic target recognition, sparse representation, image classification, SAR images

Procedia PDF Downloads 361
5210 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to uncertainties in the initial reservoir models. Therefore, it is important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components, which have eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models which show the most similar or dissimilar well oil production rates (WOPR) with respect to the true values (10% for each). The other 80% of models are then classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models which have a similar geological trend to the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results; it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by identifying the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties. We propose a novel classification scheme which integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models which have a similar channel trend to the reference in the lowered-dimension space.
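
The selection pipeline described above can be sketched as follows, assuming each reservoir model is flattened into a permeability vector and labelled by its WOPR error against the reference. All data, dimensions, and the RBF kernel choice are illustrative assumptions.

```python
# Sketch of the PCA -> MDS -> SVM selection pipeline on synthetic reservoir models.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(0)
models = rng.lognormal(mean=3.0, sigma=1.0, size=(100, 2500))  # 100 models, 50x50 grids
wopr_error = rng.uniform(0.0, 1.0, size=100)                   # stand-in WOPR mismatch

# 1) PCA: keep the leading components (largest eigenvalues) of the permeability fields.
scores = PCA(n_components=10).fit_transform(np.log(models))

# 2) MDS: project the component scores onto a 2-D plane using Euclidean distances.
plane = MDS(n_components=2, random_state=0).fit_transform(scores)

# 3) SVM: train on the 10 most similar and 10 most dissimilar models, classify the rest.
order = np.argsort(wopr_error)
train_idx = np.concatenate([order[:10], order[-10:]])
y_train = np.array([1] * 10 + [0] * 10)                        # 1 = low WOPR error
clf = SVC(kernel="rbf").fit(plane[train_idx], y_train)

rest = np.setdiff1d(np.arange(100), train_idx)
selected = rest[clf.predict(plane[rest]) == 1]
print(f"{len(selected)} models kept for the regeneration ensemble")
```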

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 157
5209 Exploring Methods and Strategies for Sustainable Urban Development

Authors: Klio Monokrousou, Maria Giannopoulou

Abstract:

Urban areas, as they have developed and operate today, concentrate a significant number of people and a large number of activities that generate the desire and need to travel. The territorial expansion of cities, as well as the need to preserve the importance of central city areas, leads to a continuous increase in transportation needs, which, in the limited urban space, creates serious traffic and operational problems. The modern perception of urban planning is directed towards more holistic approaches and integrated policies that make the city economically competitive, socially just and more environmentally friendly. Over the last 25 years, the goal of sustainable transport development has been central to the agenda of any plan or policy for the city. Modern planning of urban space takes into account the economic and social aspects of the city and the importance of the environment to sustainable urban development. In this context, the European Union promotes direct or indirect related interventions under its cohesion and environmental policies, and many countries have had the chance to test them in practice. This paper is part of wider research still in progress; it explores the methods and processes that have been developed in this direction and provides a review and systematic presentation of this work. The ultimate purpose of this research is to use this review to create a decision-making methodological framework which can form the basis of a useful operational tool for sustainable urban planning.

Keywords: methods, sustainable urban development, urban mobility, methodological framework

Procedia PDF Downloads 434
5208 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a technique for real-time visualization and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, executed entirely on the GPU and implemented using programmable shaders.
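
The sketch below illustrates, on the CPU with numpy, the two filtering criteria described above: layer filtering by the classification attribute and distance-based level-of-detail thinning. The paper's implementation executes these steps in programmable shaders on the GPU; the ASPRS class codes and the random thinning rule here are assumptions for illustration only.

```python
# CPU sketch of classification-layer filtering plus distance-based LOD thinning.
import numpy as np

def filter_points(xyz: np.ndarray, classes: np.ndarray, visible_layers: set,
                  observer: np.ndarray, lod_base: float = 50.0) -> np.ndarray:
    layer_mask = np.isin(classes, list(visible_layers))
    dist = np.linalg.norm(xyz - observer, axis=1)
    # Keep every point up to lod_base metres, then keep a decreasing fraction with distance.
    keep_prob = np.clip(lod_base / np.maximum(dist, lod_base), 0.0, 1.0)
    lod_mask = np.random.default_rng(0).random(len(xyz)) < keep_prob
    return xyz[layer_mask & lod_mask]

points = np.random.default_rng(1).uniform(0, 1000, size=(100000, 3))
labels = np.random.default_rng(2).choice([2, 6], size=100000)   # ASPRS: 2 = ground, 6 = building
visible = filter_points(points, labels, visible_layers={6},
                        observer=np.array([0.0, 0.0, 0.0]))
print(visible.shape)
```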

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 302
5207 Re-Imagining and De-Constructing the Global Security Architecture

Authors: Smita Singh

Abstract:

The paper develops a critical framework for analysing the hegemonic discourses resorted to by the dominant powers in the global security architecture. Within this framework, security is viewed as a discourse through which identities and threats are represented and produced to legitimize the security concerns of a few at the cost of others. International security has long been driven and dominated by power relations. Since the end of the Cold War, global transformations have triggered contestations of the idea of security at both the theoretical and practical level. This widening and deepening of the concept of security has challenged the existing power hierarchies at the theoretical level but has not altered the substance and actors defining it. When discourses are introduced into security studies, several critical questions erupt: how has power shaped the security policies of the globe through language? How does one understand the meanings and impact of those discourses? Who decides the agenda, rules, players and outliers of security? Language, as a symbolic system and form of power, is fluid and not fixed. Over the years, the dominant Western powers, led by the United States of America, have employed various discursive practices such as humanitarian intervention, the responsibility to protect, non-proliferation, human rights, the war on terror and so on to reorient the constitution of identities and interests and hence the policies that need to be adopted for their actualization. These power relations are illustrated in this paper through the narratives used in the non-proliferation regime. The hierarchical security dynamics is a manifestation of global power relations driven by many factors, including discourses.

Keywords: hegemonic discourse, global security, non-proliferation regime, power politics

Procedia PDF Downloads 316
5206 Comparison of Quality Indices for Sediment Assessment in Ireland

Authors: Tayyaba Bibi, Jenny Ronan, Robert Hernan, Kathleen O’Rourke, Brendan McHugh, Evin McGovern, Michelle Giltrap, Gordon Chambers, James Wilson

Abstract:

Sediment contamination is a major source of ecosystem stress and has received significant attention from the scientific community. Both the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) require a robust set of tools for biological and chemical monitoring. For the MSFD in particular, causal links between contaminants and effects need to be assessed. Appropriate assessment tools are required in order to make an accurate evaluation. In this study, a range of recommended sediment bioassays and chemical measurements are assessed at a number of potentially impacted and minimally impacted locations around Ireland. Previously, assessment indices have been developed on individual compartments, i.e., contaminant levels or biomarker/bioassay responses. A number of assessment indices are applied to chemical and ecotoxicological data from the Seachange project (Project code) and compared, including the metal pollution index (MPI), the pollution load index (PLI) and the Chapman index for chemistry, as well as the integrated biomarker response (IBR). The benefits and drawbacks of the use of indices and aggregation techniques are discussed. In addition, modelling of raw data is investigated to analyse links between contaminants and effects.
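
As an illustration of one of the indices compared, the sketch below computes a pollution load index, assuming it is taken as the geometric mean of contamination factors (site concentration divided by a background value) across the measured metals. The concentrations, background values, and this exact formulation are assumptions, not the survey data or the specific index definitions used in the study.

```python
# Sketch: pollution load index as the geometric mean of contamination factors.
import numpy as np

def pollution_load_index(concentrations: dict, background: dict) -> float:
    contamination_factors = np.array(
        [concentrations[m] / background[m] for m in concentrations]
    )
    return float(contamination_factors.prod() ** (1.0 / len(contamination_factors)))

site = {"Cd": 0.8, "Cu": 45.0, "Pb": 60.0, "Zn": 180.0}       # mg/kg dry weight (toy values)
baseline = {"Cd": 0.3, "Cu": 25.0, "Pb": 20.0, "Zn": 90.0}    # assumed background levels
print(round(pollution_load_index(site, baseline), 2))  # a value > 1 suggests a polluted site
```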

Keywords: bioassays, contamination indices, ecotoxicity, marine environment, sediments

Procedia PDF Downloads 224
5205 Sustainable Business Model Archetypes – A Systematic Review and Application to the Plastic Industry

Authors: Felix Schumann, Giorgia Carratta, Tobias Dauth, Liv Jaeckel

Abstract:

In the last few decades, the rapid growth of the use and disposal of plastic items has led to their overaccumulation in the environment. As a result, plastic pollution has become a subject of global concern. Today plastics are used as raw materials in almost every industry. While the recognition of the ecological, social, and economic impact of plastics in academic research is on the rise, the potential role of the ‘plastic industry’ in dealing with such issues is still largely underestimated. Therefore, the literature on sustainable plastic management is still nascent and fragmented. Working towards sustainability requires a fundamental shift in the way companies employ plastics in their day-to-day business. For that reason, the applicability of the business model concept has recently gained momentum in environmental research. Business model innovation is increasingly recognized as an important driver to re-conceptualize the purpose of the firm and to readily integrate sustainability in their business. It can serve as a starting point to investigate whether and how sustainability can be realized under industry- and firm-specific circumstances. Yet, there is no comprehensive view in the plastic industry on how firms start refining their business models to embed sustainability in their operations. Our study addresses this gap, looking primarily at the industrial sectors responsible for the production of the largest amount of plastic waste today: plastic packaging, consumer goods, construction, textile, and transport. Relying on the archetypes of sustainable business models and applying them to the aforementioned sectors, we try to identify companies’ current strategies to make their business models more sustainable. Based on the thematic clustering, we can develop an integrative framework for the plastic industry. The findings are underpinned and illustrated by a variety of relevant plastic management solutions that the authors have identified through a systematic literature review and analysis of existing, empirically grounded research in this field. Using the archetypes, we can promote options for business model innovations for the most important sectors in which plastics are used. Moreover, by linking the proposed business model archetypes to the plastic industry, our research approach guides firms in exploring sustainable business opportunities. Likewise, researchers and policymakers can utilize our classification to identify best practices. The authors believe that the study advances the current knowledge on sustainable plastic management through its broad empirical industry analyses. Hence, the application of business model archetypes in the plastic industry will be useful for shaping companies’ transformation to create and deliver more sustainability and provides avenues for future research endeavors.

Keywords: business models, environmental economics, plastic management, plastic pollution, sustainability

Procedia PDF Downloads 95
5204 Environmental Benefits of Corn Cob Ash in Lateritic Soil Cement Stabilization for Road Works in a Sub-Tropical Region

Authors: Ahmed O. Apampa, Yinusa A. Jimoh

Abstract:

The potential economic viability and environmental benefits of using a biomass waste, such as corn cob ash (CCA), as a pozzolan in stabilizing soils for road pavement construction in a sub-tropical region were investigated. Corn cob was obtained from Maya in South West Nigeria and processed to an ash with characteristics similar to the Class C fly ash pozzolan specified in ASTM C618-12. This was then blended with ordinary Portland cement in CCA:OPC ratios of 1:1, 1:2 and 2:1. Each of these blends was then mixed with a lateritic soil of AASHTO classification A-2-6(3) in varying percentages from 0 – 7.5% at 1.5% intervals. The soil-CCA-cement mixtures were thereafter tested for geotechnical index properties, including the BS Proctor compaction, California Bearing Ratio (CBR) and unconfined compression strength tests. The tests were repeated for a soil-cement mix without any CCA blending. The costs of the binder inputs and the optimal blends of CCA:OPC in the stabilized soil were thereafter analyzed by developing algorithms that relate the experimental data on strength parameters (Unconfined Compression Strength, UCS and California Bearing Ratio, CBR) to the bivariate independent variables CCA and OPC content, using Matlab R2011b. An optimization problem was then set up minimizing the cost of chemical stabilization of laterite with CCA and OPC, subject to the constraints of minimum strength specifications. The Evolutionary engine as well as the Generalized Reduced Gradient option of the Solver in MS Excel 2010 were used separately on the cells to obtain the optimal blend of CCA:OPC. The optimal blend attaining the required strength of 1800 kN/m² was determined for the 1:2 CCA:OPC blend as a 5.4% mix (OPC content 3.6%), compared with 4.2% for the OPC-only option, and as a 6.2% mix for the 1:1 blend (OPC content 3%). The 2:1 blend did not attain the required strength, though a gain of over 100% in UCS value was obtained over the control sample with 0% binder. Given that 0.97 tonne of CO₂ is released for every tonne of cement used (OEE, 2001), the reduced OPC requirement to attain the same result indicates the possibility of reducing the net CO₂ contribution of the construction industry to the environment by 14 – 28.5% if CCA:OPC blends are widely used in soil stabilization, going by the results of this study. The paper concludes by recommending that Nigeria and other developing countries in the sub-tropics with an abundant stock of biomass waste should look towards intensifying the use of biomass waste as fuel and of the derived ash for the production of pozzolans for road works, thereby reducing overall greenhouse gas emissions in compliance with the objectives of the United Nations Framework Convention on Climate Change.
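
The cost-minimization set-up described above can be sketched as follows, using scipy in place of the Excel Solver. The unit costs and the fitted UCS(CCA, OPC) surface below are hypothetical placeholders; the study derived its own relationships from the experimental data.

```python
# Sketch: minimize binder cost subject to a minimum UCS of 1800 kN/m^2.
import numpy as np
from scipy.optimize import minimize

COST_CCA, COST_OPC = 20.0, 60.0          # assumed cost per unit mass of each binder

def ucs(x):
    cca, opc = x
    # Hypothetical fitted strength surface (kN/m^2), standing in for the Matlab regression.
    return 400.0 + 180.0 * cca + 420.0 * opc - 15.0 * cca * opc

res = minimize(
    fun=lambda x: COST_CCA * x[0] + COST_OPC * x[1],                    # binder cost
    x0=np.array([2.0, 2.0]),
    bounds=[(0.0, 7.5), (0.0, 7.5)],                                    # % range tested
    constraints=[{"type": "ineq", "fun": lambda x: ucs(x) - 1800.0}],   # UCS >= 1800
    method="SLSQP",
)
print(res.x, ucs(res.x))
```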

Keywords: corn cob ash, biomass waste, lateritic soil, unconfined compression strength, CO2 emission

Procedia PDF Downloads 371
5203 Enhancing Residential Architecture through Generative Design: Balancing Aesthetics, Legal Constraints, and Environmental Considerations

Authors: Milena Nanova, Radul Shishkov, Damyan Damov, Martin Georgiev

Abstract:

This research paper presents an in-depth exploration of the use of generative design in urban residential architecture, with a dual focus on aligning aesthetic values with legal and environmental constraints. The study aims to demonstrate how generative design methodologies can produce residential building designs that are not only legally compliant and environmentally conscious but also aesthetically compelling. At the core of our research is a specially developed generative design framework tailored for urban residential settings. This framework employs computational algorithms to produce diverse design solutions, meticulously balancing aesthetic appeal with practical considerations. By integrating site-specific features, urban legal restrictions, and environmental factors, our approach generates designs that resonate with the unique character of urban landscapes while adhering to regulatory frameworks. The paper emphasizes the algorithmic implementation of the legal constraints and intricacies of residential architecture, exploring the potential of generative design to create visually engaging and contextually harmonious structures. This exploration also includes an analysis of how these designs align with legal building parameters, showcasing the potential for creative solutions within the confines of urban building regulations. Concurrently, our methodology integrates functional, economic, and environmental factors, investigating how generative design can be utilized to optimize building performance and to achieve a symbiotic relationship between the built environment and its natural surroundings. Through a blend of theoretical research and practical case studies, this research highlights the multifaceted capabilities of generative design and demonstrates practical applications of our framework. Our findings illustrate the rich possibilities that arise from an algorithmic design approach in the context of a vibrant urban landscape. This study contributes an alternative perspective to residential architecture, suggesting that the future of urban development lies in embracing the complex interplay between computational design innovation, regulatory adherence, and environmental responsibility.
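
A highly simplified sketch of a generative loop of the kind described, with random massing candidates filtered by hypothetical legal constraints (maximum height and plot coverage) and ranked by a simple score, is shown below. All constraint values, the scoring rule, and the parameterization are illustrative assumptions, not the framework's actual rules.

```python
# Toy generative loop: sample candidates, reject illegal ones, rank the rest.
import random

PLOT_AREA = 600.0          # m^2, assumed plot size
MAX_HEIGHT = 21.0          # m, assumed zoning height limit
MAX_COVERAGE = 0.6         # assumed maximum plot coverage ratio
FLOOR_HEIGHT = 3.0         # m per storey

def generate_candidate(rng: random.Random) -> dict:
    return {"footprint": rng.uniform(150.0, 450.0), "floors": rng.randint(2, 10)}

def is_legal(c: dict) -> bool:
    return (c["floors"] * FLOOR_HEIGHT <= MAX_HEIGHT
            and c["footprint"] / PLOT_AREA <= MAX_COVERAGE)

def score(c: dict) -> float:
    # Toy objective: maximize gross floor area while lightly penalizing bulk.
    return c["footprint"] * c["floors"] - 0.1 * c["footprint"]

rng = random.Random(0)
candidates = [generate_candidate(rng) for _ in range(1000)]
legal = [c for c in candidates if is_legal(c)]
best = max(legal, key=score)
print(best, round(score(best), 1))
```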

Keywords: generative design, computational design, parametric design, algorithmic modeling

Procedia PDF Downloads 54
5202 Use of Fractal Geometry in Machine Learning

Authors: Fuad M. Alkoot

Abstract:

The main component of a machine learning system is the classifier. Classifiers are mathematical models that can perform classification tasks for a specific application area. Additionally, multiple classifiers are often combined, using any of the available methods, to reduce the classifier error rate. The benefits gained from combining multiple classifier designs have motivated the development of diverse approaches to multiple classifiers. We aim to investigate the use of fractal geometry to develop an improved classifier combiner. Initially, we experiment with measuring the fractal dimension of data and use the results in the development of a combiner strategy.
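
A minimal sketch of one way to measure the fractal (box-counting) dimension of a 2-D data set, the kind of measurement fed into the combiner design, is shown below; the point sets and grid sizes are illustrative.

```python
# Sketch: box-counting estimate of the fractal dimension of a 2-D point set.
import numpy as np

def box_counting_dimension(points: np.ndarray, sizes=(2, 4, 8, 16, 32, 64)) -> float:
    # Normalize the points into the unit square.
    pts = (points - points.min(axis=0)) / (np.ptp(points, axis=0) + 1e-12)
    counts = []
    for n in sizes:
        # Count the occupied cells of an n x n grid.
        cells = np.unique(np.floor(pts * n).clip(0, n - 1).astype(int), axis=0)
        counts.append(len(cells))
    # Slope of log(count) vs log(grid resolution) estimates the dimension.
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return float(slope)

rng = np.random.default_rng(0)
filled = rng.uniform(size=(5000, 2))                   # space-filling cloud, dimension ~ 2
x = np.linspace(0, 1, 5000)
curve = np.column_stack([x, np.sin(4 * np.pi * x) * 0.5 + 0.5])  # smooth curve, ~ 1
print(box_counting_dimension(filled), box_counting_dimension(curve))
```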

Keywords: fractal geometry, machine learning, classifier, fractal dimension

Procedia PDF Downloads 210
5201 Towards a Deeper Understanding of 21st Century Global Terrorism

Authors: Francis Jegede

Abstract:

This paper examines essential issues relating to the rise and nature of violent extremism involving non-state actors and groups in the early 21st century. Global trends in terrorism and violent extremism are examined in relation to Western governments' counter-terror operations. The paper analyses the existing legal framework for fighting violent extremism and terrorism and highlights the inherent limitations of the current international law of war in dealing with the growing challenges posed by terrorist and violent extremist groups. The paper discusses how terrorist groups use civilians, women and children as tools and weapons of war to fuel their campaigns of terror, and suggests ways in which the international community could deal with the challenge of fighting terrorist groups without putting civilians, women and children in harm's way. The paper emphasises the need to uphold human rights values and respect for the law of war in our response to global terrorism. The paper poses the question as to whether the current legal framework for dealing with terrorist groups is sufficient without contravening the essential provisions and ethos of the international law of war and human rights. While the paper explains how terrorist groups flagrantly disregard the rule of law and disrespect human rights in their campaigns of terror, it also notes instances in which the current Western strategy of fighting terrorism may be viewed as conflicting with human rights and international law.

Keywords: terrorism, law of war, international law, violent extremism

Procedia PDF Downloads 317
5200 Discovering Event Outliers for Drug as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period, which is too long. A critical issue is therefore that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest using a prediction of average sales per sales day, which helps to protect the data from being biased downwards or upwards. The authors use a visualization method for outliers across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs over a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though there are some outlier detection methods for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use the Grubbs test, a real-life data cleaning method, for outlier adjustment. In the paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, the Grubbs test has been used to detect outliers for cancer drugs, with positive results reported. The Grubbs test is applied to detect outliers which exceed the boundaries of a normal distribution. The result is a probability that indicates the core data of actual sales. The outlier test represents the difference between the sample mean and the most extreme data point in terms of the standard deviation. The test detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs test method. The suggested framework is applicable during the shortage event and recovery periods. The proposed framework has practical value and could be used to minimize the recovery period required after a shortage event has occurred.
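
The outlier-adjustment step can be illustrated with the sketch below, which applies a two-sided Grubbs test at a 99% confidence level to a toy daily-sales series; the sales values and the single-outlier-at-a-time usage are placeholders, not the study's sample.

```python
# Sketch: Grubbs test flagging the single most extreme daily sales value (alpha = 0.01).
import numpy as np
from scipy import stats

def grubbs_outlier(values, alpha: float = 0.01):
    values = np.asarray(values, dtype=float)
    n = len(values)
    idx = np.argmax(np.abs(values - values.mean()))
    g = np.abs(values[idx] - values.mean()) / values.std(ddof=1)
    # Critical value of the Grubbs statistic from the Student t distribution.
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t ** 2 / (n - 2 + t ** 2))
    return (int(idx), values[idx]) if g > g_crit else None

daily_sales = [12, 14, 13, 15, 11, 14, 13, 12, 15, 14, 13, 2]   # last day: shortage
print(grubbs_outlier(daily_sales))  # (11, 2.0) -- the shortage-period day is flagged
```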

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 129
5199 The Notion of International Criminal Law: Between Criminal Aspects of International Law and International Aspects of Criminal Law

Authors: Magda Olesiuk-Okomska

Abstract:

Although international criminal law has grown significantly in the last decades, it remains fragmented and lacks doctrinal cohesiveness. Its concept is described in the doctrine as highly disputable, and there is no concrete definition of the term. In the domestic doctrine, the problem of criminal law issues that arise in the international setting, and of international issues that arise within national criminal law, is underdeveloped both theoretically and practically. To the best of the author's knowledge, there are no studies describing the international aspects of criminal law in a comprehensive manner, taking a more expansive view of the subject. This paper presents the results of part of a doctoral research project undertaking a theoretical framework of international criminal law. It aims at sorting out the existing terminology on the international aspects of criminal law. It demonstrates the differences between the notions of international criminal law, criminal law international and law international criminal. It confronts the notion of criminal law with related disciplines and shows their interplay. It specifies the scope of international criminal law. It diagnoses the current legal framework of the international aspects of criminal law, referring to both criminal law issues that arise in the international setting and international issues that arise in the context of national criminal law. Finally, de lege lata postulates were formulated and a direction of changes in international criminal law was proposed. The adopted research hypothesis assumed that the notion of international criminal law is inconsistent and not understood uniformly, that there is no conformity as to its place within the system of law or its objective and subjective scopes, and that the domestic doctrine does not correspond with international standards and differs from the worldwide doctrine. The research methods implemented included, inter alia, a dogmatic and legal method, an analytical method, a comparative method, as well as desk research.

Keywords: criminal law, international crimes, international criminal law, international law

Procedia PDF Downloads 298
5198 Arabic Handwriting Recognition Using Local Approach

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Optical character recognition (OCR) plays a major role at the present time. It is capable of solving many serious problems and simplifying human activities. OCR research goes back to the 1970s, and since then many solutions have been proposed, but unfortunately most supported only Latin scripts. This work proposes a system for the recognition of off-line Arabic handwriting. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area, and we also address the normalization problems we encountered. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce our contribution in the form of a segmentation algorithm.
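
As a generic baseline for the segmentation-then-classification pipeline described above (not the structural segmentation algorithm contributed by the paper), the sketch below splits a binary word image on empty vertical-projection columns, turns each segment into a crude zoning feature, and leaves classification to a standard SVM; all shapes and the toy image are illustrative assumptions.

```python
# Sketch: projection-based segmentation of a binary word image plus SVM-ready features.
import numpy as np
from sklearn.svm import SVC  # used for the classification step once labels are available

def segment_by_projection(binary_img: np.ndarray) -> list:
    """Split on columns whose vertical projection is zero (gaps between strokes)."""
    profile = binary_img.sum(axis=0)
    segments, start = [], None
    for col, value in enumerate(profile):
        if value > 0 and start is None:
            start = col
        elif value == 0 and start is not None:
            segments.append(binary_img[:, start:col])
            start = None
    if start is not None:
        segments.append(binary_img[:, start:])
    return segments

def to_feature(segment: np.ndarray, size: int = 8) -> np.ndarray:
    """Crude zoning feature: down-sample a segment to a fixed-size density grid."""
    rows = np.array_split(np.arange(segment.shape[0]), size)
    cols = np.array_split(np.arange(segment.shape[1]), size)
    return np.array([[segment[np.ix_(r, c)].mean() for c in cols] for r in rows]).ravel()

# Toy binary "word": two blobs separated by an empty column band.
img = np.zeros((32, 64), dtype=np.uint8)
img[8:24, 4:28] = 1
img[8:24, 36:60] = 1
parts = segment_by_projection(img)
features = np.array([to_feature(p) for p in parts])
print(len(parts), features.shape)  # 2 segments, one feature row per segment
# With labelled training segments, classification reduces to a standard SVM fit:
# clf = SVC(kernel="rbf").fit(train_features, train_labels); clf.predict(features)
```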

Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM

Procedia PDF Downloads 62
5197 Hybrid Knowledge Approach for Determining Health Care Provider Specialty from Patient Diagnoses

Authors: Erin Lynne Plettenberg, Jeremy Vickery

Abstract:

In an access-control situation, the role of a user determines whether a data request is appropriate. This paper combines vetted web mining and logic modeling to build a lightweight system for determining the role of a health care provider based only on their prior authorized requests. The model identifies provider roles with 100% recall from very little data. This shows the value of vetted web mining in AI systems, and suggests the impact of the ICD classification on medical practice.

Keywords: electronic medical records, information extraction, logic modeling, ontology, vetted web mining

Procedia PDF Downloads 170
5196 Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enables individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that have not evolved in nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. Constructing computational models to realize genetic circuits is an especially challenging task, since it requires discovering the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expression in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take into account the semantic context present in long DNA chains, which depends heavily on the spatial arrangement of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, suffer greatly from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck over long input sequences. In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes spanning thousands of tokens. To address the above-mentioned limitations of previous approaches, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with its attention mechanism. In previous work on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machine, and Artificial Neural Networks, achieved reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, achieved a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier does not depend on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
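The abstract reports results only; as a rough sketch of what fine-tuning a DNABERT-style bidirectional encoder for sequence classification can look like with the Hugging Face transformers library (the checkpoint name, the two-label setup, and the toy sequence below are assumptions, not the paper's configuration), consider:

```python
# Sketch: fine-tuning a BERT-style DNA encoder for circuit-state classification.
# Checkpoint name and the 2-class setup are placeholders, not the paper's exact setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

CHECKPOINT = "zhihan1996/DNA_bert_6"  # assumed publicly available DNABERT checkpoint

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=2)

# DNABERT expects k-mer tokens; a 6-mer split of a toy sequence is shown here.
def to_kmers(seq, k=6):
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

batch = tokenizer([to_kmers("ATGCGTACGTTAGC")], return_tensors="pt", padding=True)
labels = torch.tensor([1])  # hypothetical "expressed" label

outputs = model(**batch, labels=labels)
outputs.loss.backward()  # an optimizer step would follow in a real fine-tuning loop
print(outputs.logits)
```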

Keywords: transformers, generative ai, gene expression design, classification

Procedia PDF Downloads 57
5195 Software Architectural Design Ontology

Authors: Muhammad Irfan Marwat, Sadaqat Jan, Syed Zafar Ali Shah

Abstract:

Software architecture plays a key role in software development, but the absence of a formal description of software architecture causes various impediments during development. To cope with these difficulties, an ontology has been used as an artifact. This paper proposes an ontology for software architectural design based on the IEEE model for architecture description and the Kruchten 4+1 model for viewpoint classification. ISO/IEC 42010 has been used for the categorization of styles and views. A corpus method has been used to evaluate the ontology. The main aim of the proposed ontology is to classify and locate software architectural design information.
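As a purely illustrative sketch of how such an ontology could be expressed (a few RDF triples organized around the Kruchten 4+1 viewpoints, using rdflib; the namespace, class names, and properties are invented for this example and are not the ontology proposed in the paper):

```python
# Illustrative only: a handful of RDF triples for an architecture-description ontology.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

ARCH = Namespace("http://example.org/arch#")
g = Graph()
g.bind("arch", ARCH)

# Core classes: an architecture description is organized into viewpoints and views.
for cls in ("ArchitectureDescription", "Viewpoint", "View", "Stakeholder", "Concern"):
    g.add((ARCH[cls], RDF.type, RDFS.Class))

# Kruchten 4+1 viewpoints as individuals of Viewpoint.
for vp in ("Logical", "Process", "Development", "Physical", "Scenarios"):
    g.add((ARCH[vp + "Viewpoint"], RDF.type, ARCH.Viewpoint))
    g.add((ARCH[vp + "Viewpoint"], RDFS.label, Literal(vp + " viewpoint")))

# A view conforms to a viewpoint (domain/range constraints on the property).
g.add((ARCH.conformsTo, RDFS.domain, ARCH.View))
g.add((ARCH.conformsTo, RDFS.range, ARCH.Viewpoint))

print(g.serialize(format="turtle"))
```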

Keywords: semantic-based software architecture, software architecture, ontology, software engineering

Procedia PDF Downloads 538
5194 Infodemic Detection on Social Media with a Multi-Dimensional Deep Learning Framework

Authors: Raymond Xu, Cindy Jingru Wang

Abstract:

Social media has become a globally connected and influential platform. Social media data, such as tweets, can help predict the spread of pandemics and provide individuals and healthcare providers with early warnings. Public psychological reactions and opinions can be efficiently monitored by AI models tracking the progression of dominant topics on Twitter. However, statistics show that as the coronavirus spreads, so does an infodemic of misinformation driven by pandemic-related factors such as unemployment and lockdowns. Social media algorithms are often biased toward outrage, promoting content that people react to emotionally and are likely to engage with. This can influence users’ attitudes and cause confusion. Social media is therefore a double-edged sword, and combating fake news and biased content has become an essential task. This research analyzes the variety of methods used for fake news detection, covering random forest, logistic regression, support vector machines, decision trees, naive Bayes, BoW, TF-IDF, LDA, CNN, RNN, LSTM, DeepFake, and hierarchical attention networks, and the performance of each method is analyzed. Based on these models’ achievements and limitations, a multi-dimensional AI framework is proposed to achieve higher accuracy in infodemic detection, especially for pandemic-related news. The model is trained on contextual content, images, and news metadata.
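The surveyed baselines are not specified in detail here; as a minimal illustration of one of the simplest (a TF-IDF text representation fed to a logistic regression classifier, with fabricated toy examples rather than real headlines or data), a sketch follows.

```python
# Toy sketch of a TF-IDF + logistic regression baseline for fake-news detection.
# The training texts and labels are fabricated placeholders, not real data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Health agency publishes updated vaccination schedule for the season.",
    "Miracle kitchen spice cures the virus overnight, doctors furious!",
]
labels = [0, 1]  # 0 = credible, 1 = misinformation (illustrative labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["Overnight cure found in common kitchen spice, experts stunned"]))
```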

Keywords: artificial intelligence, fake news detection, infodemic detection, image recognition, sentiment analysis

Procedia PDF Downloads 245
5193 Automatic Differential Diagnosis of Melanocytic Skin Tumours Using Ultrasound and Spectrophotometric Data

Authors: Kristina Sakalauskiene, Renaldas Raisutis, Gintare Linkeviciute, Skaidra Valiukeviciene

Abstract:

Cutaneous melanoma is a melanocytic skin tumour with a very poor prognosis: it is highly resistant to treatment and tends to metastasize. Melanoma thickness is one of the most important biomarkers for disease stage, prognosis, and surgery planning. In this study, we hypothesized that automatic analysis of spectrophotometric images and high-frequency 2D ultrasound data can improve the differential diagnosis of cutaneous melanoma and provide additional information about tumour penetration depth. This paper presents a novel automatic system for non-invasive differential diagnosis of melanocytic skin tumours and evaluation of penetration depth. The system comprises region-of-interest segmentation in spectrophotometric images and high-frequency ultrasound data, quantitative parameter evaluation, informative feature extraction, and classification with a linear regression classifier. Segmentation of the melanocytic skin tumour region in ultrasound images is based on calculation of the parametric integrated backscattering coefficient, while segmentation of the optical images is based on Otsu thresholding. In total, 29 quantitative tissue characterization parameters were evaluated from the ultrasound data (11 acoustical, 4 shape, and 15 textural parameters) together with 55 quantitative features of the dermatoscopic and spectrophotometric images (total melanin, dermal melanin, blood, and collagen SIAgraphs acquired with the SIAscope spectrophotometric imaging device). In total, 102 melanocytic skin lesions (including 43 cutaneous melanomas) were examined using the SIAscope and an ultrasound system with a 22 MHz center-frequency single-element transducer. The diagnosis and Breslow thickness (pT) of each melanocytic skin tumour were determined during routine histological examination after excision and used as a reference. The results of this study show that automatic analysis of spectrophotometric and high-frequency ultrasound data can improve the non-invasive classification accuracy of early-stage cutaneous melanoma and provide supplementary information about tumour penetration depth.
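The Otsu-based segmentation of the optical images can be illustrated with a short sketch (using scikit-image on a synthetic placeholder image; this is a generic demonstration of Otsu thresholding, not the authors' processing pipeline).

```python
# Minimal illustration of Otsu thresholding for lesion segmentation in an optical image.
# A synthetic grayscale image stands in for real spectrophotometric data.
import numpy as np
from skimage.filters import threshold_otsu

# Placeholder image: a dark circular "lesion" on a brighter background, plus noise.
yy, xx = np.mgrid[0:256, 0:256]
image = np.where((yy - 128) ** 2 + (xx - 128) ** 2 < 60 ** 2, 0.3, 0.8)
image += 0.05 * np.random.default_rng(0).standard_normal(image.shape)

t = threshold_otsu(image)   # global threshold separating lesion and surrounding skin
lesion_mask = image < t     # lesion pixels are darker than the background
print(f"threshold = {t:.3f}, lesion area = {lesion_mask.sum()} px")
```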

Keywords: cutaneous melanoma, differential diagnosis, high-frequency ultrasound, melanocytic skin tumours, spectrophotometric imaging

Procedia PDF Downloads 267