Search results for: query complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1780


1390 An Overview of the Porosity Classification in Carbonate Reservoirs and Their Challenges: An Example of Macro-Microporosity Classification from Offshore Miocene Carbonate in Central Luconia, Malaysia

Authors: Hammad T. Janjuhah, Josep Sanjuan, Mohamed K. Salah

Abstract:

Biological and chemical activities in carbonates are responsible for the complexity of the pore system. Primary porosity is generally of natural origin, while secondary porosity arises from chemical reactivity through diagenetic processes. Understanding the carbonate pore system is an integral part of hydrocarbon exploration. However, current porosity classification schemes are too limited to adequately predict the petrophysical properties of reservoirs with various origins and depositional environments. Rock classification provides a descriptive method for explaining the lithofacies but makes no significant contribution to the application of porosity and permeability (poro-perm) correlation. The Central Luconia carbonate system (Malaysia) represents a good example of pore complexity (in terms of nature and origin), mainly related to diagenetic processes which have altered the original reservoir. For quantitative analysis, 32 high-resolution images of each thin section were taken using transmitted light microscopy. The quantification of grains, matrix, cement, and macroporosity (pore types) was achieved using petrographic analysis of thin sections and FESEM images. The point counting technique was used to estimate the amount of macroporosity from the thin sections, which was then subtracted from the total porosity to derive the microporosity. The quantitative observation of thin sections revealed that mouldic porosity (macroporosity) is the dominant porosity type, whereas microporosity accounts for 40 to 50% of the total porosity. These Miocene carbonates were shown to contain a significant amount of microporosity, which significantly complicates the estimation and production of hydrocarbons; neglecting its impact increases the uncertainty of hydrocarbon reserve estimates.
Due to the diversity of geological parameters, applying existing porosity classifications does not allow a better understanding of the poro-perm relationship. The classification can, however, be improved by including pore types and pore structures, divided into macro- and microporosity. Such studies of microporosity identification/classification now represent a major concern in limestone reservoirs around the world.

Keywords: overview of porosity classification, reservoir characterization, microporosity, carbonate reservoir

Procedia PDF Downloads 125
1389 Biomimetic Systems to Reveal the Action Mode of Epigallocatechin-3-Gallate in Lipid Membrane

Authors: F. Pires, V. Geraldo, O. N. Oliveira Jr., M. Raposo

Abstract:

Catechins are powerful antioxidants with attractive properties for tumor therapy. Owing to their antioxidant activity, these molecules can act as scavengers of reactive oxygen species (ROS), alleviating the damage to the cell membrane induced by oxidative stress. The complexity and dynamic nature of the cell membrane complicate the analysis of the biophysical interactions between a drug and the cell membrane and restrict the transport or uptake of the drug by intracellular targets. To avoid the complexity of the cell membrane, we used biomimetic systems, namely liposomes and Langmuir monolayers, to study the interaction between catechin and membranes at the molecular level. Liposomes were formed by dispersing anionic 1,2-dipalmitoyl-sn-glycero-3-[phospho-rac-(1-glycerol)] (sodium salt) (DPPG) phospholipids in an aqueous solution; they mimic the arrangement of lipids in natural cell membranes and allow the entrapment of catechins. Langmuir monolayers were formed by dropping amphiphilic DPPG phospholipids, dissolved in an organic solvent, onto the water surface. In this work, we mixed epigallocatechin-3-gallate (EGCG) with DPPG liposomes and exposed them to ultraviolet radiation in order to evaluate the antioxidant potential of these molecules against radiation-induced oxidative stress. The presence of EGCG in the mixture decreased the rate of lipid peroxidation, showing that EGCG protects membranes through the quenching of reactive oxygen species. Given the large number of hydroxyl (OH) groups on the structure of EGCG, a possible mechanism by which these molecules interact with the membrane is hydrogen bonding. We also investigated the effect of EGCG at various concentrations on DPPG Langmuir monolayers. The surface pressure isotherms and polarization-modulated infrared reflection-absorption spectroscopy (PM-IRRAS) results corroborate the absorbance measurements performed on the liposome model, showing that EGCG interacts with the polar heads of the monolayers.
This study elucidates the physiological action of EGCG incorporated in lipid membranes. These results are also relevant for improving the current protocols used to incorporate catechins in drug delivery systems.

Keywords: catechins, lipid membrane, anticancer agent, molecular interactions

Procedia PDF Downloads 208
1388 Software Architecture Optimization Using Swarm Intelligence Techniques

Authors: Arslan Ellahi, Syed Amjad Hussain, Fawaz Saleem Bokhari

Abstract:

Software architecture can be optimized with respect to quality attributes (QAs). In this paper, we analyze multiple research papers, from different dimensions, that have been used to classify those attributes. As a contribution to this critical optimization problem of software architecture, we propose a swarm intelligence technique based on the ant colony optimization metaheuristic. We rank the quality attributes, run our algorithm on each QA, and then rank the QAs on the basis of accuracy, finally selecting the most accurate quality attributes. The ant colony algorithm is effective and performs well in optimizing and ranking the QAs.

Keywords: complexity, rapid evolution, swarm intelligence, dimensions

Procedia PDF Downloads 233
1387 Lacunarity Measures on Mammographic Images Applying Fractal Dimension and Lacunarity Measures

Authors: S. Sushma, S. Balasubramanian, K. C. Latha, R. Sridhar

Abstract:

Structural texture measures are used to address breast cancer risk assessment in screening mammograms. The current study investigates whether texture properties characterized by local fractal dimension (FD) and lacunarity contribute to assessing breast cancer risk. Fractal dimension represents complexity, while lacunarity characterizes the gaps of a fractal. In this paper, we present results confirming that, in our algorithm applied to mammogram images, the lacunarity value is low when the fractal dimension value is high.
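The gap measure referred to above can be made concrete. The sketch below is a simplified illustration rather than the paper's actual pipeline: it computes gliding-box lacunarity Lambda(r) = <M^2>/<M>^2 over a binary image, where M is the number of set pixels in an r x r box. The fractal dimension itself would typically come from a separate box-counting step.

```python
def lacunarity(img, r):
    """Gliding-box lacunarity of a binary image (list of 0/1 rows).

    Slide an r x r box over every position, record its mass M (number of
    set pixels), and return Lambda(r) = <M^2> / <M>^2. A perfectly uniform
    image gives 1.0; larger gaps give larger lacunarity.
    """
    rows, cols = len(img), len(img[0])
    masses = []
    for i in range(rows - r + 1):
        for j in range(cols - r + 1):
            masses.append(sum(img[i + di][j + dj]
                              for di in range(r) for dj in range(r)))
    mean = sum(masses) / len(masses)
    mean_sq = sum(m * m for m in masses) / len(masses)
    return mean_sq / (mean * mean)
```

On a uniform image the ratio is exactly 1; an image that is mostly empty space (gaps) yields a value well above 1, matching the intuition that lacunarity measures "gappiness".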

Keywords: breast cancer, fractal dimension, image analysis, lacunarity, mammogram

Procedia PDF Downloads 360
1386 European Countries Challenge’s in Value Added Tax

Authors: Fatbardha Kadiu, Nulifer Caliskan

Abstract:

The value added tax (VAT) arose from the necessity of substituting the old tax on sales. Owing to its advantages, this tax is nowadays used successfully in more than 140 countries around the world. The aim of the paper is to describe the nature of this tax, with its advantages and disadvantages. It also describes how VAT functions in most European countries and the current challenges these countries face with it. The paper presents the types of goods that are exempt from this tax, along with the reasons for and consequences of those exemptions. It is based on secondary data taken from the respective literature. An econometric model is presented in order to identify the dependence of the value added tax on other parameters. The analysis mainly concerns the two main principles of harmonization and billing in the fiscal system and ways to restructure the system in order to minimize fiscal evasion.

Keywords: value added tax, revenues, complexity, legal uncertainty

Procedia PDF Downloads 377
1385 Visual Impairment Through Contextualized Lived Experiences: The Story of James

Authors: Jentel Van Havermaet, Geert Van Hove, Elisabeth De Schauwer

Abstract:

This study re-conceptualizes visual impairment in the interdependent context of James, his family, and allies. Living with a visual impairment is understood as an entanglement of assemblages, dynamics, disablism, and systems. We narrated this diffractively through two meaningful events: decisions and processes concerning (inclusive) education, and hindrances in connecting with others. We entangled and (un)raveled lived experiences in assemblages in which the contextualized meaning of visual impairment became clearer. The contextualized narrative of James interwove complex intra-actions and showed the complexity and contextualization of entangled relationalities.

Keywords: disability studies, contextualization, visual impairment, assemblage, entanglement, lived experiences

Procedia PDF Downloads 147
1384 Selecting the Best Sub-Region Indexing the Images in the Case of Weak Segmentation Based on Local Color Histograms

Authors: Mawloud Mosbah, Bachir Boucheham

Abstract:

The color histogram is considered the oldest method used by CBIR systems for indexing images. Global histograms, however, do not include spatial information, which is why later techniques have attempted to overcome this limitation by involving segmentation as a preprocessing step. Weak segmentation is employed by local histograms, while other methods, such as the CCV (Color Coherence Vector), are based on strong segmentation. Indexation based on local histograms consists of splitting the image into N overlapping blocks or sub-regions and then computing the histogram of each block. Measuring the dissimilarity between two images is consequently reduced to computing the distances between the N local histograms of both images, yielding N*N values; generally, the lowest value is used to rank images, meaning that the lowest value designates which sub-region is used to index the images of the collection being queried. In this paper, we examine the local histogram indexation method in order to compare its results against those given by the global histogram. We also address another noteworthy issue when relying on local histograms, namely which value, among the N*N values, to trust when comparing images, in other words, on which of the N*N sub-region pairs to base the index. Based on the results achieved here, relying on local histograms, which imposes extra overhead on the system through an additional preprocessing step, namely segmentation, does not necessarily produce better results. In addition, we propose some ideas for selecting the local histogram used to encode the image, rather than simply relying on the local histogram with the lowest distance to the query histograms.
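As an illustration of the scheme described above, the local histograms of two images can be compared pairwise and the lowest of the N*N Euclidean distances retained for ranking. This is a minimal sketch, assuming non-overlapping blocks and a coarse 4-bin grayscale histogram, whereas the method in the abstract uses overlapping blocks and color histograms:

```python
import math

def block_histogram(img, r0, c0, h, w, bins=4, max_val=256):
    """Intensity histogram of one sub-region (block) of a grayscale image."""
    hist = [0] * bins
    for r in range(r0, r0 + h):
        for c in range(c0, c0 + w):
            hist[img[r][c] * bins // max_val] += 1
    return hist

def local_histograms(img, block=2):
    """Split the image into non-overlapping block x block sub-regions."""
    return [block_histogram(img, r0, c0, block, block)
            for r0 in range(0, len(img), block)
            for c0 in range(0, len(img[0]), block)]

def euclidean(h1, h2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

def min_distance(img_a, img_b, block=2):
    """Compare all N x N pairs of local histograms and keep the lowest
    distance, which is the value used to rank images in this scheme."""
    ha, hb = local_histograms(img_a, block), local_histograms(img_b, block)
    return min(euclidean(x, y) for x in ha for y in hb)
```

Identical images yield a minimum distance of zero; the open question raised in the abstract is whether the minimum over all N*N pairs is actually the right value to trust.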

Keywords: CBIR, color global histogram, color local histogram, weak segmentation, Euclidean distance

Procedia PDF Downloads 338
1383 A Survey on Concurrency Control Methods in Distributed Database

Authors: Seyed Mohsen Jameii

Abstract:

In recent years, remarkable improvements have been made in the performance of distributed database systems. A distributed database is composed of several sites connected to each other through network connections. In such a system, if different transactions are not well coordinated, the database may become inconsistent. Nowadays, because of the complexity of the many sites and their connection methods, it is difficult to execute different transaction models in a distributed database serially. The principal goal of concurrency control in a distributed database is to ensure that access to the common database by different sites does not cause interference. Different concurrency control algorithms have been suggested for use in distributed database systems. In this paper, some available methods for concurrency control in distributed databases are introduced and compared.
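One of the classic methods surveyed here, two-phase locking, can be sketched as follows. This is a toy single-process illustration of the locking discipline (a growing phase, then a shrinking phase), not a distributed implementation:

```python
class Transaction:
    """Two-phase locking sketch: a transaction may acquire locks only while
    it is in its growing phase; releasing any lock starts the shrinking
    phase, after which further acquisitions are refused."""

    def __init__(self, name):
        self.name = name
        self.held = set()
        self.shrinking = False

    def acquire(self, lock_table, item):
        if self.shrinking:
            raise RuntimeError("2PL violation: acquire after release")
        owner = lock_table.get(item)
        if owner is not None and owner is not self:
            return False          # conflict: another transaction holds it
        lock_table[item] = self
        self.held.add(item)
        return True

    def release(self, lock_table, item):
        self.shrinking = True     # shrinking phase begins
        self.held.discard(item)
        if lock_table.get(item) is self:
            del lock_table[item]
```

The two phases are exactly what guarantees conflict-serializable schedules; the distributed variants discussed in the survey add lock tables per site and a protocol for coordinating them.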

Keywords: distributed database, two phase locking protocol, transaction, concurrency

Procedia PDF Downloads 325
1382 Query in Grammatical Forms and Corpus Error Analysis

Authors: Katerina Florou

Abstract:

Two decades after the term "learner corpora" was coined for collections of texts created by foreign or second language learners across various language contexts, and some years after the suggestion to incorporate "focusing on form" within a Task-Based Learning framework, this study aims to explore how learner corpora, whether annotated with errors or not, can facilitate a focus on form in an educational setting. It argues that analyzing linguistic form serves the purpose of enabling students to delve into language and gain an understanding of different facets of the foreign language. The same objective applies when analyzing learner corpora marked with errors or in their raw state, but in that scenario, the emphasis lies on identifying incorrect forms. Teachers should aim to address errors or gaps in the students' second language knowledge while the students engage in a task. Building on this recommendation, we compared the written output of two student groups: the first group (G1) employed the focusing-on-form phase by studying a specific aspect of the Italian language, namely the past participle, through examples from native speakers and grammar rules; the second group (G2) focused on form by scrutinizing their own errors and comparing them with analogous examples from a native speaker corpus. In order to test our hypothesis, we created four learner corpora. The first two were generated during the task phase, one for each group of students, while the remaining two were produced as a follow-up activity at the end of the lesson. The results of the first comparison indicate that students' exposure to their own errors can enhance their grasp of a grammatical element. The study is in its second stage, and more results are to be announced.

Keywords: corpus interlanguage analysis, task based learning, Italian language as F1, learner corpora

Procedia PDF Downloads 25
1381 Reverse Logistics Information Management Using Ontological Approach

Authors: F. Lhafiane, A. Elbyed, M. Bouchoum

Abstract:

The reverse logistics (RL) process is a complex and dynamic network that involves many stakeholders, such as suppliers, manufacturers, warehouses, retailers, and customers. This complexity is inherent to the process due to the lack of perfect knowledge and to conflicting information. Ontologies, on the other hand, can be considered an approach to overcoming the problems of sharing knowledge and communicating among the various reverse logistics partners. In this paper, we propose a semantic representation based on a hybrid architecture for building the ontologies in a bottom-up (ascendant) way; this method facilitates semantic reconciliation between the heterogeneous information systems (ICT) that support reverse logistics processes and product data.

Keywords: reverse logistics, information management, heterogeneity, ontologies, semantic web

Procedia PDF Downloads 468
1380 Adaptive Multiple Transforms Hardware Architecture for Versatile Video Coding

Authors: T. Damak, S. Houidi, M. A. Ben Ayed, N. Masmoudi

Abstract:

The Versatile Video Coding (VVC) standard is currently under development by the Joint Video Exploration Team (JVET). An Adaptive Multiple Transforms (AMT) approach has been announced; it is based on different transform modules that provide efficient coding. However, the AMT solution raises several issues, especially regarding the complexity of the selected set of transforms. This can be an important obstacle, particularly for future industrial adoption. This paper proposes an efficient hardware implementation of the most frequently used transform in the AMT approach: the DCT-II. The developed circuit is adapted to different block sizes and can reach a minimum frequency of 192 MHz, allowing an optimized execution time.
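For reference, the DCT-II named above can be written directly from its defining formula. The naive O(N^2) version below is a software sketch, useful as a golden model to check a fast butterfly or hardware implementation against; it is not the paper's circuit:

```python
import math

def dct_ii(x):
    """Naive 1-D DCT-II (unscaled): X_k = sum_n x_n * cos(pi/N * (n + 1/2) * k).

    O(N^2) reference implementation; hardware designs factor this matrix
    into butterfly stages, but must match these outputs (up to scaling).
    """
    N = len(x)
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k)
                for n in range(N))
            for k in range(N)]
```

A quick sanity check: a constant input has all its energy in the DC coefficient X_0, with the remaining coefficients vanishing by symmetry of the cosine basis.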

Keywords: adaptive multiple transforms, AMT, DCT II, hardware, transform, versatile video coding, VVC

Procedia PDF Downloads 123
1379 Simulation and Hardware Implementation of Data Communication Between CAN Controllers for Automotive Applications

Authors: R. M. Kalayappan, N. Kathiravan

Abstract:

In the automotive industry, the Controller Area Network (CAN) is widely used to reduce system complexity and support inter-task communication. This paper therefore presents a hardware implementation of data frame communication from one controller to another. The CAN data frames and protocols are explained in detail here. The data frames are transferred without any collision or corruption. The simulation was performed in the KEIL µVision software to display the data transfer between transmitter and receiver on the CAN bus. An ARM7 microcontroller is used to transfer data between the controllers in real time, and the data transfer is verified using a CRO (oscilloscope).

Keywords: controller area network (CAN), automotive electronic control unit, CAN 2.0, industry

Procedia PDF Downloads 373
1378 Omni-Modeler: Dynamic Learning for Pedestrian Redetection

Authors: Michael Karnes, Alper Yilmaz

Abstract:

This paper presents the application of the Omni-Modeler to pedestrian redetection. The pedestrian redetection task creates several challenges for deep neural networks (DNNs) due to the variation of pedestrian appearance with camera position, the variety of environmental conditions, and the specificity required to recognize one pedestrian from another. DNNs require significant training sets and are not easily adapted to changes in class appearances or changes in the set of classes held in their knowledge domain. Pedestrian redetection requires an algorithm that can actively manage its knowledge domain as individuals move in and out of the scene, as well as learn individual appearances from a few frames of a video. The Omni-Modeler is a dynamically learning few-shot visual recognition algorithm developed for tasks with limited training data availability. It adapts the knowledge domain of pre-trained deep neural networks to novel concepts with a calculated localized language encoder. The Omni-Modeler knowledge domain is generated by creating a dynamic dictionary of concept definitions, which is directly updatable as new information becomes available. Query images are identified through nearest-neighbor comparison to the learned object definitions. The study presented in this paper evaluates its performance in re-identifying individuals as they move through a scene in both single-camera and multi-camera tracking applications. The results demonstrate that the Omni-Modeler shows potential for cross-camera pedestrian redetection and is highly effective for single-camera redetection, with 93% accuracy across 30 individuals using 64 example images per individual.
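The dictionary-plus-nearest-neighbor mechanism described above can be illustrated with a toy sketch. Here identities map to plain feature vectors and similarity is cosine; the actual Omni-Modeler derives its features from a pre-trained DNN and a language encoder, so everything below is an assumption-laden simplification of the bookkeeping only:

```python
import math

def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class ConceptDictionary:
    """Dynamic dictionary of concept definitions: each identity maps to a
    list of feature vectors, updatable as new frames arrive; queries are
    answered by nearest-neighbor comparison over all stored definitions."""

    def __init__(self):
        self.defs = {}

    def update(self, name, feature):
        """Add a new example feature for an identity (few-shot update)."""
        self.defs.setdefault(name, []).append(feature)

    def remove(self, name):
        """Drop an identity when it leaves the scene."""
        self.defs.pop(name, None)

    def query(self, feature):
        """Return the identity whose stored definition is most similar."""
        best, best_sim = None, -2.0
        for name, feats in self.defs.items():
            for f in feats:
                s = cosine_sim(feature, f)
                if s > best_sim:
                    best, best_sim = name, s
        return best
```

The key property the abstract emphasizes is that `update` and `remove` let the knowledge domain change at runtime, with no retraining step between queries.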

Keywords: dynamic learning, few-shot learning, pedestrian redetection, visual recognition

Procedia PDF Downloads 48
1377 The Essence of Culture and Religion in Creating Disaster Resilient Societies through Corporate Social Responsibility

Authors: Repaul Kanji, Rajat Agrawal

Abstract:

In this era, where issues like climate change and disasters are topics of discussion at national and international forums, humanity very often questions the causative role of corporations in such events. It is beyond doubt that rapid industrialisation and development have taken a toll in the form of climate change and, in some cases, even disasters. It is therefore natural to demand that a corporation fulfil its responsibilities in the form of rescue and relief in times of disaster, rehabilitation, and even mitigation and preparedness to adapt to oncoming changes. But how can the responsibilities of corporations be channelised to ensure all this, i.e., to develop a resilient society? More than that, which factors, when emphasised, can lead to the holistic development of society? To answer this query, an extensive literature review was conducted to identify several enablers, such as the legislation of a nation, the role of brand and reputation, the ease of doing Corporate Social Responsibility, the mission and vision of an organisation, and religion and culture, as tools for building disaster resilience. A questionnaire survey and interviews with experts and academicians, followed by interpretive structural modelling (ISM), were used to construct a multi-hierarchy model depicting the contextual relationships among the identified enablers. The study revealed that culture and religion are the most powerful drivers, affecting the other enablers either directly or indirectly. Taking cognisance of the fact that the idea of a separation between religion and the workplace (business) resides subconsciously within society, the study interprets the outcome of the ISM through the lenses of past research (The Integrating Box) and explores how it can be leveraged to build a resilient society.

Keywords: corporate social responsibility, interpretive structural modelling, disaster resilience and risk reduction, the integration box (TIB)

Procedia PDF Downloads 184
1376 Turing Pattern in the Oregonator Revisited

Authors: Elragig Aiman, Dreiwi Hanan, Townley Stuart, Elmabrook Idriss

Abstract:

In this paper, we reconsider the analysis of the Oregonator model. We highlight an error in this analysis which leads to an incorrect depiction of the parameter region in which diffusion-driven instability is possible. We believe that the cause of the oversight is the complexity of stability analyses based on eigenvalues and the dependence on parameters of the matrix minors appearing in stability calculations. We regenerate the parameter space where Turing patterns can be seen, and we use the common Lyapunov function (CLF) approach, which is numerically reliable, to further confirm the dependence of the results on the diffusion coefficient intensities.

Keywords: diffusion driven instability, common Lyapunov function (CLF), Turing pattern, positive-definite matrix

Procedia PDF Downloads 336
1375 Performance Comparison of Prim’s and Ant Colony Optimization Algorithm to Select Shortest Path in Case of Link Failure

Authors: Rimmy Yadav, Avtar Singh

Abstract:

Ant colony optimization (ACO) is a promising modern approach to combinatorial optimization. Here, ACO is applied to finding the shortest path during a communication link failure. In this paper, the performances of Prim's algorithm and ACO are compared. Using time complexity and program execution time as the set of parameters, we demonstrate the good performance of ACO in finding an excellent solution for the shortest path during a communication link failure.
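For reference, Prim's algorithm (which grows a minimum spanning tree; the paper compares it against ACO in the link-failure setting) can be sketched with a binary heap. In this illustrative sketch, a link failure is modeled simply by deleting the failed edge from the graph and recomputing:

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm over an adjacency dict {node: {neighbor: weight}}.

    Returns (total_weight, edges) of the minimum spanning tree reachable
    from `start`. On a link failure, drop the failed edge and re-run.
    """
    visited = {start}
    heap = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(heap)
    total, tree = 0, []
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue              # stale entry: v was reached more cheaply
        visited.add(v)
        total += w
        tree.append((u, v, w))
        for nxt, w2 in graph[v].items():
            if nxt not in visited:
                heapq.heappush(heap, (w2, v, nxt))
    return total, tree
```

With a binary heap this runs in O(E log V), which is the time-complexity baseline the ACO variant is measured against in the comparison.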

Keywords: ant colony optimization, link failure, prim’s algorithm, shortest path

Procedia PDF Downloads 372
1374 Artificial Intelligence Technologies Used in Healthcare: Its Implication on the Healthcare Workforce and Applications in the Diagnosis of Diseases

Authors: Rowanda Daoud Ahmed, Mansoor Abdulhak, Muhammad Azeem Afzal, Sezer Filiz, Usama Ahmad Mughal

Abstract:

This paper discusses important aspects of AI in the healthcare domain. The increase of healthcare data in both size and complexity opens more room for artificial intelligence applications. Our focus is to review the main AI methods within the scope of the healthcare domain. The results of the review show that recommendations for diagnosis, recommendations for treatment, patient engagement, and administrative tasks are the key applications of AI in healthcare. Understanding the potential of AI methods in the healthcare domain would benefit healthcare practitioners and improve patient outcomes.

Keywords: AI in healthcare, technologies of AI, neural network, future of AI in healthcare

Procedia PDF Downloads 87
1373 Green Public Procurement in Open Access and Traditional Journals: A Comparative Bibliometric Analysis

Authors: Alonso-Cañadas J., Galán-Valdivieso F., Saraite-Sariene L., García-Tabuyo M., Alonso-Morales N.

Abstract:

Green Public Procurement (GPP) has recently gained attention in the academic and policy arenas, since climate change has shown the need for action by both private companies and public entities. This growing interest motivates this article, which aims to identify the most influential journals, publishers, categories, and topics, as well as the recent trends and future research lines in GPP. Based on the Web of Science database, 578 articles on GPP from 2004 to February 2022 are analyzed using Bibliometrix, an R tool for performing bibliometric analysis, and Google's BigQuery and Data Studio. This article presents a variety of findings. First, the most influential journals by far are "Journal of Cleaner Production" and "Sustainability," differing in that the latter is open access while the former publishes via traditional subscription. The same pattern holds for the main publishers (Elsevier and MDPI). These features lead us to split the sample into open-access and traditional journals to delve into the similarities and differences between them, confirming that traditional journals exhibit a higher degree of influence in the literature than their open-access counterparts in terms of the number of documents, number of citations, and impact (according to the H-index). Second, this research highlights the recent emergence of green-related terms (sustainable, environment) and, in parallel, the increase in categorizing GPP papers under "green" WoS categories, particularly since 2019. Finally, a number of related topics are emerging and will lead the research, such as food security, infrastructure, and implementation barriers of GPP.

Keywords: bibliometric analysis, green public procurement, open access, traditional journals

Procedia PDF Downloads 75
1372 Implementation of Iterative Algorithm for Earthquake Location

Authors: Hussain K. Chaiel

Abstract:

Developments in the field of digital signal processing (DSP) and microelectronics technology have reduced the complexity of iterative algorithms that need a large number of arithmetic operations. Virtex Field Programmable Gate Arrays (FPGAs) are programmable silicon foundations which offer an important solution for addressing the needs of high-performance DSP designers. In this work, Virtex-7 FPGA technology is used to implement an iterative algorithm to estimate the earthquake location. Simulation results show that an implementation based on the RAMB36E1 block and DSP48E1 slices of the Virtex-7 reduces the number of clock cycles. This enables the algorithm to be used for earthquake prediction.

Keywords: DSP, earthquake, FPGA, iterative algorithm

Procedia PDF Downloads 359
1371 Solving SPDEs by Least Squares Method

Authors: Hassan Manouzi

Abstract:

We present in this paper a useful strategy to solve stochastic partial differential equations (SPDEs) involving stochastic coefficients. Using the Wick product of higher order and the Wiener-Itô chaos expansion, the SPDE is reformulated as a large system of deterministic partial differential equations. To reduce the computational complexity of this system, we use a decomposition-coordination method. To obtain the chaos coefficients in the corresponding deterministic equations, we use a least squares formulation. Once this approximation is performed, the statistics of the numerical solution can be easily evaluated.
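The least squares step can be illustrated in miniature. The sketch below solves min ||A c - b||^2 via the normal equations (A^T A) c = A^T b with plain Gaussian elimination; in the SPDE setting, the columns of A would (under our reading of the abstract) hold evaluations of the chaos basis and c the chaos coefficients, but the example itself is a generic overdetermined fit:

```python
def least_squares(A, b):
    """Solve min ||A c - b||^2 via the normal equations (A^T A) c = A^T b.

    A is m x n (m >= n, full column rank) as a list of rows; b has length m.
    Uses Gaussian elimination with partial pivoting; fine for small systems,
    though QR is preferred when A^T A is ill-conditioned.
    """
    m, n = len(A), len(A[0])
    # Form the normal equations.
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    Atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    # Forward elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Atb[col], Atb[piv] = Atb[piv], Atb[col]
        for r in range(col + 1, n):
            f = AtA[r][col] / AtA[col][col]
            for c in range(col, n):
                AtA[r][c] -= f * AtA[col][c]
            Atb[r] -= f * Atb[col]
    # Back substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (Atb[i] - sum(AtA[i][j] * x[j]
                             for j in range(i + 1, n))) / AtA[i][i]
    return x
```

Fitting a line through points lying exactly on y = 1 + 2x recovers the coefficients [1, 2], which is the kind of consistency check one would also apply to each chaos coefficient solve.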

Keywords: least squares, Wick product, SPDEs, finite element, Wiener chaos expansion, gradient method

Procedia PDF Downloads 390
1370 Research on Construction of Subject Knowledge Base Based on Literature Knowledge Extraction

Authors: Yumeng Ma, Fang Wang, Jinxia Huang

Abstract:

Researchers have put forward higher requirements for the efficient acquisition and utilization of domain knowledge in the big data era. As literature is an effective way for researchers to quickly and accurately understand the research situation in their field, knowledge discovery based on literature has become a new research method. As a tool to organize and manage knowledge in a specific domain, a subject knowledge base can be used to mine and present the knowledge behind the literature to meet users' personalized needs. This study designs the construction route of a subject knowledge base for specific research problems, adopting an information extraction method based on knowledge engineering. Firstly, the subject knowledge model is built through the abstraction of the research elements. Then, under the guidance of the knowledge model, extraction rules for knowledge points are compiled to analyze, extract, and correlate entities, relations, and attributes in the literature. Finally, a database platform based on this structured knowledge is developed that can provide a variety of services, such as knowledge retrieval, knowledge browsing, knowledge Q&A, and visualization of correlations. Taking construction practices in the field of activating blood circulation and removing stasis as an example, this study analyzes how to construct a subject knowledge base based on literature knowledge extraction. As the system functional test shows, this subject knowledge base can realize the expected service scenarios, such as quick query of knowledge, related discovery of knowledge and literature, and knowledge organization. As this study enables a subject knowledge base to help researchers locate and acquire deep domain knowledge quickly and accurately, it provides a transformation mode for knowledge resource construction and personalized precision knowledge services in the data-intensive research environment.

Keywords: knowledge model, literature knowledge extraction, precision knowledge services, subject knowledge base

Procedia PDF Downloads 137
1369 "IS Cybernetics": An Idea to Base the International System Theory upon the General System Theory and Cybernetics

Authors: Petra Suchovska

Abstract:

The spirit of post-modernity remains chaotic and obscure. Geopolitical rivalries raging at the more extreme levels and the ability of intellectual community to explain the entropy of global affairs has been diminishing. The Western-led idea of globalisation imposed upon the world does not seem to bring the bright future for human progress anymore, and its architects lose much of global control, as the strong non-western cultural entities develop new forms of post-modern establishments. The overall growing cultural misunderstanding and mistrust are expressions of political impotence to deal with the inner contradictions within the contemporary phenomenon (capitalism, economic globalisation) that embrace global society. The drivers and effects of global restructuring must be understood in the context of systems and principles reflecting on true complexity of society. The purpose of this paper is to set out some ideas about how cybernetics can contribute to understanding international system structure and analyse possible world futures. “IS Cybernetics” would apply to system thinking and cybernetic principles in IR in order to analyse and handle the complexity of social phenomena from global perspective. “IS cybernetics” would be, for now, the subfield of IR, concerned with applying theories and methodologies from cybernetics and system sciences by offering concepts and tools for addressing problems holistically. It would bring order to the complex relations between disciplines that IR touches upon. One of its tasks would be to map, measure, tackle and find the principles of dynamics and structure of social forces that influence human behaviour and consequently cause political, technological and economic structural reordering, forming and reforming the international system. 
The “IS cyberneticist's” task would be to understand the control mechanisms that govern the operation of international society (and its sub-systems in their interconnection) and only then to suggest better ways to operate these mechanisms at sub-levels such as the cultural, political, technological and religious. “IS cybernetics” would also strive to capture the mechanisms of social-structural change over time, which would open space for synthesis between IR and historical sociology. With the cybernetic distinction between first-order studies of observed systems and second-order studies of observing systems, IS cybernetics would also provide a unifying epistemological, methodological and conceptual framework for multilateralism and multiple-modernities theory.

Keywords: cybernetics, historical sociology, international system, systems theory

Procedia PDF Downloads 206
1368 Neologisms and Word-Formation Processes in Board Game Rulebook Corpus: Preliminary Results

Authors: Athanasios Karasimos, Vasiliki Makri

Abstract:

This research focuses on the design and development of the first text corpus based on board game rulebooks (BGRC), with direct application to the morphological analysis of neologisms and tendencies in word-formation processes. Corpus linguistics is a dynamic field that examines language through the lens of vast collections of texts. These corpora consist of diverse written and spoken materials, ranging from literature and newspapers to transcripts of everyday conversations. By morphologically analyzing such extensive datasets, morphologists can gain valuable insights into how language functions and evolves, as these datasets reflect the byproducts of inflection, derivation, blending, clipping, compounding, and neology. This entails scrutinizing how words are created, modified, and combined to convey meaning in a corpus of challenging, creative, and straightforward texts that include rules, examples, tutorials, and tips. Board games teach players how to strategize, consider alternatives, and think flexibly, which are critical elements in language learning. Their rulebooks reflect not only their weight (complexity) but also the language properties of each genre and subgenre of these games. Board games are a captivating realm where strategy, competition, and creativity converge. Beyond the excitement of gameplay, board games also spark the art of word creation. Word games, like Scrabble, Codenames, Bananagrams, Wordcraft, Alice in the Wordland, and Once Upon a Time, challenge players to construct words from a pool of letters, thus encouraging linguistic ingenuity and vocabulary expansion. These games foster a love for language, motivating players to unearth obscure words and devise clever combinations.
Designers and creators, in turn, produce rulebooks that convey their joy of discovering the hidden potential of language, igniting the imagination and playing with the beauty of words, making these games a delightful fusion of linguistic exploration and leisurely amusement. In this research, more than 150 rulebooks in English from all types of modern board games, either language-independent or language-dependent, are used to create the BGRC. A representative sample of each genre (family, party, worker placement, deckbuilding, dice and chance games, strategy, eurogames, thematic, and role-playing, among others) was selected based on the score from BoardGameGeek, the size of the texts, and the level of complexity (weight) of the game. A morphological model with morphological networks, multi-word expressions, and word-creation mechanics based on the complexity of the textual structure, difficulty, and board game category will be presented. By enabling the identification of patterns, trends, and variations in word formation and other morphological processes, this research aspires to exploit this creative yet strict text genre so as to (a) gain invaluable insight into the morphological creativity and innovation that (re)shape the lexicon of the English language and (b) test morphological theories. Overall, it is shown that corpus linguistics empowers us to explore the intricate tapestry of language, and morphology in particular, revealing its richness, flexibility, and adaptability in the ever-evolving landscape of human expression.
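A first-pass filter of the kind such a corpus analysis implies can be sketched as follows: compare rulebook tokens against a reference lexicon and flag unknown words as candidate neologisms for manual morphological inspection. This is a minimal illustration only; the tokenization, the lexicon, and the example words are assumptions, not part of the BGRC pipeline.

```python
def neologism_candidates(tokens, lexicon):
    """Return corpus tokens absent from a reference lexicon, in order of
    first appearance: candidate neologisms for manual analysis."""
    seen = set()
    candidates = []
    for tok in tokens:
        w = tok.lower()
        # keep purely alphabetic words not attested in the lexicon
        if w.isalpha() and w not in lexicon and w not in seen:
            seen.add(w)
            candidates.append(w)
    return candidates
```

In practice the lexicon would be a large attested wordlist, and the surviving candidates would still need manual filtering for proper nouns and game-specific jargon before counting as neologisms.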

Keywords: board game rulebooks, corpus design, morphological innovations, neologisms, word-formation processes

Procedia PDF Downloads 62
1367 Best Resource Recommendation for a Stochastic Process

Authors: Likewin Thomas, M. V. Manoj Kumar, B. Annappa

Abstract:

The aim of this study was to develop an artificial neural network recommendation model for an online process, using the complexity of the load, the performance, and the average servicing time of the resources. The proposed model investigates resource performance using the stochastic gradient descent method for learning a ranking function. A probabilistic cost function is implemented to identify the optimal θ values (load) on each resource. Based on this result, a recommendation is made of the resource best suited to performing the currently executing task. Test results on the CoSeLoG project are presented, with an accuracy of 72.856%.
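The core idea can be sketched as a linear scoring function over resource features (load, performance, average servicing time) whose weights θ are fitted by stochastic gradient descent on pairwise preferences. The logistic loss and the feature encoding below are illustrative assumptions, not the paper's exact probabilistic cost function.

```python
import math
import random

def sgd_rank(pairs, n_features, lr=0.1, epochs=500, seed=0):
    """Fit a linear scoring function f(x) = theta . x by stochastic
    gradient descent on pairwise preferences: each pair (better, worse)
    says the first resource should outrank the second."""
    rng = random.Random(seed)
    theta = [0.0] * n_features
    for _ in range(epochs):
        x_better, x_worse = rng.choice(pairs)
        diff = [b - w for b, w in zip(x_better, x_worse)]
        margin = sum(t * d for t, d in zip(theta, diff))
        # gradient of the logistic loss log(1 + exp(-margin)) w.r.t. theta
        grad = -1.0 / (1.0 + math.exp(margin))
        theta = [t - lr * grad * d for t, d in zip(theta, diff)]
    return theta

def score(theta, features):
    """Higher score = resource recommended for the next task."""
    return sum(t * f for t, f in zip(theta, features))
```

After training, the running task would simply be assigned to the resource with the highest score.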

Keywords: ADALINE, neural network, gradient descent, process mining, resource behaviour, polynomial regression model

Procedia PDF Downloads 360
1366 A Further Insight to Foaming in Anaerobic Digester

Authors: Ifeyinwa Rita Kanu, Thomas Aspray, Adebayo J. Adeloye

Abstract:

As a result of the ambiguity and complexity surrounding anaerobic digester foaming, various researchers have made efforts to understand the process of anaerobic digester foaming so as to proffer a solution that can be applied universally rather than being site-specific. All attempts, ranging from experimental analysis to comparative reviews of other processes, have been unable to explain explicitly the conditions and process of foaming in anaerobic digesters. Studying the available knowledge on foam formation and relating it to anaerobic digester processes and operating conditions, this study presents a succinct and enhanced understanding of foaming in anaerobic digesters, as well as introducing a simple and novel method to identify the onset of anaerobic digester foaming based on the analysis of historical data from a field-scale system.

Keywords: anaerobic digester, foaming, biogas, surfactant, wastewater

Procedia PDF Downloads 422
1365 Risk Allocation in Public-Private Partnership (PPP) Projects for Wastewater Treatment Plants

Authors: Samuel Capintero, Ole H. Petersen

Abstract:

This paper examines the utilization of public-private partnerships (PPPs) for the building and operation of wastewater treatment plants. Our research focuses on risk allocation in this kind of project. Our analysis builds on more than one hundred wastewater treatment plants built and operated through PPP projects in Aragon (Spain). The paper illustrates the consequences of inadequate management of construction risk and an unsuitable transfer of demand risk in wastewater treatment plants. It also shows that the involvement of many public bodies at the local, regional and national levels further increases the complexity of such projects and makes time delays more likely.

Keywords: wastewater, treatment plants, PPP, construction

Procedia PDF Downloads 617
1364 Coaching for Lecturers at German Universities: An Inventory Based on a Qualitative Interview Study

Authors: Freya Willicks

Abstract:

The society of the 21st century is characterized by dynamism and complexity, developments that also shape universities and university life. The Bologna reform, for example, has led to restructuring at many European universities. Today's university teachers therefore have to meet many expectations: their tasks include not only teaching but also the general improvement of the quality of teaching, good research, the management of various projects and the development of their own personal skills. This requires a high degree of flexibility and openness to change, and the resulting pressure can often lead to exhaustion. Coaching can be a way for university teachers to cope with these pressures because it gives them the opportunity to discuss stressful situations with a coach and to self-reflect on them. As a result, more and more universities in Europe offer coaching to their teachers. An analysis of the services provided at universities in Germany, however, quickly reveals considerable disagreement with regard to the understanding of ‘coaching’. A variety of terms is used, such as coaching, counselling or supervision. In addition, each university defines its offer individually, from process-oriented consulting to expert consulting, from group training to individual coaching. The biographical backgrounds of those who coach are also very divergent; both external and internal coaches can be suitable. These findings lead to the following questions: Which structural characteristics of coaching at universities have proven successful? What competencies should a good coach for university lecturers have? In order to answer these questions, a qualitative study was carried out. In a first step, qualitative semi-structured interviews (N = 14) were conducted, on the one hand with coaches for university teachers and on the other hand with university teachers who have been coached. In a second step, the interviews were transcribed and analyzed using Mayring's qualitative content analysis.
The study shows how great the potential of coaching can be for university teachers, who otherwise have little opportunity to talk about their teaching in a private setting. According to the study, the coach should neither be a colleague nor a superior of the coachee but should take an independent perspective, as this is the only way for the coachee to openly reflect on himself/herself. In addition, the coach should be familiar with the university system, i.e., be an academic himself/herself. Otherwise, he/she cannot fully understand the complexity of the teaching situation and the role expectations. However, internal coaches do not necessarily have much coaching experience or explicit coaching competencies. They often come from the university's own didactics department, are experts in didactics, but do not necessarily have a certified coaching education. Therefore, it is important to develop structures and guidelines for internal coaches to support their coaching. In further analysis, such guidelines will be developed on the basis of these interviews.

Keywords: coaching, university coaching, university didactics, qualitative interviews

Procedia PDF Downloads 93
1363 Residual Life Estimation of K-out-of-N Cold Standby System

Authors: Qian Zhao, Shi-Qi Liu, Bo Guo, Zhi-Jun Cheng, Xiao-Yue Wu

Abstract:

Cold standby redundancy is considered an effective mechanism for improving system reliability and is widely used in industrial engineering. However, because of the complexity of the reliability structure, there is little literature studying the residual life of cold standby systems consisting of complex components. In this paper, a simulation method is presented to predict the residual life of a k-out-of-n cold standby system. In practical cases, the failure information of a system is either unknown, partly known or completely known. Our proposed method is designed to deal with each of these three scenarios, and the differences between the procedures are analyzed. Finally, numerical examples are used to validate the proposed simulation method.
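A minimal Monte Carlo sketch of such a simulation is given below for the simplest setting: i.i.d. exponential component lifetimes and perfect switching. Both are simplifying assumptions for illustration, not the paper's general model, which also covers partial and unknown failure information.

```python
import random

def simulate_system_life(n, k, mean_life, rng):
    """One Monte Carlo run of a k-out-of-n cold standby system: k
    components run at a time, n - k cold spares replace failures
    instantly; the system fails when a component fails with no spare
    left. Exponential lifetimes, perfect switching."""
    spares = n - k
    active = [rng.expovariate(1.0 / mean_life) for _ in range(k)]
    t = 0.0
    while True:
        dt = min(active)                    # time to the next failure
        t += dt
        active = [a - dt for a in active]
        i = active.index(min(active))       # the component that just failed
        if spares == 0:
            return t                        # no spare left: system failure
        spares -= 1
        active[i] = rng.expovariate(1.0 / mean_life)

def residual_life(n, k, mean_life, t0, runs=20000, seed=1):
    """Mean residual life E[T - t0 | T > t0] by conditional sampling:
    keep only runs whose system life exceeds t0."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    while count < runs:
        life = simulate_system_life(n, k, mean_life, rng)
        if life > t0:
            total += life - t0
            count += 1
    return total / count
```

As a sanity check, a 1-out-of-2 cold standby system with exponential components has mean life equal to twice the component mean, since its lifetime is the sum of two component lifetimes.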

Keywords: cold standby system, k-out-of-n, residual life, simulation sampling

Procedia PDF Downloads 375
1362 Spectral Efficiency Improvement in 5G Systems by Polyphase Decomposition

Authors: Wilson Enríquez, Daniel Cardenas

Abstract:

This article proposes a filter bank format combined with the mathematical tool known as polyphase decomposition and the discrete Fourier transform (DFT), with the purpose of improving the performance of fifth-generation (5G) communication systems. We begin with a review of the literature and a study of filter bank theory and its combination with the DFT, which improves the performance of wireless communications by reducing the computational complexity of these communication systems. With the proposed technique, several experiments were carried out in order to evaluate the structures in 5G systems. Finally, the results are presented in graphical form as bit error rate against the ratio of bit energy to noise power spectral density (BER vs. Eb/N0).
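The computational saving behind polyphase decomposition can be illustrated with a decimating FIR filter: splitting the prototype filter h into its M polyphase components lets every multiplication run at 1/M of the input rate while producing exactly the same samples as filter-then-downsample. This is a generic sketch of the tool, not the authors' 5G filter bank structure.

```python
import numpy as np

def direct_decimate(x, h, M):
    """Reference: filter at the full rate, then keep every M-th sample."""
    return np.convolve(x, h)[::M]

def polyphase_decimate(x, h, M):
    """Same output via the M polyphase components h_k[n] = h[nM + k],
    each convolved with a delayed, downsampled copy of the input, so
    all arithmetic happens at the low rate."""
    n_out = -(-(len(x) + len(h) - 1) // M)           # ceil division
    y = np.zeros(n_out)
    for k in range(M):
        hk = h[k::M]                                  # k-th polyphase component
        xk = np.concatenate([np.zeros(k), x])[::M]    # x delayed by k, every M-th sample
        yk = np.convolve(xk, hk)
        y[: len(yk)] += yk[:n_out]
    return y
```

In a DFT filter bank, the same polyphase branches are shared across all channels, with an M-point DFT (FFT) combining the branch outputs, which is where the complexity reduction the abstract mentions comes from.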

Keywords: multi-carrier system (5G), filter bank, polyphase decomposition, FIR equalizer

Procedia PDF Downloads 173
1361 Optimisation of the Input Layer Structure for Feedforward NARX Neural Networks

Authors: Zongyan Li, Matt Best

Abstract:

This paper presents an optimization method for reducing the number of input channels and the complexity of a feed-forward NARX neural network (NN) without compromising the accuracy of the NN model. Using the correlation analysis method, the most significant regressors are selected to form the input layer of the NN structure. An application to vehicle dynamic model identification is also presented to demonstrate the optimization technique, and the optimal input layer structure and the optimal number of neurons for the neural network are investigated.
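The regressor-selection step can be sketched as follows: build the candidate lagged-input and lagged-output regressors of a NARX model, rank them by absolute Pearson correlation with the target, and keep only the most significant ones for the NN input layer. The lag structure and the toy system in the example are illustrative assumptions, not the paper's vehicle model.

```python
import numpy as np

def build_regressors(u, y, n_lags):
    """Candidate NARX regressors u[t-1..t-n_lags] and y[t-1..t-n_lags],
    each column aligned with the target y[t]."""
    T = len(y)
    cols, names = [], []
    for lag in range(1, n_lags + 1):
        cols.append(u[n_lags - lag: T - lag]); names.append(f"u[t-{lag}]")
        cols.append(y[n_lags - lag: T - lag]); names.append(f"y[t-{lag}]")
    return np.column_stack(cols), names, y[n_lags:]

def select_by_correlation(X, target, n_keep):
    """Indices of the n_keep regressors with the largest |Pearson r|
    against the target, most significant first."""
    r = [abs(np.corrcoef(X[:, j], target)[0, 1]) for j in range(X.shape[1])]
    return list(np.argsort(r)[::-1][:n_keep])
```

Only the selected columns would then feed the NARX network's input layer, shrinking it without discarding the informative regressors.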

Keywords: correlation analysis, F-ratio, Levenberg-Marquardt, MSE, NARX, neural network, optimisation

Procedia PDF Downloads 346