Search results for: reflective approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13659

9639 Dual Band Shared Aperture Antenna for 5G Communications

Authors: Zunnurain Ahmad

Abstract:

This work presents the design of a dual-band antenna for 5G communications in the millimeter wave band. As opposed to conventional patch antennas, which are limited to single narrow-band operation, a shared aperture concept is utilized for this antenna. The patch aperture is coupled through two rectangular slots etched on a thin (100 μm) printed circuit board. The patch is elevated in air, thus avoiding the excitation of surface waves and minimizing dielectric losses at millimeter wave frequencies. With this approach, the radiator can cover the lower band of 28 GHz and the upper band of 37/39 GHz dedicated to fifth-generation communications. The simulated radiation efficiency of the antenna stays above 90%.

Keywords: antenna, millimeter wave, 5G, 3D

Procedia PDF Downloads 37
9638 Increasing Access to Upper Limb Reconstruction in Cervical Spinal Cord Injury

Authors: Michelle Jennett, Jana Dengler, Maytal Perlman

Abstract:

Background: Cervical spinal cord injury (SCI) is a devastating event that results in upper limb paralysis, loss of independence, and disability. People living with cervical SCI have identified improvement of upper limb function as a top priority. Nerve and tendon transfer surgery has successfully restored upper limb function in cervical SCI but is not universally used or available to all eligible individuals. This exploratory mixed-methods study used an implementation science approach to better understand the factors that influence access to upper limb reconstruction in the Canadian context and to design an intervention to increase access to care. Methods: Data from the Canadian Institute for Health Information’s Discharge Abstracts Database (CIHI-DAD) and the National Ambulatory Care Reporting System (NACRS) were used to determine the annual rate of nerve transfer and tendon transfer surgeries performed in cervical SCI in Canada over the last 15 years. Semi-structured interviews informed by the Consolidated Framework for Implementation Research (CFIR) were used to explore Ontario healthcare provider knowledge and practices around upper limb reconstruction. An inductive, iterative constant comparative process involving descriptive and interpretive analyses was used to identify themes that emerged from the data. Results: Healthcare providers (n = 10 upper extremity surgeons, n = 10 SCI physiatrists, n = 12 physical and occupational therapists working with individuals with SCI) were interviewed about their knowledge and perceptions of upper limb reconstruction and their current practices and discussions around it. Data analysis is currently underway and will be presented. Regional variation in rates of upper limb reconstruction and trends over time are also currently being analyzed. Conclusions: Utilization of nerve and tendon transfer surgery to restore upper limb function in Canada remains low. A complex array of interrelated individual-, provider-, and system-level barriers prevents individuals with cervical SCI from accessing upper limb reconstruction. In order to offer equitable access to care, a multi-modal approach addressing current barriers is required.

Keywords: cervical spinal cord injury, nerve and tendon transfer surgery, spinal cord injury, upper extremity reconstruction

Procedia PDF Downloads 85
9637 The Use of Random Set Method in Reliability Analysis of Deep Excavations

Authors: Arefeh Arabaninezhad, Ali Fakher

Abstract:

Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods have been suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables which depend on ground behavior are required. The random set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the random set method, smooth extremes of the system responses are obtained with a relatively small number of simulations compared to fully probabilistic methods; therefore, the random set approach has been proposed for reliability analysis in geotechnical problems. In the present study, the application of the random set method to the reliability analysis of deep excavations is investigated through three deep excavation projects which were monitored during excavation. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The probability share of each finite element calculation is determined from the probabilities assigned to the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered as the main response of the system. The result of the reliability analysis for each deep excavation is presented by constructing the Belief and Plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations. To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models was compared to the in situ measurements, and good agreement is observed. The comparison also showed that the Random Set Finite Element Method is suitable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
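
As a rough illustration of the random set propagation described above, the following Python sketch combines two focal elements (intervals) per input variable, evaluates a stand-in model at the vertices of each interval box, and accumulates Belief and Plausibility bounds on the response exceeding a threshold. The model function, intervals, probability assignments, and threshold are all assumed for illustration; in the study, a finite element code plays the role of the model.

```python
import itertools
import numpy as np

# Hypothetical stand-in for the finite element solver: returns the horizontal
# displacement of the top of the excavation for a given set of soil parameters.
def model(E, phi):  # Young's modulus [MPa], friction angle [deg]; toy response
    return 50.0 / E + 0.8 / np.tan(np.radians(phi))

# Two focal elements (intervals) per input variable, each with a probability
# assignment m(.) that sums to 1 per variable (values are assumed, not measured).
focal = {
    "E":   [((40.0, 60.0), 0.6), ((30.0, 80.0), 0.4)],
    "phi": [((28.0, 34.0), 0.7), ((25.0, 38.0), 0.3)],
}

# Cartesian product of focal elements; the joint mass is the product of masses.
results = []
for (iE, mE), (iphi, mphi) in itertools.product(focal["E"], focal["phi"]):
    # Evaluate the model at every vertex of the interval box; for a monotonic
    # response the extremes occur at the vertices.
    corners = [model(E, phi) for E in iE for phi in iphi]
    results.append((min(corners), max(corners), mE * mphi))

# Belief / plausibility of exceeding a threshold displacement u*
u_star = 1.9
belief = sum(m for lo, hi, m in results if lo > u_star)        # surely exceeds
plausibility = sum(m for lo, hi, m in results if hi > u_star)  # possibly exceeds
print(f"Bel(u > {u_star}) = {belief:.2f},  Pl(u > {u_star}) = {plausibility:.2f}")
```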

Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty

Procedia PDF Downloads 254
9636 Sustainable and Responsible Mining - Lundin Mining’s Subsidiary in Portugal, Sociedade Mineira de Neves-Corvo Case

Authors: Jose Daniel Braga Alves, Joaquim Gois, Alexandre Leite

Abstract:

This abstract presents the responsible and sustainable mining case study of a Portuguese mine operation, highlighting how mine exploitation can sustainably exist in balance with the environment, aligned with all stakeholders. The mining operation is remotely located in a United Nations (UN) biodiversity reserve, away from major industrial centers or logistical ports, and offers an interesting opportunity to assess a balanced mine operation in alignment with all key stakeholders, which presents unique opportunities as well as challenges. Based on the sustainable mining framework, it is intended to detail examples of best practices from Sociedade Mineira de Neves-Corvo (SOMINCOR), demonstrating social acceptance by the local community, health and safety at work, reduction of environmental impacts, and management of mining waste, which directly influence the acceptance and recognition of a sustainable operation. The case study aims to present the SOMINCOR approach to sustainable mining, focusing on social responsibility, considering materials provided by Lundin Mining Corporation (LMC) and SOMINCOR and the socially responsible approach of the mining operations, referencing related international guidelines such as the UN Sustainable Development Goals. The researchers reviewed LMC's annual Sustainability Reports (2019, 2020, and 2021) and updated information regarding the material topics of most significant interest to internal and external stakeholders. These material topics formed the basis of the corporation-wide sustainability strategy. LMC's Responsible Mining Policy (RMP) was reviewed, focusing on the commitment that guides the approach to responsible operation and management of the Company's business. Social performance, compliance, environmental management, governance, human rights, and economic contribution are principles of the RMP. The Human Rights Risk Impact Assessment (HRRIA), based on frameworks including the UN Guiding Principles (UNGP) and the Voluntary Principles on Security and Human Rights, and an implemented community engagement program (SLO index) were part of this research. The program consists of ongoing surveys and perception studies using behavioural science insights, data from which were not available within the timeframe of this research. LMC stakeholder engagement standards and grievance mechanisms were also reviewed. Stakeholder engagement and the community's perception are key to this operation to ensure the social license to operate (SLO). Preliminary surveys with local communities provided input data for the local development strategy. After the implementation of several initiatives, subsequent surveys were performed to assess acceptance and trust from the local communities and changes to the SLO index. SOMINCOR's operation contributes to 12 out of 17 Sustainable Development Goals. From the assessed and available data, local communities and social engagement are priorities to SOMINCOR. Experience to date shows that the continual engagement with local communities and the grievance mechanisms in place are respected and followed for all concerns presented by any stakeholder. It can be concluded that this underground mine in Portugal complies with applicable regulations and goes beyond them with regard to sustainable development and engagement with key stakeholders.

Keywords: sustainable mining, development goals, Portuguese mining, zinc copper

Procedia PDF Downloads 65
9635 Forced-Choice Measurement Models of Behavioural, Social, and Emotional Skills: Theory, Research, and Development

Authors: Richard Roberts, Anna Kravtcova

Abstract:

Introduction: The realisation that personality can change over the course of a lifetime has led to a new companion model to the Big Five: the behavioural, emotional, and social skills approach (BESSA). BESSA hypothesizes that this set of skills represents how the individual thinks, feels, and behaves when the situation calls for it, as opposed to traits, which represent how someone tends to think, feel, and behave averaged across situations. The five major skill domains share parallels with the Big Five Factor (BFF) model: creativity and innovation (openness), self-management (conscientiousness), social engagement (extraversion), cooperation (agreeableness), and emotional resilience (emotional stability) skills. We point to noteworthy limitations in the current operationalisation of BESSA skills (i.e., via Likert-type items) and offer a different measurement approach: forced choice. Method: In this forced-choice paradigm, individuals were given three skill items (e.g., managing my time) and asked to select the one they believed they were “best at” and the one they were “worst at”. Thurstonian IRT models allow these responses to be placed on a normative scale. Two multivariate studies (N = 1178) were conducted with a 22-item forced-choice version of the BESSA, a published measure of the BFF, and various criteria. Findings: Confirmatory factor analysis of the forced-choice assessment showed acceptable model fit (RMSEA < 0.06), while reliability estimates were reasonable (around 0.70 for each construct). Convergent validity evidence was as predicted (correlations between 0.40 and 0.60 for corresponding BFF and BESSA constructs). Notable was the extent to which the forced-choice BESSA assessment improved upon test-criterion relationships over and above the BFF. For example, typical regression models find BFF personality accounting for 25% of the variance in life satisfaction scores; both studies showed incremental gains over the BFF exceeding 6% (i.e., BFF and BESSA together accounted for over 31% of the variance in both studies). Discussion: Forced-choice measurement models offer the promise of creating equated test forms that may unequivocally measure skill gains and are less prone to fakability and reference bias effects. Implications for practitioners are discussed, especially those interested in selection, succession planning, and training and development. We also discuss how the forced-choice method can be applied to other constructs like emotional immunity, cross-cultural competence, and self-estimates of cognitive ability.
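
To make the incremental-validity comparison concrete, the following sketch fits two ordinary least-squares models on synthetic data, one with trait predictors alone and one adding skill predictors, and reports the gain in R². All scores and effect sizes are simulated placeholders, not the studies' data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic scores: five BFF traits, five BESSA skills, and a life-satisfaction
# criterion built so the skills add variance beyond the traits (assumed data).
bff = rng.normal(size=(n, 5))
bessa = 0.5 * bff + rng.normal(size=(n, 5))          # skills correlate with traits
criterion = bff @ np.full(5, 0.3) + bessa @ np.full(5, 0.2) + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_bff = r_squared(bff, criterion)
r2_both = r_squared(np.hstack([bff, bessa]), criterion)
print(f"R2 (BFF only)    = {r2_bff:.3f}")
print(f"R2 (BFF + BESSA) = {r2_both:.3f}")
print(f"Incremental gain = {r2_both - r2_bff:.3f}")
```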

Keywords: Big Five, forced-choice method, BFF, methods of measurements

Procedia PDF Downloads 80
9634 Estimating the Ladder Angle and the Camera Position From a 2D Photograph Based on Applications of Projective Geometry and Matrix Analysis

Authors: Inigo Beckett

Abstract:

In forensic investigations, it is often the case that the most potentially useful recorded evidence derives from coincidental imagery, recorded immediately before or during an incident, and that during the incident (e.g., a ‘failure’ or fire event) the evidence is changed or destroyed. To an image analysis expert involved in photogrammetric analysis for civil or criminal proceedings, traditional computer vision methods involving calibrated cameras are often not appropriate because image metadata cannot be relied upon. This paper presents an approach for resolving this problem, considering in particular, by way of a case study, the angle of a simple ladder shown in a photograph. The UK Health and Safety Executive (HSE) guidance document published in 2014 (INDG455) advises that a leaning ladder should be erected at 75 degrees to the horizontal axis. Personal injury cases can arise in the construction industry because a ladder is too steep or too shallow. Ad-hoc photographs of such ladders in their incident position provide a basis for analysis of their angle. This paper presents a direct approach for ascertaining the position of the camera and the angle of the ladder simultaneously from the photograph(s) by way of a workflow that encompasses a novel application of projective geometry and matrix analysis. Mathematical analysis shows that, for a given pixel ratio of directly measured collinear points (i.e., features that lie on the same line segment) from the 2D digital photograph with respect to a given viewing point, we can constrain the 3D camera position to the surface of a sphere in the scene. Depending on what we know about the ladder, we can enforce another independent constraint on the possible camera positions, which enables us to narrow the possible positions even further. Experiments were conducted using synthetic and real-world data. The synthetic data modeled a vertical plane with a ladder on a horizontally flat plane resting against a vertical wall. The real-world data were captured using an Apple iPhone 13 Pro and 3D laser scan survey data, whereby a ladder was placed at a known location and angle to the vertical axis. For each case, we calculated the camera positions and the ladder angles using this method and cross-compared them against their respective ‘true’ values.

Keywords: image analysis, projective geometry, homography, photogrammetry, ladders, forensics, mathematical modeling, planar geometry, matrix analysis, collinearity, cameras, photographs

Procedia PDF Downloads 32
9633 Application of Lattice Boltzmann Method to Different Boundary Conditions in a Two Dimensional Enclosure

Authors: Jean Yves Trepanier, Sami Ammar, Sagnik Banik

Abstract:

The Lattice Boltzmann Method is advantageous for simulating complex boundary conditions and solving for fluid flow parameters through streaming and collision processes. This paper includes the study of three different test cases in a confined domain using the Lattice Boltzmann model. 1. An SRT (Single Relaxation Time) approach in the Lattice Boltzmann model is used to simulate lid-driven cavity flow for different Reynolds numbers (100, 400, and 1000) with a domain aspect ratio of 1, i.e., a square cavity. A moment-based boundary condition is used for more accurate results. 2. A thermal lattice BGK (Bhatnagar-Gross-Krook) model is developed for Rayleigh-Bénard convection for two test cases, horizontal and vertical temperature difference, considered separately for a Boussinesq incompressible fluid. The Rayleigh number is varied for both test cases (10³ ≤ Ra ≤ 10⁶), keeping the Prandtl number at 0.71. A stability criterion with a precise forcing scheme is used for a greater level of accuracy. 3. The phase change problem governed by the heat-conduction equation is studied using the enthalpy-based Lattice Boltzmann model with a single iteration for each time step, thus reducing the computational time. A double distribution function approach with a D2Q9 (density) model and a D2Q5 (temperature) model is used for two test cases: conduction-dominated melting and convection-dominated melting. The solidification process is also simulated using the enthalpy-based method with a single distribution function using the D2Q5 model to provide a better understanding of the heat transport phenomenon. The domain for the test cases has an aspect ratio of 2, with some exceptions for a square cavity. An approximate velocity scale is chosen to ensure that the simulations are within the incompressible regime. Different parameters like velocities, temperature, Nusselt number, etc. are calculated for a comparative study with the existing works of literature. The simulated results demonstrate excellent agreement with the existing benchmark solutions within an error limit of ±0.05, which indicates the viability of this method for complex fluid flow problems.
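
The stream-and-collide skeleton shared by the test cases above can be sketched compactly. The following minimal NumPy implementation of a D2Q9 SRT (BGK) scheme runs a periodic, Taylor-Green-style decaying vortex, so no wall treatment is needed, and compares the measured velocity decay against the analytic viscous rate. The moment-based boundaries, thermal coupling, and enthalpy formulation described in the abstract would be layered on top of this skeleton; grid size, relaxation time, and initial amplitude are illustrative choices.

```python
import numpy as np

# D2Q9 lattice constants: discrete velocities and weights
c = np.array([(0,0),(1,0),(0,1),(-1,0),(0,-1),(1,1),(-1,1),(-1,-1),(1,-1)])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

N = 64                       # periodic N x N grid
tau = 0.8                    # SRT relaxation time
nu = (tau - 0.5) / 3.0       # lattice kinematic viscosity
U0, k = 0.05, 2*np.pi/N      # vortex amplitude and wavenumber

def feq(rho, ux, uy):
    """Second-order equilibrium distribution for D2Q9."""
    cu = 3.0*(c[:,0,None,None]*ux + c[:,1,None,None]*uy)
    return w[:,None,None]*rho*(1 + cu + 0.5*cu**2 - 1.5*(ux**2 + uy**2))

x, y = np.meshgrid(np.arange(N), np.arange(N))   # x along columns, y along rows
ux = -U0*np.cos(k*x)*np.sin(k*y)                 # Taylor-Green initial velocity
uy =  U0*np.sin(k*x)*np.cos(k*y)
rho = np.ones((N, N))
f = feq(rho, ux, uy)

for step in range(1, 2001):
    rho = f.sum(0)                               # macroscopic moments
    ux = (f*c[:,0,None,None]).sum(0)/rho
    uy = (f*c[:,1,None,None]).sum(0)/rho
    f += -(f - feq(rho, ux, uy))/tau             # BGK (SRT) collision
    for i in range(9):                           # streaming (periodic shifts)
        f[i] = np.roll(f[i], (c[i,1], c[i,0]), axis=(0,1))
    if step % 500 == 0:
        measured = np.abs(ux).max()/U0
        analytic = np.exp(-2*nu*k**2*step)       # viscous decay of the vortex
        print(step, round(measured, 3), round(analytic, 3))
```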

Keywords: BGK, Nusselt, Prandtl, Rayleigh, SRT

Procedia PDF Downloads 116
9632 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts; thus, complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require the individualized production of complex shaped workpieces. Due to variations between the nominal model and the actual geometry, this can lead to changes of operations in computer-aided process planning (CAPP) to make CAPP manageable for an adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by providing objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, such decisions depend on the experience-based knowledge of humans (e.g., process planners) and result in subjective decisions, leading to variability in workpiece quality and potential failures in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes. Especially in fields like the aerospace industry, standardization and certification of processes are an important aspect. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, are used for the modeling and execution of inspection workflows. Each analysis step of the inspection, such as the positioning of measurement data or the checking of geometrical criteria, is carried out by function blocks; a minimal sketch of this structure is given below. One advantage of this approach is its flexibility to design workflows and to adapt algorithms specific to the application domain. In general, it is checked whether a geometrical adaption is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g., design data. Furthermore, for different product lifecycle phases, appropriate logics and decision criteria have to be considered. For example, tolerances for geometric deviations differ in type and size between new-part production and repair processes. In addition to function blocks, appropriate referencing systems are important. They need to support the exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, comprising new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
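
As a rough illustration of the event-driven function-block idea, the Python sketch below chains two hypothetical inspection blocks (measurement positioning and a geometry check) and records an inspection protocol. Block names, data fields, and the tolerance logic are invented for illustration; they are not the authors' implementation or an industrial function-block standard.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class FunctionBlock:
    """A named step that consumes and produces entries of a shared context."""
    name: str
    run: Callable[[Dict[str, Any]], Dict[str, Any]]

def execute(workflow: List[FunctionBlock], context: Dict[str, Any]) -> Dict[str, Any]:
    protocol = []
    for block in workflow:                 # event-driven in spirit: each block
        context.update(block.run(context)) # fires when the previous one finishes
        protocol.append((block.name, dict(context)))
    context["protocol"] = protocol         # inspection protocol per workpiece
    return context

# Hypothetical inspection steps for a turbine-blade preform
def register_measurement(ctx):
    # align measured points to the design datum (stub: assumed already aligned)
    return {"aligned": True}

def check_tolerance(ctx):
    deviation = abs(ctx["measured_chord"] - ctx["nominal_chord"])
    return {"adaptable": deviation <= ctx["tolerance"]}

workflow = [FunctionBlock("positioning", register_measurement),
            FunctionBlock("geometry check", check_tolerance)]
result = execute(workflow, {"measured_chord": 99.7, "nominal_chord": 100.0,
                            "tolerance": 0.5})
print("adaptive machining possible:", result["adaptable"])
```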

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 286
9631 A Risk Management Approach for Nigeria Manufacturing Industries

Authors: Olaniyi O. Omoyajowo

Abstract:

To be successful in today’s competitive global environment, manufacturing industries must be able to respond quickly to changes in technology. These changes introduce new risks and hazards. The management of risks and hazards in a manufacturing process provides methods through which the success rate of an organization can be increased. Thus, there is a continual need for manufacturing industries to invest a significant amount of resources in risk management, which in turn optimizes the production output and profitability of any manufacturing industry (if implemented properly). To help improve the existing risk prevention and mitigation practices in Small and Medium Enterprises (SMEs) in the Nigerian Manufacturing Industries (NMI), this research develops a systematic risk management process.

Keywords: manufacturing management, risk, risk management, SMEs

Procedia PDF Downloads 380
9630 Between the ‘Principle of Hope’ and ‘Spiritual Booze’: An Analysis of Religious Themes in the Language Used by the Russian Marxists

Authors: George Bocean

Abstract:

In mainstream academic spheres of thought, there is a tendency to characterize the writings of Russian Marxists as constantly opposed to the practice of religion itself. Such arguments mainly stem from the assumption that the attitude of the Russian Marxists, specifically the Bolsheviks, towards the concept of religion originates from Marxist ideology itself. Although Marxism is critical of religion as an institution, the approach that Marxism would take on the question of religion is not as clear. This aspect is specifically observed in the language of major Russian Marxist figures, such as Lenin and Trotsky, throughout the early 20th century, where religious metaphors were widely used in their philosophical writings and speeches, as well as in the propaganda posters of left-wing movements in Russia as a whole. The methodology of the research consists of a sociolinguistic and sociology-of-language approach within a sociohistorical framework of late Tsarist and early Soviet Russia, 1905-1926. The purpose of such approaches is not simply to point out the religious metaphors used in the writings and speeches of Marxists in Russia, but rather to analyse how the use of such metaphors represents an important socio-political connection with the context of Russia at the time. In other words, the use of religious metaphors was not only more akin to Russian culture at the time, but also resonated with and was more familiar to the working class and peasantry. An example in this study can be observed in the writings of Lenin, where the theme of chudo (miracle) is often mentioned, and such a word is commonly associated with an idealist philosophy rather than a materialist one, representing a common theme in Russian culture in regard to the principle of hope for a better life. A further and even more obvious example is Trotsky’s writing about how the revolution of 1905 “would be revived”, which not only resonates with the theme of resurrection but also prophesises the “second coming” of a future revolution. Such metaphors are important in the writings of these authors, as they simultaneously contain Marxist ideas and religious themes. In doing this research, this paper will demonstrate two aspects. Firstly, it will analyse the use of these metaphors by Russian Marxists as a whole from the socio-political and ideological perspectives of Marxism. Secondly, it will demonstrate the role that such metaphors had in regard to their impact on left-wing movements within Russia itself, as well as their relation to the working class and peasantry of Russia within the historical context.

Keywords: language and politics, Marxism, Russian history, social history, sociology of language

Procedia PDF Downloads 124
9629 Eating Constitutes Human Dignity: A Metaphysical Anthropology Perspective

Authors: Sri Poedjiastoeti

Abstract:

One of the traits of living beings is eating. As living beings, people must sustain their lives by taking in material. They must assimilate substances for themselves. They grow and develop by changing what they eat and digest into their own substance. This happens in what is called eating. This article aims to analyze the distinction between human beings and other infrahuman beings when facing and eating food. It uses analytical description with a metaphysical anthropology approach. As a result, it shows that eating is not simply putting food in the mouth, chewing, and swallowing it. Eating constitutes a sacred ceremony if it is done in accordance with human dignity. Humans face food with distance and moderation, and civilize their behaviour around it. Accordingly, they come to be human.

Keywords: human beings, behaviour, eating, dignity

Procedia PDF Downloads 258
9628 Enhancing Inhibition on Phytopathogens by Complex Using Biogas Slurry

Authors: Fang-Bo Yu, Li-Bo Guan, Sheng-Dao Shan

Abstract:

Biogas slurry was mixed with six commercial fungicides, and screening against 11 phytopathogens was carried out. Results showed that the inhibitory effect of biogas slurry differed across the test strains: no significant difference between treatments was observed for Didymella bryoniae, Fusarium oxysporum f. sp. vasinfectum, Aspergillus niger, Rhizoctonia cerealis, F. graminearum, and Septoria tritici. However, significant differences were found for Penicillium sp., Botrytis cinerea, Alternaria sonali, F. oxysporum f. sp. melonis, and Sclerotinia sclerotiorum. The approach described here presents a promising alternative to current practice, although some issues still need further examination. This study could contribute to the development of sustainable agriculture and better utilization of biogas slurry.

Keywords: anaerobic digestion, biogas slurry, phytopathogen, sustainable agriculture

Procedia PDF Downloads 312
9627 Your First Step to Understanding Research Ethics: Psychoneurolinguistic Approach

Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari

Abstract:

Objective: This research aims at investigating research ethics in the field of science. Method: This is an exploratory study wherein the researchers attempted to cover the phenomenon at hand from the viewpoints of all specialists. Results: The discussion is based upon the findings of the analysis the researchers undertook. Concerning the prediction of results, the researcher first needs to seek highly qualified people in the field of research, as well as in the field of statistics, who share the philosophy of the research. Then s/he should make sure that s/he is adequately trained in the specific techniques, methods, and statistical programs that are used in the study. S/he should also commit to continual analysis of the data with the most current methods.

Keywords: research ethics, legal, rights, psychoneurolinguistics

Procedia PDF Downloads 21
9626 The Applicability of General Catholic Canon Law during the Ongoing Migration Crisis in Hungary

Authors: Lorand Ujhazi

Abstract:

The vast majority of existing canonical studies about migration focus on examining the general pastoral and legal regulations of the Catholic Church. The weakness of this approach is that it ignores a number of important factors, like the financial, legal, and personal circumstances of a particular church or the canonical position of certain organizations which actually look after the immigrants. This paper is a case study which analyses the current and historical migration-related policies and activities of the Catholic Church in Hungary. To achieve this goal, the study uses canon law, historical publications, various instructions and communications issued by church superiors, Hungarian and foreign media reports, and the relevant Hungarian legislation. The paper first examines how the Hungarian Catholic Church assisted migrants like Armenians fleeing from the Ottoman Empire, Poles escaping during the Second World War, East German and Romanian citizens in the 1980s, and refugees from the former Yugoslavia in the 1990s. These events underline the importance of past historical experience in the development of the contemporary pastoral and humanitarian policy of the Catholic Church in Hungary. The paper then turns to the events of the ongoing crisis by describing the unique challenges faced by churches in transit countries like Hungary, and contrasts these findings with the typical responsibilities of churches in countries which are popular destinations for immigrants. The next part of the case study focuses on the changes to the pre-crisis legal and canonical framework which influenced the actions of hierarchical and charity organizations in Hungary. Afterwards, the paper illustrates the dangers of operating in an unclear legal environment, where some charitable activities of the church, like a fundraising campaign, may be interpreted as a national security risk by state authorities. The paper then presents the reactions of Hungarian academics to the current migration crisis and finally offers some proposals on how to improve the parts of canon law which govern immigration. The conclusion of the paper is that, during the formulation of the central refugee policy of the Catholic Church, decision makers must take into consideration the peculiar circumstances of its particular churches. This approach may prevent disharmony between the existing central regulations, the policy of the Vatican, and the operations of the local church organizations.

Keywords: canon law, Catholic Church, civil law, Hungary, immigration, national security

Procedia PDF Downloads 294
9625 Multi-Channel Information Fusion in C-OTDR Monitoring Systems: Various Approaches to Classify of Targeted Events

Authors: Andrey V. Timofeev

Abstract:

The paper presents new results concerning the selection of an optimal information fusion formula for ensembles of C-OTDR channels. The goal of information fusion is to create an integral classifier designed for the effective classification of seismoacoustic target events. The LPBoost (LP-β and LP-B variants), Multiple Kernel Learning, and Weighting Inversely as Lipschitz Constants (WILC) approaches were compared. WILC is a brand-new approach to the optimal fusion of Lipschitz classifier ensembles. Results of practical usage are presented.
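
A minimal sketch of the WILC idea, under the assumption that each channel classifier is weighted inversely to its Lipschitz constant, might look as follows; the scores and constants below are toy values, not measured C-OTDR data.

```python
import numpy as np

def wilc_fuse(scores: np.ndarray, lipschitz: np.ndarray) -> float:
    """Fuse per-channel decision scores in [-1, 1] with weights 1/L_i."""
    weights = 1.0 / lipschitz
    weights /= weights.sum()   # normalize so the fused score stays in [-1, 1]
    return float(weights @ scores)

scores = np.array([0.8, 0.4, -0.2])  # three C-OTDR channels voting on one event
L = np.array([1.0, 2.5, 10.0])       # smoother (smaller-L) classifiers get more weight
fused = wilc_fuse(scores, L)
print("fused score:", fused, "-> class:", "target" if fused > 0 else "background")
```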

Keywords: Lipschitz classifier, classifier ensembles, LPBoost, C-OTDR systems

Procedia PDF Downloads 444
9624 An Interactive User-Oriented Approach to Optimizing Public Space Lighting

Authors: Tamar Trop, Boris Portnov

Abstract:

Public Space Lighting (PSL) of outdoor urban areas promotes comfort, defines spaces and neighborhood identities, enhances perceived safety and security, and contributes to residential satisfaction and wellbeing. However, if excessive or misdirected, PSL leads to unnecessary energy waste and increased greenhouse gas emissions, poses a non-negligible threat to the nocturnal environment, and may become a potential health hazard. At present, PSL is designed according to international, regional, and national standards, which consolidate best practice. Yet knowledge regarding the optimal light characteristics needed for creating a perception of personal comfort and safety in densely populated residential areas, and the factors associated with this perception, is still scarce. The presented study suggests a paradigm shift in designing PSL towards a user-centered approach, which incorporates pedestrians' perspectives into the process. The study is an ongoing joint research project between the China and Israel Ministries of Science and Technology. Its main objectives are to reveal inhabitants' perceptions of and preferences for PSL in different densely populated neighborhoods in China and Israel, and to develop a model that links instrumentally measured parameters of PSL (e.g., intensity, spectra, and glare) with its perceived comfort and quality, while controlling for three groups of attributes: locational, temporal, and individual. To investigate measured and perceived PSL, the study employed various research methods and data collection tools, developed a location-based mobile application, and used multiple data sources, such as satellite multi-spectral night-time light imagery, census statistics, and detailed planning schemes. One of the study’s preliminary findings is that a higher sense of safety in the investigated neighborhoods is not associated with higher levels of light intensity. This implies a potential for energy saving in brightly illuminated residential areas. The study findings might contribute to the design of a smart and adaptive PSL strategy that enhances pedestrians’ perceived safety and comfort while reducing light pollution and energy consumption.

Keywords: energy efficiency, light pollution, public space lighting, PSL, safety perceptions

Procedia PDF Downloads 117
9623 The Strategic Importance of Technology in the International Production: Beyond the Global Value Chains Approach

Authors: Marcelo Pereira Introini

Abstract:

The global value chains (GVC) approach contributes to a better understanding of the organization of international production amid globalization’s second unbundling from the 1970s on. Mainly due to the tools that help to understand the importance of critical competences, technological capabilities, and the functions performed by each player, GVC research has flourished in recent years, rooted in discussions of the possibilities of integration and repositioning along regional and global value chains. In this context, part of the literature endorsed a more optimistic view that engaging in fragmented production networks could represent learning opportunities for developing countries’ firms, since the relationship with transnational corporations could allow them to build skills and competences. Increasing recognition that GVCs are based on asymmetric power relations, however, provided another view of the benefits, costs, and development possibilities. Since leading companies tend to restrict the replication of their technologies and capabilities by their suppliers, alternative strategies beyond functional specialization, seen as a way to integrate value chains, began to be broadly highlighted. This paper organizes a coherent narrative about the shortcomings of the GVC analytical framework while recognizing its multidimensional contributions and recent developments. We adopt two different and complementary perspectives to explore the idea of integration in international production. On one hand, we emphasize obstacles beyond production components, analyzing the role played by intangible assets and intellectual property regimes. On the other hand, we consider the importance of domestic production and innovation systems for technological development. In order to provide a deeper understanding of the restrictions on the technological learning of developing countries’ firms, we first build on the notion of intellectual monopoly to analyze how flagship companies can prevent subordinated firms from improving their positions in fragmented production networks. Based on intellectual property protection regimes, we discuss the increasing asymmetries between these players and the decreasing access of part of them to strategic intangible assets. Second, we discuss the role of productive-technological ecosystems and of interactive and systemic technological development processes, as concepts of the innovation systems approach. Supporting the idea that endogenous advantages are not only important for the international competitiveness of developing countries’ firms, but that the building of these advantages can itself be a source of technological learning, we focus on local efforts as a crucial element that cannot be replaced by technology imported from abroad. Finally, the paper contributes to the discussion about technological development as a two-dimensional dynamic. While GVC analysis tends to underline a company-based perspective, stressing the learning opportunities associated with GVC integration, the historical involvement of national states brings up the debate about technology as a central aspect of interstate disputes. In this sense, technology is seen as part of military modernization before being also used in civil contexts, which presupposes its role in national security and productive autonomy strategies. From this outlook, it is important to consider technology as an asset that, incorporated in sophisticated machinery, can be the target of state policies beyond the protection provided by intellectual property regimes, such as export controls and inward-investment restrictions.

Keywords: global value chains, innovation systems, intellectual monopoly, technological development

Procedia PDF Downloads 66
9622 Structuring Highly Iterative Product Development Projects by Using Agile-Indicators

Authors: Guenther Schuh, Michael Riesener, Frederic Diels

Abstract:

Nowadays, manufacturing companies are faced with the challenge of meeting heterogeneous customer requirements in short product life cycles with a variety of product functions. Often, some of the functional requirements remain unknown until late stages of product development. A way to handle these uncertainties is the highly iterative product development (HIP) approach. By structuring the development project as a highly iterative process, this method provides customer-oriented and marketable products. There are first approaches for combined, hybrid models comprising deterministic-normative methods like the Stage-Gate process and empirical-adaptive development methods like SCRUM on a project management level. However, the question of which development scopes can preferably be realized with either empirical-adaptive or deterministic-normative approaches remains almost unconsidered. In this context, a development scope constitutes a self-contained section of the overall development objective. Therefore, this paper focuses on a methodology that deals with the uncertainty of requirements within the early development stages and the corresponding selection of the most appropriate development approach. For this purpose, internal influencing factors like a company’s technological capability, the prototype manufacturability, and the potential solution space, as well as external factors like market accuracy, relevance, and volatility, are analyzed and combined into an Agile-Indicator. The Agile-Indicator is derived in three steps. First of all, each internal and external factor is rated in terms of its importance for the overall development task. Secondly, each requirement is evaluated, for every single internal and external factor, according to its suitability for empirical-adaptive development. Finally, the total sums of the internal and external sides are combined into the Agile-Indicator. Thus, the Agile-Indicator constitutes a company-specific and application-related criterion on which the allocation of empirical-adaptive and deterministic-normative development scopes can be based. In a last step, this indicator is used for a specific clustering of development scopes by applying the fuzzy c-means (FCM) clustering algorithm. The FCM method determines sub-clusters within functional clusters based on the empirical-adaptive environmental impact of the Agile-Indicator. By means of the methodology presented in this paper, it is possible to classify requirements whose market realization is uncertain into empirical-adaptive or deterministic-normative development scopes.
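
The three derivation steps and the subsequent FCM clustering can be illustrated with a small sketch. The factor weights, suitability ratings, and the one-dimensional FCM implementation below are all assumptions for demonstration; the paper's actual factors and clustering setup may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Agile-Indicator per requirement = importance-weighted suitability ratings
# over internal/external factors (factor names and values are illustrative).
factor_weights = np.array([0.3, 0.2, 0.2, 0.15, 0.15])  # importance per factor
ratings = rng.uniform(0, 1, size=(12, 5))   # 12 requirements x 5 factors

agile_indicator = ratings @ factor_weights  # higher -> more empirical-adaptive

def fuzzy_cmeans_1d(x, n_clusters=2, m=2.0, iters=100):
    """Minimal fuzzy c-means on a 1-D array; returns centers and memberships."""
    centers = np.quantile(x, np.linspace(0.25, 0.75, n_clusters))
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))       # memberships before normalization
        u /= u.sum(axis=1, keepdims=True)
        centers = (u**m * x[:, None]).sum(0) / (u**m).sum(0)
    return centers, u

centers, memberships = fuzzy_cmeans_1d(agile_indicator)
labels = memberships.argmax(axis=1)
adaptive_cluster = centers.argmax()
for i, (ai, lab) in enumerate(zip(agile_indicator, labels)):
    scope = ("empirical-adaptive (e.g., SCRUM)" if lab == adaptive_cluster
             else "deterministic-normative (e.g., Stage-Gate)")
    print(f"requirement {i:2d}: indicator={ai:.2f} -> {scope}")
```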

Keywords: agile, highly iterative development, agile-indicator, product development

Procedia PDF Downloads 229
9621 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature. Different uncertainty representation approaches result in different outputs, and some approaches might estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (outputs) of the given computational model. We use two different methodologies to approach the problem. In the first methodology, we use sampling-based uncertainty propagation with first-order error analysis. In the other approach, we place emphasis on the use of Percentile-Based Optimization (PBO). The NASA Langley MUQC's subproblem A is developed in such a way that both aleatory and epistemic uncertainties need to be managed. The challenge problem classifies each uncertain parameter as belonging to one of the following three types: (i) an aleatory uncertainty modeled as a random variable; it has a fixed functional form and known coefficients, and this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter might be aleatory, but sufficient data might not be available to adequately model it as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties. Each of the parameters of the random variable is an unknown element of a known interval, and this uncertainty is reducible. From the study, it is observed that, due to practical limitations or computational expense, the sampling in the sampling-based methodology is not exhaustive. That is why the sampling-based methodology has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary. This is achieved in this study by using PBO.
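
A common way to realize the sampling-based propagation of mixed uncertainty is double-loop (nested) sampling: the outer loop samples the epistemic intervals, the inner loop propagates the aleatory variability, and the envelope of the resulting empirical CDFs forms a p-box. The sketch below illustrates this with a toy response function; it is not the NASA challenge model, and all bounds are assumed.

```python
import numpy as np

rng = np.random.default_rng(42)

def g(a, x):
    """Toy computational model standing in for the challenge response."""
    return a * x**2 + x

mu_interval = (-0.5, 0.5)      # epistemic: poorly known mean of aleatory x
a_interval = (0.8, 1.2)        # epistemic: fixed but unknown coefficient

grid = np.linspace(-2, 6, 200)
cdf_lo = np.ones_like(grid)    # lower envelope of the CDF family
cdf_hi = np.zeros_like(grid)   # upper envelope of the CDF family

for _ in range(200):                           # outer loop: epistemic samples
    mu = rng.uniform(*mu_interval)
    a = rng.uniform(*a_interval)
    x = rng.normal(mu, 1.0, size=2000)         # inner loop: aleatory samples
    y = np.sort(g(a, x))
    cdf = np.searchsorted(y, grid) / y.size    # empirical CDF on the grid
    cdf_lo = np.minimum(cdf_lo, cdf)
    cdf_hi = np.maximum(cdf_hi, cdf)

# Width of the p-box at the median level indicates the epistemic contribution
print("response bounds at CDF = 0.5:",
      grid[np.argmax(cdf_hi >= 0.5)], grid[np.argmax(cdf_lo >= 0.5)])
```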

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 224
9620 A Generalisation of Pearson's Curve System and Explicit Representation of the Associated Density Function

Authors: S. B. Provost, Hossein Zareamoghaddam

Abstract:

A univariate density approximation technique, whereby the derivative of the logarithm of a density function is assumed to be expressible as a rational function, is introduced. This approach, which extends Pearson’s curve system, is solely based on the moments of a distribution up to a determinable order. Upon solving a system of linear equations, the coefficients of the polynomial ratio can readily be identified. An explicit solution to the integral representation of the resulting density approximant is then obtained. It will be explained that, when utilised in conjunction with sample moments, this methodology lends itself to the modelling of ‘big data’. Applications to sets of univariate and bivariate observations will be presented.
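
The linear system referred to above can be sketched directly: writing f'(x)/f(x) = P(x)/Q(x), multiplying Q(x)f'(x) = P(x)f(x) by x^n, integrating, and assuming boundary terms vanish gives, for each n, sum_j a_j m_{n+j} + sum_k b_k (n+k) m_{n+k-1} = 0, which is linear in the coefficients of P and Q. The Python sketch below builds this system from sample moments and recovers the known log-derivative of a gamma density; the degrees, sample size, and demo distribution are illustrative choices, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

def log_derivative_coeffs(moments, p_deg, q_deg):
    """Solve the moment-based homogeneous system for P and Q coefficients."""
    n_eqs = p_deg + q_deg + 1                    # unknowns minus one (scale-free)
    m = lambda k: 1.0 if k == 0 else moments[k]  # m_0 = 1
    rows = []
    for n in range(n_eqs):
        row = [m(n + j) for j in range(p_deg + 1)]                   # P terms
        row += [(n + k) * (m(n + k - 1) if n + k >= 1 else 0.0)
                for k in range(q_deg + 1)]                           # Q terms
        rows.append(row)
    _, _, vt = np.linalg.svd(np.array(rows))
    v = vt[-1]                                   # null vector = coefficients
    return v[:p_deg + 1], v[p_deg + 1:]

# Demo: Gamma(k=3) has f'(x)/f(x) = ((k-1) - x)/x, i.e. P = 2 - x, Q = x.
x = rng.gamma(shape=3.0, size=200_000)
moments = {k: np.mean(x**k) for k in range(1, 5)}  # sample moments ("big data")
a, b = log_derivative_coeffs(moments, p_deg=1, q_deg=1)
a, b = a / b[1], b / b[1]                          # normalize the scale
print("P coefficients (expect ~[2, -1]):", np.round(a, 2))
print("Q coefficients (expect ~[0, 1]): ", np.round(b, 2))
```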

Keywords: density estimation, log-density, moments, Pearson's curve system

Procedia PDF Downloads 262
9619 Carotenoid Bioaccessibility: Effects of Food Matrix and Excipient Foods

Authors: Birgul Hizlar, Sibel Karakaya

Abstract:

Recently, increasing attention has been given to carotenoid bioaccessibility and bioavailability in the field of nutrition research. As a consequence of their lipophilic nature and their specific localization in plant tissues, carotenoid bioaccessibility and bioavailability are generally quite low in raw fruits and vegetables, since carotenoids need to be released from the cellular matrix and incorporated into the lipid fraction during digestion before being absorbed. Today’s approach to improving bioaccessibility is to design the food matrix. More recently, the newest approach, excipient foods, has been introduced to improve the bioavailability of orally administered bioactive compounds. The main idea is combining a food with another food (the excipient food) whose composition and/or structure is specifically designed for improving health benefits. In this study, the effects of food processing, the food matrix, and the addition of excipient foods on the carotenoid bioaccessibility of carrots were determined. Different excipient foods (olive oil, lemon juice, and whey curd) and different food matrices (grated, boiled, and mashed) were used. The total carotenoid contents of the grated, boiled, and mashed carrots were 57.23, 51.11, and 62.10 μg/g, respectively. The absence of significant differences among these values indicated that these treatments had no effect on the release of carotenoids from the food matrix. In contrast, changes in the food matrix, especially mashing, caused a significant increase in carotenoid bioaccessibility. Although the carotenoid bioaccessibility was 10.76% in grated carrots, this value was 18.19% in mashed carrots (p<0.05). The addition of olive oil and lemon juice as excipients to the grated carrots caused 1.23-fold and 1.67-fold increases in the carotenoid content and the carotenoid bioaccessibility, respectively. However, the addition of the excipient foods to the boiled carrot samples did not influence the release of carotenoids from the food matrix, whereas up to a 1.9-fold increase in carotenoid bioaccessibility was determined with the addition of the excipient foods to the boiled carrots. The bioaccessibility increased from 14.20% to 27.12% with the addition of olive oil, lemon juice, and whey curd. The highest carotenoid content among the mashed carrots was found in those incorporating olive oil and lemon juice. This combination also caused a significant increase in carotenoid bioaccessibility, from 18.19% to 29.94% (p<0.05). Comparing the effects of the treatments on carotenoid bioaccessibility, mashed carrots containing olive oil, lemon juice, and whey curd had the highest carotenoid bioaccessibility. The increase in bioaccessibility was approximately 81% when comparing the grated and mashed samples containing olive oil, lemon juice, and whey curd. In conclusion, these results demonstrated that the food matrix and the addition of excipient foods had a significant effect on the carotenoid content and the carotenoid bioaccessibility.

Keywords: carrot, carotenoids, excipient foods, food matrix

Procedia PDF Downloads 406
9618 A Numerical Study on Micromechanical Aspects in Short Fiber Composites

Authors: I. Ioannou, I. M. Gitman

Abstract:

This study focused on the contribution of micromechanical parameters to the macro-mechanical response of short fiber composites, namely a polypropylene matrix reinforced by glass fibers. In the framework of this paper, attention has been given to the glass fiber length as a micromechanical parameter that influences the overall macroscopic material behavior. Three-dimensional numerical models were developed and analyzed through the concept of a Representative Volume Element (RVE). Results of the RVE-based approach were compared with the analytical Halpin-Tsai model.
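
For reference, the analytical baseline can be computed in a few lines. The sketch below evaluates the longitudinal Halpin-Tsai estimate for several fiber aspect ratios, showing how fiber length enters the prediction; the material properties are typical textbook values for glass/polypropylene, not the values used in the study.

```python
def halpin_tsai(E_f, E_m, V_f, aspect_ratio):
    """Longitudinal modulus E11 of a short-fiber composite (Halpin-Tsai)."""
    zeta = 2.0 * aspect_ratio                  # shape factor for E11 (l/d ratio)
    eta = (E_f / E_m - 1.0) / (E_f / E_m + zeta)
    return E_m * (1.0 + zeta * eta * V_f) / (1.0 - eta * V_f)

E_glass, E_pp = 72.0, 1.5                      # Young's moduli [GPa], typical values
for l_over_d in (5, 20, 100):                  # fiber length effect via l/d
    E11 = halpin_tsai(E_glass, E_pp, V_f=0.2, aspect_ratio=l_over_d)
    print(f"l/d = {l_over_d:3d}: E11 ~ {E11:.2f} GPa")
```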

Keywords: effective properties, homogenization, representative volume element, short fiber reinforced composites

Procedia PDF Downloads 252
9617 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving otherwise necessary quality control can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, and these are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes; as a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase of whether a regression or a classification is more suitable. In the context of this work, the initial phase of CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification of the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
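
The business-understanding question of regression versus classification can be prototyped quickly. In the sketch below, the same synthetic "hydraulic test" data set is framed once as leakage volume flow regression and once as a pass/fail inspection classification; the data, features, and tolerance limit are assumptions, not Bosch Rexroth data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.metrics import accuracy_score, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

n = 3000
X = rng.normal(size=(n, 8))                      # process features along the value chain
leakage = 0.05 * X[:, 0] - 0.03 * X[:, 1] + rng.normal(scale=0.2, size=n)
passed = (leakage < 0.1).astype(int)             # inspection decision at a tolerance limit

X_tr, X_te, y_tr, y_te, c_tr, c_te = train_test_split(
    X, leakage, passed, test_size=0.3, random_state=0)

reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, c_tr)

print("regression R2:", round(r2_score(y_te, reg.predict(X_te)), 3))
print("classification accuracy:", round(accuracy_score(c_te, clf.predict(X_te)), 3))
# With low-variance, noisy targets the R2 stays modest while the threshold
# decision can still be predicted with useful accuracy, echoing the finding.
```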

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 131
9616 Efficient Modeling Technique for Microstrip Discontinuities

Authors: Nassim Ourabia, Malika Ourabia

Abstract:

A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The technique obtains closed-form expressions for the equivalent circuits which are used to model these discontinuities. It is then easy to handle and characterize complicated structures like T and Y junctions, truncated junctions, arbitrarily shaped junctions, cascaded junctions, and, more generally, planar multiport junctions. Another advantage of this method is that the edge line concept for arbitrarily shaped junctions operates with real-parameter circuits. The validity of the method was further confirmed by comparing our results for various discontinuities (bends, filters) with those from HFSS as well as from other published sources.

Keywords: CAD analysis, contour integral approach, microwave circuits, s-parameters

Procedia PDF Downloads 498
9615 Resilient Leadership: An Analysis for Challenges, Transformation and Improvement of Organizational Climate in Gastronomic Companies

Authors: Margarita Santi Becerra Santiago

Abstract:

The following document addresses a descriptive analysis, under a qualitative approach, of resilient leadership, which allows us to understand the importance of applying a new leadership model to face the new challenges within gastronomic companies in Mexico. It likewise identifies the main factors that lead resilient leaders and companies to develop new skills and elaborate strategies that contribute to overcoming adversities and managing change. Adversities always exist in a company and challenge us to act and apply our knowledge in order to remain competitive, as well as to strengthen our work team through motivation, so as to achieve efficiency and develop a good organizational climate.

Keywords: challenges, efficiency, leadership, resilience skills

Procedia PDF Downloads 62
9614 Promoting Non-Formal Learning Mobility in the Field of Youth

Authors: Juha Kettunen

Abstract:

The purpose of this study is to develop a framework for the assessment of research and development projects. The assessment map developed in this study is based on the strategy map of the balanced scorecard approach. The assessment map is applied in a project that aims to reduce the inequality and risk of exclusion of young people from disadvantaged social groups. The assessment map indicates that not only funding but also the necessary skills and qualifications should be carefully assessed in the implementation of the project plans so as to achieve the objectives of the projects and the desired impact. The results of this study are useful for those who want to develop the implementation of the Erasmus+ Programme and for the project teams of research and development projects.

Keywords: non-formal learning, youth work, social inclusion, innovation

Procedia PDF Downloads 281
9613 Navigating through Uncertainty: An Explorative Study of Managers’ Experiences in China-foreign Cooperative Higher Education

Authors: Qian Wang, Haibo Gu

Abstract:

To drive practical interpretations and applications of various policies in building transnational education joint ventures, middle managers learn to navigate through uncertainties and ambiguities. However, the current literature says very little about those middle managers’ experiences, perceptions, and practices. This paper takes an empirical approach and aims to uncover the middle managers’ experiences by conducting interviews, campus visits, and document analysis. Following a qualitative research method approach, the researchers gathered information from a mixture of fourteen foreign and Chinese managers. Their perceptions of China-foreign cooperation in higher education and of their own roles offer important and valuable insights into this group’s attitudes and management performance. The diverse cultural and demographic backgrounds of the participants contributed to the significance of the study. There are four key findings. First, middle managers’ immediate micro-contexts and individual attitudes are the top two factors influencing their performance. Second, the foreign middle managers showed a stronger sense of self-identity in risk-taking. Third, the Chinese middle managers preferred to see difficulties as part of their assigned responsibilities. Fourth, middle managers in independent universities demonstrated a stronger sense of belonging and fewer frustrations than middle managers in secondary institutes. The researchers propose that training for managers in transnational educational settings should consider these findings when selecting topics and content. In particular, middle managers should be better prepared to anticipate their everyday jobs in the micro-environment; hence, information concerning sponsor organizations’ working culture is as essential as knowing the national and local regulations and socio-culture. Different case studies can help managers recognize and celebrate the diversity in transnational education. Situational stories can help them become aware of the diverse and wide range of work contexts so that they will not feel left alone when facing challenges without relevant previous experience or training. Though this research is a case study based in the Chinese transnational higher education setting, the implications could be relevant and comparable to other transnational higher education situations and help to continue expanding the potential applications in this field.

Keywords: educational management, middle manager performance, transnational higher education

Procedia PDF Downloads 141
9612 Intracranial Hypotension: A Brief Review of the Pathophysiology and Diagnostic Algorithm

Authors: Ana Bermudez de Castro Muela, Xiomara Santos Salas, Silvia Cayon Somacarrera

Abstract:

The aim of this review is to explain what intracranial hypotension is and what its main causes are, and to outline the diagnostic management in different clinical situations, covering the radiological findings and the pathophysiological substrate. An approach to diagnostic management is presented: the guidelines to follow, the different tests available, and the typical findings. We review the myelo-CT and myelo-MRI studies performed over the last 10 years in three centers on patients with suspected CSF fistula or hypotension of unknown cause. Signs of intracranial hypotension (subdural hygromas/hematomas, pachymeningeal enhancement, venous sinus engorgement, pituitary hyperemia, and sagging of the brain) evident on baseline CT and MRI are also sought. Intracranial hypotension is defined as an opening pressure below 6 cmH₂O. It is a relatively rare disorder, with an annual incidence of 5 per 100,000 and a female-to-male ratio of 2:1. The typical clinical feature is orthostatic headache, defined as a headache that develops or worsens when the patient moves from a supine to an upright position and typically resolves or improves after lying down. The etiology is a decrease in the amount of cerebrospinal fluid (CSF), usually through CSF loss, either spontaneous or secondary (post-traumatic, post-surgical, systemic disease, post-lumbar puncture, etc.); rhinorrhea and/or otorrhea may be present. The pathophysiological mechanisms of CSF hypotension and hypertension are interrelated, as hypertension may lead to hypotension secondary to spontaneous CSF leakage. In our center, when the hypotension is spontaneous and without rhinorrhea and/or otorrhea, the diagnostic work-up draws, as needed, on a range of tests performed from the least to the most complex: cerebral CT, cerebral and spinal MRI without contrast, and CT/MRI with intrathecal contrast. When intracranial hypotension presents with rhinorrhea/otorrhea, a fluid sample can be tested for β2-transferrin, which is physiologically found in CSF, and sinus CT and cerebral MRI including constructive interference in steady state (CISS) sequences can be performed. If necessary, cisternography studies are performed to locate the exact point of leakage. It is important to emphasize the value of myelo-CT/MRI in establishing the diagnosis and locating the CSF leak, which is indispensable for therapeutic planning (surgical or not) in patients with more than one lesion or with equivocal baseline tests.
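
Purely as a schematic of the triage logic just described (not clinical guidance), the escalation between tests can be written as a small decision function. The function name and return values are hypothetical, introduced only for illustration:

```python
def diagnostic_workup(rhinorrhea_or_otorrhea: bool) -> list[str]:
    """Ordered tests following the algorithm outlined in the abstract (sketch)."""
    if rhinorrhea_or_otorrhea:
        # Suspected CSF leak at the skull base: first confirm the fluid
        # is CSF, then image the likely leak site.
        return ["beta2-transferrin assay",
                "sinus CT",
                "cerebral MRI with CISS sequences",
                "cisternography (if the leak point is still unlocated)"]
    # Spontaneous hypotension without leak signs: escalate from the
    # least to the most complex test, stopping once diagnostic.
    return ["cerebral CT",
            "cerebral and spinal MRI without contrast",
            "myelo-CT/MRI with intrathecal contrast"]
```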

Keywords: cerebrospinal fluid, neuroradiology brain, magnetic resonance imaging, fistula

Procedia PDF Downloads 117
9611 Gene Expression Meta-Analysis of Potential Shared and Unique Pathways Between Autoimmune Diseases Under anti-TNFα Therapy

Authors: Charalabos Antonatos, Mariza Panoutsopoulou, Georgios K. Georgakilas, Evangelos Evangelou, Yiannis Vasilopoulos

Abstract:

The extended tissue damage and severe clinical outcomes of autoimmune diseases, accompanied by the high annual costs to the overall health care system, highlight the need for efficient therapy. Increasing knowledge of the pathophysiology of specific chronic inflammatory diseases, namely Psoriasis (PsO), the Inflammatory Bowel Diseases (IBD) consisting of Crohn’s disease (CD) and Ulcerative colitis (UC), and Rheumatoid Arthritis (RA), has provided insights into the underlying mechanisms that maintain the inflammation, such as Tumor Necrosis Factor alpha (TNF-α). Hence, anti-TNFα biological agents represent an ideal therapeutic approach. Despite the efficacy of anti-TNFα agents, several clinical trials have shown that 20-40% of patients do not respond to treatment. Nowadays, high-throughput technologies are employed to elucidate the complex interactions in multifactorial phenotypes, the most ubiquitous being transcriptome quantification analyses. In this context, a random-effects meta-analysis of available gene expression cDNA microarray datasets was performed between responders and non-responders to anti-TNFα therapy in patients with IBD, PsO, and RA. Publicly available datasets were systematically searched from inception to 10 November 2020 and selected for further analysis if they assessed the response to anti-TNFα therapy with clinical score indexes from inflamed biopsies. Specifically, 4 IBD (79 responders/72 non-responders), 3 PsO (40 responders/11 non-responders), and 2 RA (16 responders/6 non-responders) datasets were selected. After the separate pre-processing of each dataset, 4 separate meta-analyses were conducted: three disease-specific ones and a single combined meta-analysis of the disease-specific results. The MetaVolcano R package (v.1.8.0) was utilized for a random-effects meta-analysis through the Restricted Maximum Likelihood (REML) method. The top 1% of the most consistently perturbed genes in the included datasets was highlighted through the TopConfects approach while maintaining a 5% False Discovery Rate (FDR). Genes were considered Differentially Expressed (DEGs) if they had P ≤ 0.05 and |log2(FC)| ≥ log2(1.25) and were perturbed in at least 75% of the included datasets. Over-representation analysis was performed using Gene Ontology and Reactome Pathways for both up- and down-regulated genes in all 4 meta-analyses. Protein-protein interaction networks were also incorporated into the subsequent analyses with STRING v11.5 and Cytoscape v3.9. Disease-specific meta-analyses detected multiple distinct pro-inflammatory and immune-related down-regulated genes for each disease, such as NFKBIA, IL36, and IRAK1, respectively. Pathway analyses revealed unique and shared pathways between the diseases, such as Neutrophil Degranulation and Signaling by Interleukins. The combined meta-analysis unveiled 436 DEGs, 86 of which were up-regulated and 350 down-regulated, confirming the aforementioned shared pathways and genes as well as uncovering genes that participate in anti-inflammatory pathways, namely IL-10 signaling. The identification of key biological pathways and regulatory elements is imperative for the accurate prediction of a patient’s response to biological drugs. Meta-analysis of such gene expression data could aid the challenging task of unraveling the complex interactions implicated in the response to anti-TNFα therapy in patients with PsO, IBD, and RA, as well as distinguish gene clusters and pathways that are altered across this heterogeneous phenotype.
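
As a rough illustration of the statistics involved (the authors used MetaVolcano’s REML-based random-effects model; the simpler DerSimonian-Laird estimator is shown here as a stand-in), a per-gene combination of log2 fold-changes and the DEG filter quoted above could be sketched as follows. The data values and function names are hypothetical, not taken from the study:

```python
import numpy as np
from scipy import stats

def random_effects_meta(log2fc, se):
    """Combine per-dataset log2 fold-changes for one gene.

    Sketch using the DerSimonian-Laird estimator of between-study
    variance (a stand-in for MetaVolcano's REML-based model);
    assumes at least two datasets.
    """
    log2fc, se = np.asarray(log2fc, float), np.asarray(se, float)
    w = 1.0 / se**2                               # fixed-effect weights
    mu_fe = np.sum(w * log2fc) / np.sum(w)        # fixed-effect mean
    q = np.sum(w * (log2fc - mu_fe) ** 2)         # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log2fc) - 1)) / c)  # between-study variance
    w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
    mu = np.sum(w_re * log2fc) / np.sum(w_re)     # pooled log2 fold-change
    p = 2 * stats.norm.sf(abs(mu) * np.sqrt(np.sum(w_re)))
    return mu, p

def is_deg(mu, p, n_perturbed, n_datasets):
    """Apply the DEG criteria quoted in the abstract."""
    return (p <= 0.05
            and abs(mu) >= np.log2(1.25)
            and n_perturbed / n_datasets >= 0.75)

# One gene measured in four hypothetical datasets (estimates, SEs):
mu, p = random_effects_meta([0.8, 0.5, 0.9, 0.7], [0.2, 0.3, 0.25, 0.2])
print(is_deg(mu, p, n_perturbed=4, n_datasets=4))  # True
```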

Keywords: anti-TNFα, autoimmune, meta-analysis, microarrays

Procedia PDF Downloads 162
9610 An Optimization Modelling to Evaluate Flights Scheduling at Tourist Airports

Authors: Dimitrios J. Dimitriou

Abstract:

Airports serving a tourist destination are an essential part of the tourist demand supply chain, and their productivity is related to the region’s attractiveness and is enhanced by the air transport business. In this paper, an evaluation framework for the scheduled flights between two tourist airports is considered. Adopting a systemic approach, the arrivals at an airport whose connectivity depends heavily on the departures from another major airport are reviewed. The methodological framework is based on inventory control theory, and a numerical example demonstrates the use of the modelling formulation. The results provide a benchmark for comparison and can be applied to other similar cases.
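
The abstract does not give the formulation’s details, but a minimal sketch of the inventory-control analogy, assuming passengers waiting for the next departure behave like held stock and each scheduled flight like a replenishment order, might look as follows (an EOQ-style calculation; all names and numbers are hypothetical, not the author’s model):

```python
import math

def flights_per_day(demand_per_day: float,
                    fixed_cost_per_flight: float,
                    wait_cost_per_pax_day: float,
                    seats_per_flight: int) -> float:
    """EOQ-style sketch: choose a flight frequency that balances the
    fixed cost of operating a flight against passengers' waiting cost."""
    # Classic economic order quantity: Q* = sqrt(2 * D * K / h).
    q_star = math.sqrt(2 * demand_per_day * fixed_cost_per_flight
                       / wait_cost_per_pax_day)
    # A "batch" of passengers cannot exceed one aircraft's capacity.
    q_star = min(q_star, seats_per_flight)
    return demand_per_day / q_star

# Illustrative numbers: 600 pax/day, 8,000 cost units per flight,
# 50 units of waiting cost per passenger-day, 180-seat aircraft.
print(round(flights_per_day(600, 8000, 50, 180), 1))  # -> 3.3
```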

Keywords: airport connectivity, inventory control, optimization, optimum allocation

Procedia PDF Downloads 320