Search results for: multiple measures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7891

7321 Application of Integrated Marketing Communications: Multiple Case Studies

Authors: Yichen Lin, Hsiao-Han Chen, Chi-Chen Jan

Abstract:

Since 1990, research on Integrated Marketing Communications (IMC) has been presented from different perspectives. With advances in information technology and the rise of consumer consciousness, businesses operate in a highly competitive environment, and there is an urgent need to adopt more profitable and effective integrated marketing strategies to increase core competitiveness. The goal of a company's sustainable management is to increase consumers' willingness to purchase and to maximize profits. This research uses six aspects of IMC, namely awareness integration, unified image, database integration, customer-based integration, stakeholder-based integration, and evaluation integration, to examine the role of marketing strategies in the strengths and weaknesses of the six components of integrated marketing communications, their effectiveness, the most important components, and the components most in need of improvement. At the same time, social media such as Facebook, Instagram, YouTube, Line, and even TikTok have become marketing tools that firms adopt more and more frequently in their marketing strategies. At the end of 2019, the outbreak of COVID-19 severely affected global industries. Lockdown policies also accelerated the closure of brick-and-mortar stores worldwide, while online purchases rose dramatically. Hence, the effectiveness of online marketing is essential to maintaining a business. This study uses multiple case studies to examine the effects of social media and IMC, and also explores how the use of social media and IMC differed during COVID-19. Through literature review and multiple case studies, it is found that using social media combined with IMC indeed helped companies expand their business and build good connections with stakeholders. One previous study also used system theory to explore the interrelationship among integrated marketing communication, collaborative marketing, and global brand building. Even during the pandemic, firms could still maintain operations and connect with their customers more tightly.

Keywords: integrated marketing communications, multiple-case studies, social media, system theory

Procedia PDF Downloads 200
7320 Isolation Enhancement of Compact Dual-Band Printed Multiple Input Multiple Output Antenna for WLAN Applications

Authors: Adham M. Salah, Tariq A. Nagem, Raed A. Abd-Alhameed, James M. Noras

Abstract:

Recently, the demand for wireless communication systems that cover more than one frequency band (multi-band) with high data rates has increased for both fixed and mobile services. Multiple Input Multiple Output (MIMO) technology is one of the significant solutions for attaining these requirements and achieving the maximum channel capacity of wireless communication systems. The main issue associated with MIMO antennas, especially in portable devices, is the compact space available for the radiating elements, which limits the physical separation between them. This degrades the performance of MIMO antennas by increasing the mutual coupling between the radiating elements; in other words, the mutual coupling is stronger when the radiating elements of the MIMO antenna are closer together. This paper presents a low-profile dual-band (2×1) MIMO antenna that works at 2.4 GHz, 5.3 GHz, and 5.8 GHz for wireless local area network (WLAN) applications. A neutralization line (NL) technique is used to enhance isolation by introducing a strip line with a length of λg/4 at the isolation frequency (2.4 GHz) between the radiating elements. The overall dimensions of the antenna are 33.5 x 36 x 1.6 mm³. The fabricated prototype shows good agreement between simulated and measured results. The antenna impedance bandwidths are 2.38–2.75 GHz and 4.4–6 GHz for the lower and upper bands respectively; the reflection coefficient and mutual coupling are better than -25 dB in both bands. The MIMO antenna performance characteristics are reported in terms of the scattering parameters, envelope correlation coefficient (ECC), total active reflection coefficient, capacity loss, antenna gain, and radiation patterns. Analysis of these characteristics indicates that the design is appropriate for WLAN terminal applications.
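
The envelope correlation coefficient mentioned above is commonly estimated from the measured S-parameters of a two-element MIMO antenna under a lossless-antenna approximation. The sketch below implements that standard S-parameter ECC formula; the example S-parameter values are hypothetical illustrations, not the paper's measurements.

```python
import numpy as np

def ecc_from_s(s11, s12, s21, s22):
    """Envelope correlation coefficient of a 2-port antenna from S-parameters
    (lossless-antenna approximation)."""
    num = abs(np.conj(s11) * s12 + np.conj(s21) * s22) ** 2
    den = (1 - abs(s11)**2 - abs(s21)**2) * (1 - abs(s22)**2 - abs(s12)**2)
    return num / den

# Hypothetical values for a well-isolated pair (-20 dB match, -40 dB coupling)
ecc = ecc_from_s(0.1, 0.01, 0.01, 0.1)
print(ecc)  # a value far below the usual 0.5 design threshold
```

A low mutual coupling (small s12, s21) drives the numerator toward zero, which is why isolation enhancement directly improves the ECC.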

Keywords: ECC, neutralization line, MIMO antenna, multi-band, mutual coupling, WLAN

Procedia PDF Downloads 116
7319 Trace Logo: A Notation for Representing Control-Flow of Operational Process

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

The process mining research discipline bridges the gap between data mining and business process modeling and analysis; it offers process-centric, end-to-end methods and techniques for analyzing information about real-world processes recorded in operational event logs. In this paper, we propose a notation called the trace logo for graphically representing the control-flow perspective (the order of execution of activities) of a process. A trace logo consists of a stack of activity names at each position: the size of each activity name indicates its frequency in the traces, and the total height of the stack depicts the information content of the position. A trace logo is created from a set of aligned traces generated using the Multiple Trace Alignment technique.
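
By analogy with sequence logos, the per-position information content can be computed from the Shannon entropy of the activity frequencies at that position. The following sketch is one illustrative reading of the notation (not the authors' implementation): each activity's height is its frequency times the position's information content.

```python
import math
from collections import Counter

def trace_logo(traces):
    """For aligned traces of equal length, return one {activity: height} dict
    per position, where height = frequency * (log2(|alphabet|) - entropy)."""
    alphabet = {a for t in traces for a in t}
    max_info = math.log2(len(alphabet))
    logo = []
    for pos in range(len(traces[0])):
        counts = Counter(t[pos] for t in traces)
        total = sum(counts.values())
        freqs = {a: c / total for a, c in counts.items()}
        entropy = -sum(f * math.log2(f) for f in freqs.values())
        info = max_info - entropy  # information content of this position
        logo.append({a: f * info for a, f in freqs.items()})
    return logo

logo = trace_logo(["abc", "abd", "acd"])
print(logo[0])  # position 0 is always 'a', so it carries full information content
```

Positions where every trace agrees get tall, single-activity stacks; positions with many alternatives get short, fragmented ones.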

Keywords: consensus trace, process mining, multiple trace alignment, trace logo

Procedia PDF Downloads 332
7318 Modelling and Investigation of Phase Change Phenomena of Multiple Water Droplets

Authors: K. R. Sultana, K. Pope, Y. S. Muzychka

Abstract:

In recent years, research on the heat transfer and phase change phenomena of liquid water droplets has experienced growing interest in aircraft icing, power transmission line icing, marine icing, and wind turbine icing applications. This growing interest is pushing the research from single-droplet to multiple-droplet phenomena. The impingement of multiple droplets, and the resulting solidification after impact on a very cold surface, is computationally studied in this paper. The model used in the current study solves the flow equations, composed of the energy balance and volume fraction equations. The main aim of the study is to investigate the effects of several thermophysical properties (density, thermal conductivity, and specific heat) on droplet freezing. The outcome is examined through various important factors, for instance, liquid fraction, total freezing time, droplet temperature, and total heat transfer rate in the interface region. The liquid fraction helps in understanding the complete phase change phenomena during solidification. The temperature distribution and heat transfer rate help demonstrate the overall thermal exchange behavior between the droplets and the substrate surface. The findings of this research provide an important technical achievement for ice modeling and prediction studies.
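
In volume-fraction solidification models of this kind, the liquid fraction is commonly closed as a linear (lever-rule) function of temperature between the solidus and liquidus temperatures. A minimal sketch of that generic closure (illustrative values, not the paper's exact model):

```python
def liquid_fraction(T, T_solidus, T_liquidus):
    """Lever-rule liquid fraction used in enthalpy-type phase change models:
    0 below the solidus (fully solid), 1 above the liquidus (fully liquid),
    and linear in between (the mushy zone)."""
    if T <= T_solidus:
        return 0.0
    if T >= T_liquidus:
        return 1.0
    return (T - T_solidus) / (T_liquidus - T_solidus)

# Water freezing over a narrow, hypothetical mushy zone around 273.15 K
print(liquid_fraction(273.10, 273.05, 273.15))  # halfway through solidification
```

Tracking this fraction per cell over time is what yields the total freezing time reported in studies like this one.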

Keywords: droplets, CFD, thermo-physical properties, solidification

Procedia PDF Downloads 225
7317 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System

Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem

Abstract:

Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple-model (MM) multi-Bernoulli (MB) filter for M-EOT, where the M-EOT's ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes in the kinematic state, extension, and observation distribution of an extended object when the target maneuvers, a multiple-model technique is presented based on an MB track-before-detect (TBD) filter supported by an SK sub-random matrix model (RMM), or sub-ellipses, framework. Simulation results demonstrate the improved imaging and tracking performance of the proposed approach.

Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter

Procedia PDF Downloads 56
7316 Efficiency Improvement of REV-Method for Calibration of Phased Array Antennas

Authors: Daniel Hristov

Abstract:

The paper describes the principle of operation, simulation, and physical validation of a method for simultaneous acquisition of the gain and phase states of multiple antenna elements and the corresponding feed lines across a phased array antenna (PAA). The derived gain and phase values are used for PAA calibration. The method utilizes the Rotating-Element Electric-Field Vector (REV) principle, currently used for gain and phase state estimation of a single antenna element across an active antenna aperture. A significant reduction in procedure execution time is achieved by simultaneously setting different phase delays on multiple phase shifters, followed by a single power measurement per setting. The initial gain and phase states are then calculated using spectral and correlation analysis of the measured power series.
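
The spectral idea can be illustrated with a toy model: if each element's phase shifter is rotated at a distinct integer frequency while a reference element is held fixed, the FFT of the measured power series isolates each element's relative gain and phase in its own frequency bin. This is only an illustrative sketch under idealized assumptions (noiseless measurements, collision-free frequency assignment, known reference amplitude), not the authors' exact procedure.

```python
import numpy as np

# Toy array: unknown per-element amplitudes and phases (element 0 is the reference)
amps = np.array([1.0, 0.9, 1.1, 0.8])
phases = np.array([0.0, 0.5, -1.0, 2.0])
rot = np.array([0, 1, 3, 7])   # rotation frequencies; all pairwise differences unique
N = 64                         # number of phase settings = number of power readings

n = np.arange(N)
# One scalar power measurement per setting, with all elements rotated at once
field = sum(a * np.exp(1j * (p + 2 * np.pi * k * n / N))
            for a, p, k in zip(amps, phases, rot))
power = np.abs(field) ** 2

# Each element's cross term with the reference lands in FFT bin rot[m]
spectrum = np.fft.fft(power)
est_phase = np.angle(spectrum[rot[1:]])               # phase relative to element 0
est_gain = np.abs(spectrum[rot[1:]]) / (N * amps[0])  # amplitude, reference known
print(est_phase)  # close to [0.5, -1.0, 2.0]
```

Because every element is measured in parallel from one power series, the number of measurements no longer scales with the element count, which is the source of the execution-time reduction.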

Keywords: antenna, antenna arrays, calibration, phase measurement, power measurement

Procedia PDF Downloads 118
7315 Semi-Automatic Method to Assist Expert for Association Rules Validation

Authors: Amdouni Hamida, Gammoudi Mohamed Mohsen

Abstract:

To help the expert validate association rules extracted from data, several quality measures have been proposed in the literature. We distinguish two categories: objective and subjective measures. The first depends on a fixed threshold and on the quality of the data from which the rules are extracted. The second consists of providing the expert with tools to explore and visualize rules during the evaluation step. However, the number of extracted rules to validate remains high, so manually reviewing the rules is very hard. To solve this problem, we propose, in this paper, a semi-automatic method to assist the expert during association rule validation. Our method uses rule-based classification as follows: (i) we transform association rules into classification rules (classifiers); (ii) we use the generated classifiers for data classification; (iii) we visualize association rules with their classification quality to give the expert an overview and to assist him during the validation process.
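
Steps (i) and (ii) can be sketched as follows: each association rule antecedent → consequent is read as a classification rule, applied to the records its antecedent matches, and scored by its classification quality. The transactions and rule below are hypothetical illustrations, not the paper's data.

```python
def rule_accuracy(rule, records):
    """Score an association rule (antecedent -> consequent) as a classifier:
    among records matching the antecedent, the fraction containing the consequent."""
    antecedent, consequent = rule
    matched = [r for r in records if antecedent.issubset(r)]
    if not matched:
        return 0.0
    correct = sum(1 for r in matched if consequent in r)
    return correct / len(matched)

# Hypothetical transactions and the rule {bread, butter} -> milk
records = [{"bread", "butter", "milk"},
           {"bread", "butter", "milk"},
           {"bread", "butter"},
           {"bread", "milk"}]
rule = (frozenset({"bread", "butter"}), "milk")
print(rule_accuracy(rule, records))  # 2 of the 3 matching records contain milk
```

Sorting rules by such a quality score lets the expert concentrate on the borderline cases rather than the full rule list.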

Keywords: association rules, rule-based classification, classification quality, validation

Procedia PDF Downloads 415
7314 Effects of Macroprudential Policies on Bank Lending and Risks

Authors: Stefanie Behncke

Abstract:

This paper analyses the effects of different macroprudential policy measures that have recently been implemented in Switzerland. Among them is the activation and the increase of the countercyclical capital buffer (CCB) and a tightening of loan-to-value (LTV) requirements. These measures were introduced to limit systemic risks in the Swiss mortgage and real estate markets. They were meant to affect mortgage growth, mortgage risks, and banks’ capital buffers. Evaluation of their quantitative effects provides insights for Swiss policymakers when reassessing their policy. It is also informative for policymakers in other countries who plan to introduce macroprudential instruments. We estimate the effects of the different macroprudential measures with a Differences-in-Differences estimator. Banks differ with respect to the relative importance of mortgages in their portfolio, their riskiness, and their capital buffers. Thus, some of the banks were more affected than others by the CCB, while others were more affected by the LTV requirements. Our analysis is made possible by an unusually informative bank panel data set. It combines data on newly issued mortgage loans and quantitative risk indicators such as LTV and loan-to-income (LTI) ratios with supervisory information on banks’ capital and liquidity situation and balance sheets. Our results suggest that the LTV cap of 90% was most effective. The proportion of new mortgages with a high LTV ratio was significantly reduced. This result does not only apply to the 90% LTV, but also to other threshold values (e.g. 80%, 75%) suggesting that the entire upper part of the LTV distribution was affected. Other outcomes such as the LTI distribution, the growth rates of mortgages and other credits, however, were not significantly affected. Regarding the activation and the increase of the CCB, we do not find any significant effects: neither LTV/LTI risk parameters nor mortgage and other credit growth rates were significantly reduced. 
This result may reflect that the size of the CCB (1% of relevant residential real estate risk-weighted assets at activation, 2% at the increase) was not high enough to trigger a distinct reaction between the banks most likely to be affected by the CCB and those serving as controls. Still, it might have been effective in increasing the resilience of the overall banking system. From a policy perspective, these results suggest that targeted macroprudential policy measures can contribute to financial stability. In line with findings by others, caps on LTV reduced risk-taking in Switzerland. To fully assess the effectiveness of the CCB, further experience is needed.
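
The differences-in-differences design described above compares outcome changes in affected banks against control banks around the policy date; the treatment effect is the coefficient on the treated-times-post interaction. A minimal synthetic sketch (entirely hypothetical numbers, not the authors' confidential bank panel) estimates it with ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
treated = rng.integers(0, 2, n)   # is the bank exposed to the LTV cap?
post = rng.integers(0, 2, n)      # is the observation after the policy?
true_effect = -0.08               # assumed drop in the high-LTV mortgage share

# Outcome: share of new mortgages with LTV > 90% (synthetic)
y = 0.30 + 0.02*treated + 0.01*post + true_effect*treated*post \
    + rng.normal(0, 0.02, n)

# OLS with an intercept; the treated*post coefficient is the DiD estimate
X = np.column_stack([np.ones(n), treated, post, treated*post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(beta[3], 3))  # close to the assumed -0.08
```

The interaction coefficient recovers the policy effect because bank-level and period-level shifts are absorbed by the `treated` and `post` terms.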

Keywords: banks, financial stability, macroprudential policy, mortgages

Procedia PDF Downloads 339
7313 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge

Authors: M. F. Yilmaz, B. Ö. Çağlayan

Abstract:

The fragility curve is an effective and commonly used tool for determining the earthquake performance of structural and non-structural components, and it is also used to characterize the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network whose earthquake performance needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) need to be determined, and the relation between IMs and EDPs needs to be derived. In this study, a typical simply supported steel girder riveted railway bridge is studied. Fragility curves of this bridge are derived using the two-parameter lognormal distribution. Time history analyses are performed for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed: PGA, Sa(0.2s), and Sa(1s), the IM parameters most commonly used for fragility curves in the literature.
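
A two-parameter lognormal fragility curve gives the probability of exceeding a damage state as Φ(ln(IM/θ)/β), where θ is the median capacity and β the lognormal dispersion. A generic sketch of evaluating such a curve (the parameter values are illustrative, not the bridge's fitted ones):

```python
import math

def fragility(im, theta, beta):
    """P(damage state exceeded | IM = im) for a two-parameter lognormal
    fragility curve with median capacity theta and dispersion beta."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

theta, beta = 0.4, 0.6   # hypothetical: median capacity PGA 0.4 g, dispersion 0.6
print(fragility(0.4, theta, beta))   # 0.5 at the median, by construction
print(fragility(0.1, theta, beta) < fragility(0.8, theta, beta))  # monotone in IM
```

Comparing such curves fitted against PGA, Sa(0.2s), and Sa(1s) is what the efficiency and sufficiency discussion above amounts to in practice.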

Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures

Procedia PDF Downloads 335
7312 Water Management of Polish Agriculture and Adaptation to Climate Change

Authors: Dorota M. Michalak

Abstract:

The agricultural sector, due to the growing demand for food and over-exploitation of the natural environment, contributes to deepening climate change on the one hand, while on the other hand shrinking freshwater resources, a negative effect of climate change, threaten the food security of every country. Therefore, adaptation measures to climate change should take into account effective water management and seek solutions that ensure food production at an unchanged or higher level, while not burdening the environment and not worsening the negative consequences of climate change. The problems of Poland's water management result not only from relatively small natural water resources but also, to a large extent, from the low efficiency of their use. Appropriate agricultural practices and state solutions in this field can yield significant benefits in terms of economical water management in agriculture, providing a greater amount of water that could also be used for other purposes, including environmental protection. The aim of the article is to determine the level of use of water resources in Polish agriculture and the advancement of measures aimed at adapting Polish agriculture's water management to climate change. The study provides knowledge about Polish legal regulations and water management tools, the shaping of the water policy of Polish agriculture against the background of EU countries and other sources of water intake, and measures run by state budget institutions to support Polish agricultural holdings in the effective management of water resources. To achieve these goals, the author used research tools such as the analysis of existing sources and a survey conducted among five groups of entities: agricultural advisory centers and departments; agricultural, rural, and environmental protection departments; regional water management boards; provincial agricultural chambers; and the agency for the restructuring and modernization of agriculture. The main conclusion of the analyses is the low use of water in Polish agriculture relative to other EU countries, to other sources of water intake in Poland, and to irrigation. The analysis also reveals another problem, namely the lack of reporting and data collection, which is extremely important for assessing the effectiveness of adaptation measures to climate change. The survey results indicate a very low level of support from government institutions in implementing adaptation measures to climate change and in the water management of Polish farms. The basic problems of the climate change adaptation policy with regard to water management in Polish agriculture include a lack of knowledge regarding climate change, the possibilities of adapting, and the available tools or ways to rationalize the use of water resources, as well as disordered procedures, an unclear division of responsibility among territorial units, non-functioning channels of information flow, and little practical effect.

Keywords: water management, adaptation policy, agriculture, climate change

Procedia PDF Downloads 122
7311 Political Will in Fighting Corruption in Vietnam

Authors: Anh Dao Vu, Bill Ryan

Abstract:

The Vietnamese government struggles to grapple with the problem of rampant corruption, one of the most challenging difficulties the country faces. According to Transparency International's Corruption Perceptions Index (CPI) 2014, Vietnam ranks 119 out of 175 countries. The CPI gives Vietnam a score of 31 on a scale from 0 to 100, where 0 indicates 'highly corrupt' and 100 'very clean'. Corruption eats into Vietnam's national GDP, causing a loss of 3% to 4% of GDP per annum. In general, the Vietnamese people's trust in their government to wage an effective fight against corruption, especially in the public sector, has been greatly eroded in recent years. Some substantial public demonstrations persuaded the government to implement strong anti-corruption measures; however, so far those measures have not been particularly successful. One of the main reasons for this shortcoming is that neither the Communist Party of Vietnam nor the government has demonstrated sufficiently strong 'political will' in fighting corruption: there remains a large gap between rhetoric and reality. This paper examines the reasons why insufficient political will is displayed in the ostensible fight against public sector corruption, and how certain anti-corruption strategies can both strengthen political commitment to the fight against corruption and enhance the effectiveness of that essential national endeavor.

Keywords: corruption, political will, Vietnam, anti-corruption

Procedia PDF Downloads 302
7310 Reducing Ambulance Offload Delay: A Quality Improvement Project at Princess Royal University Hospital

Authors: Fergus Wade, Jasmine Makker, Matthew Jankinson, Aminah Qamar, Gemma Morrelli, Shayan Shah

Abstract:

Background: Ambulance offload delays (AODs) affect patient outcomes. At baseline, the average AOD at Princess Royal University Hospital (PRUH) was 41 minutes, in breach of the 15-minute target. Aims: By February 2023, we aimed to reduce the average AOD to 30 minutes, the percentage of AODs over 30 minutes (PA30) to 25%, and the percentage over 60 minutes (PA60) to 10%. Methods: Following a root-cause analysis, we implemented two Plan-Do-Study-Act (PDSA) cycles. PDSA-1, 'drop and run': ambulances waiting more than 15 minutes for a handover left the patients in the Emergency Department (ED) and returned to the community. PDSA-2: booking in patients before the handover, allowing direct updates to online records and eliminating the need for handwritten notes. The outcome measures (AOD, PA30, and PA60) and process measures (total ambulances and patients in the ED) were recorded for 16 weeks. Results: In PDSA-1, all parameters increased slightly despite unvarying ED crowding. In PDSA-2, two shifts in the data were seen: initially, a sharp increase in the outcome measures consistent with increased ED crowding, followed by a downward shift when crowding returned to baseline (p<0.01). Within this interval, the AOD fell to 29.9 minutes, and PA30 and PA60 were 31.2% and 9.2% respectively. Discussion/conclusion: PDSA-1 did not result in any significant changes; lack of compliance was a key cause. The initial upward shift in PDSA-2 is likely associated with NHS staff strikes. However, during the second interval, the AOD and PA60 met our targets of 30 minutes and 10% respectively, improving patient flow in the ED. This improvement was sustained without further input and, if maintained, saves two paramedic shifts every three days.

Keywords: ambulance offload, district general hospital, handover, quality improvement

Procedia PDF Downloads 78
7309 Using Axiomatic Design for Developing a Framework of Manufacturing Cloud Service Composition in the Equilibrium State

Authors: Ehsan Vaziri Goodarzi, Mahmood Houshmand, Omid Fatahi Valilai, Vahidreza Ghezavati, Shahrooz Bamdad

Abstract:

One important paradigm of Industry 4.0 is cloud manufacturing (CM). In CM everything is considered as a service; therefore, the CM platform should consider all service providers' capabilities and try to integrate services in an equilibrium state. This research develops a framework for implementing manufacturing cloud service composition in the equilibrium state. The developed framework uses two well-known tools: axiomatic design (AD) and game theory. The research investigates the factors that establish equilibrium for the measures of manufacturing cloud service composition. Functional requirements (FRs) represent the measures of manufacturing cloud service composition in the equilibrium state; these FRs are satisfied by related design parameters (DPs). The FRs and DPs are defined by considering game theory, QoS, consumer needs, and parallel and cooperative services. Ultimately, four FRs and four DPs constitute the framework. To ensure the validity of the framework, the authors used AD's first axiom, the independence axiom.
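
In axiomatic design, the independence axiom is checked via the design matrix mapping DPs to FRs: a diagonal matrix means an uncoupled design, a triangular one a decoupled design, and anything else a coupled design. A small generic checker (the 4×4 matrix is a hypothetical illustration, not the paper's actual FR/DP mapping):

```python
def classify_design(matrix):
    """Classify an axiomatic-design matrix by its nonzero pattern:
    'uncoupled' if diagonal, 'decoupled' if triangular, else 'coupled'."""
    n = len(matrix)
    off = [(i, j) for i in range(n) for j in range(n) if i != j and matrix[i][j]]
    if not off:
        return "uncoupled"
    if all(i > j for i, j in off) or all(i < j for i, j in off):
        return "decoupled"
    return "coupled"

# Hypothetical 4x4 mapping of four FRs to four DPs
print(classify_design([[1,0,0,0],[0,1,0,0],[0,0,1,0],[0,0,0,1]]))  # uncoupled
print(classify_design([[1,0,0,0],[1,1,0,0],[0,1,1,0],[1,0,1,1]]))  # decoupled
```

A decoupled design still satisfies the independence axiom provided the DPs are adjusted in the order the triangular structure dictates.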

Keywords: axiomatic design, manufacturing cloud service composition, cloud manufacturing, industry 4.0

Procedia PDF Downloads 157
7308 Simulation Analysis of Wavelength/Time/Space Codes Using CSRZ and DPSK-RZ Formats for Fiber-Optic CDMA Systems

Authors: Jaswinder Singh

Abstract:

In this paper, a comparative analysis is carried out to study the performance of wavelength/time/space optical CDMA codes using two well-known modulation formats, CSRZ and DPSK-RZ, in RSoft's OptSIM. The analysis is carried out under a realistic scenario considering the presence of various nonlinear effects such as XPM, SPM, SRS, SBS, and FWM; fiber dispersion and multiple access interference are also considered. The codes used in this analysis are 3-D wavelength/time/space codes, converted into 2-D wavelength-time codes so that the requirement for space couplers and fiber ribbons is eliminated. Under the conditions simulated, it is found that CSRZ performs better than DPSK-RZ for fiber-optic CDMA applications.

Keywords: optical CDMA, multiple access interference (MAI), CSRZ, DPSK-RZ

Procedia PDF Downloads 624
7307 A Highly Efficient Broadcast Algorithm for Computer Networks

Authors: Ganesh Nandakumaran, Mehmet Karaata

Abstract:

A wave is a distributed execution, often made up of a broadcast phase followed by a feedback phase, requiring the participation of all system processes before a particular event called the decision is taken. Wave algorithms with one initiator, such as the 1-wave algorithm, have been shown to be very efficient for broadcasting messages in tree networks. Extensions of this algorithm that broadcast a sequence of waves using a single initiator have been implemented in algorithms such as the m-wave algorithm. However, as the network size increases, having a single initiator adversely affects message delivery times to nodes farther from the initiator. As a remedy, broadcast waves can be initiated by multiple initiator nodes distributed across the network to reduce the completion time of broadcasts. These waves, initiated by one or more initiator processes, form a collection of waves covering the entire network. Solutions to global snapshots, distributed broadcast, and various synchronization problems can be obtained efficiently using waves with multiple concurrent initiators. In this paper, we propose the first stabilizing multi-wave sequence algorithm implementing waves started by multiple initiator processes such that every process in the network receives at least one sequence of broadcasts. Being stabilizing, the proposed algorithm can withstand transient faults and does not require initialization. We view a fault as transient if it perturbs the configuration of the system but not its program.
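
The benefit of multiple initiators can be seen in a toy synchronous model: a broadcast completes after a number of rounds equal to the maximum distance from any node to its nearest initiator. The sketch below (an illustration of the motivation only, not the paper's stabilizing algorithm) compares one initiator against several on a path network:

```python
from collections import deque

def broadcast_rounds(adj, initiators):
    """Rounds for a synchronous broadcast to reach every node: a multi-source
    BFS, i.e. the maximum distance from any node to its nearest initiator."""
    dist = {v: 0 for v in initiators}
    queue = deque(initiators)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

# A path of 9 nodes: 0 - 1 - 2 - ... - 8
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 8] for i in range(9)}
print(broadcast_rounds(adj, [0]))        # single initiator at one end: 8 rounds
print(broadcast_rounds(adj, [0, 4, 8]))  # three spread initiators: 2 rounds
```

The completion-time gap widens with network diameter, which is exactly the scaling problem the multi-initiator waves address.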

Keywords: distributed computing, multi-node broadcast, propagation of information with feedback and cleaning (PFC), stabilization, wave algorithms

Procedia PDF Downloads 480
7306 Effectiveness of an Early Intensive Behavioral Intervention Program on Infants with Autism Spectrum Disorder

Authors: Dongjoo Chin

Abstract:

The purpose of this study was to investigate the effectiveness of an Early Intensive Behavioral Intervention (EIBI) program for infants with autism spectrum disorder (ASD) and to explore the factors predicting the effectiveness of the program, focusing on the infant's age, language ability, problem behaviors, and parental stress. Nineteen pairs of infants aged between 2 and 5 years who had been diagnosed with ASD, and their parents, participated in an EIBI program at a clinic providing evidence-based treatment based on applied behavior analysis. The measurement tools, administered before and after the EIBI program and compared, included PEP-R, a curriculum evaluation, K-SIB-R, K-Vineland-II, K-CBCL, and PedsQL for the infants, and PSI-SF and BDI-II for the parents. Statistical analysis was performed using a sample t-test and multiple regression analysis, with the following results. The EIBI program showed significant improvements in overall developmental age, curriculum assessment, and quality of life for the infants; there was no difference in parenting stress or depression. Furthermore, measures for both children and parents at the start of the program predicted neither PEP-R nor the degree of improvement in curriculum evaluation measured six months later at the end of the program. Based on these results, the authors suggest future directions for developing an effective EIBI program for infants with ASD in Korea, and discuss the implications and limitations of this study.

Keywords: applied behavior analysis, autism spectrum disorder, early intensive behavioral intervention, parental stress

Procedia PDF Downloads 158
7305 Vendor Selection and Supply Quotas Determination by Using Revised Weighting Method and Multi-Objective Programming Methods

Authors: Tunjo Perič, Marin Fatović

Abstract:

In this paper, a new methodology for vendor selection and supply quota determination (VSSQD) is proposed. The VSSQD problem is solved by a model that combines the revised weighting method, for determining the objective function coefficients, with a multiple objective linear programming (MOLP) method based on cooperative game theory. The criteria used for VSSQD are: (1) purchase costs and (2) the quality of the product supplied by individual vendors. The proposed methodology is tested on the example of flour purchasing for a bakery with two decision makers.
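
The weighted-sum idea behind such a model can be sketched on a tiny instance: given per-vendor cost and quality and criterion weights, split a fixed demand between vendors to optimize the scalarized objective subject to capacity bounds. A brute-force illustration with hypothetical numbers (not the paper's bakery data or its exact game-theoretic method):

```python
# Two vendors: unit cost (to minimize) and quality score (to maximize)
cost = [1.0, 1.2]
quality = [0.7, 0.95]
capacity = [60, 80]
demand = 100
w_cost, w_quality = 0.5, 0.5   # hypothetical revised weights

best = None
for x0 in range(demand + 1):   # quota given to vendor 0 (vendor 1 gets the rest)
    x1 = demand - x0
    if x0 > capacity[0] or x1 > capacity[1]:
        continue
    # Scalarized objective: weighted quality minus weighted cost
    score = w_quality * (quality[0]*x0 + quality[1]*x1) \
            - w_cost * (cost[0]*x0 + cost[1]*x1)
    if best is None or score > best[0]:
        best = (score, x0, x1)

print(best[1], best[2])  # optimal split of the 100-unit demand
```

With these weights vendor 1's per-unit score dominates, so the model fills vendor 1 to capacity and sources the remainder from vendor 0; changing the weights shifts the quotas, which is the role of the revised weighting step.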

Keywords: cooperative game theory, multiple objective linear programming, revised weighting method, vendor selection

Procedia PDF Downloads 333
7304 Finding the Right Regulatory Path for Islamic Banking

Authors: Meysam Saidi

Abstract:

While the specific externalities of Islamic banking and the regulatory measures it requires are fairly uncertain, the business is growing across the world. Unofficial data indicate that the Islamic finance market is growing at an annual rate of 15% and has reached a size of $1.3 trillion. This trend is associated with the inherent systemic connection of Islamic financial institutions to other entities and to different sectors of economies. Islamic banking has been the subject of market development policies in major economies, most notably the UK. This trend highlights the need to identify the distinct risk features of Islamic banking and to craft customized regulatory measures. So far there has not been a significant systemic crisis in this market that can be attributed to its distinct nature. However, the significant growth and worldwide spread of its products necessitate an in-depth study of its nature so that congruent, customized regulatory measures can be designed. In the post-financial-crisis era, some market analyses and reports suggested that Islamic banks weathered the crisis fairly well. As far as heavily blamed conventional financial products such as subprime mortgage-backed securities and speculative credit default swaps are concerned, the immunity claim can be considered true, as Islamic financial institutions were not directly exposed to such products. Nevertheless, similar to the experience of the conventional banking industry, it may be only a matter of time before Islamic banks face failures specific to the nature of their business. Drawing on the experience of conventional banking regulation and identifying those peculiarities of Islamic banking that need a customized regulatory approach can help prevent major failures. Frank Knight stated that 'We perceive the world before we react to it, and we react not to what we perceive, but always to what we infer.' The debate over congruent Islamic banking regulations might not be an exception to Knight's statement, but I will try to base my discussion on concrete evidence. This paper first analyzes both the theoretical and actual features of Islamic banking in order to ascertain its peculiarities in terms of market stability and other externalities. Next, the paper discusses distinct features of Islamic financial transactions and banking that might require customized regulatory measures. Finally, the paper explores how a more transparent path for Islamic banking regulation can be drawn.

Keywords: Islamic banking, regulation, risks, capital requirements, customer protection, financial stability

Procedia PDF Downloads 387
7303 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis

Authors: Alexander A. Tokmakov

Abstract:

Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the systems of protein synthesis that are currently used. The factors determining expression success are poorly understood. At present, the vast volume of data is accumulated in cell-free expression databases. It makes possible comprehensive bioinformatics analysis and identification of multiple features associated with successful cell-free expression. Here, we describe an approach aimed at identification of multiple physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation and highlight major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data, calculation and prediction of multiple properties of expressed amino acid sequences, correlation of the individual properties with the expression scores, and evaluation of statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, presence of signal sequences, etc., are mostly related to protein solubility, whereas the others, such as protein length, number of disulfide bonds, content of secondary structure, etc., affect mainly the expression propensity. We also demonstrated that amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modifications. The correlations revealed in this study provide a plethora of important insights into protein folding and rationalization of protein production. 
The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
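The correlation step of the described method can be sketched as follows: compute a physicochemical feature per sequence (here Kyte-Doolittle hydropathy, one of the hydrophobicity measures mentioned above) and rank-correlate it with expression scores. The sequences and scores below are illustrative placeholders, not data from the cell-free expression databases used in the study.

```python
# Kyte-Doolittle hydropathy values for the 20 standard amino acids.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def gravy(seq):
    """Mean hydropathy (GRAVY) of an amino acid sequence."""
    return sum(KD[a] for a in seq) / len(seq)

def ranks(xs):
    """0-based ranks of the values in xs (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical sequences with expression scores (higher = better expressed).
data = [("MKTAYIAKQR", 0.9), ("LLILVVAVIA", 0.2), ("DDEEKKRRNN", 0.8),
        ("FFLLWWIIVV", 0.1), ("GSTNQAPGSA", 0.7)]
feats = [gravy(s) for s, _ in data]
scores = [sc for _, sc in data]
rho = spearman(feats, scores)  # negative: hydrophobic toy sequences score low
```

In the actual pipeline each of the many calculated features would be correlated with the expression scores in this manner, followed by a significance test on each correlation.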

Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins

Procedia PDF Downloads 391
7302 Adaptive Filtering in Subbands for Supervised Source Separation

Authors: Bruna Luisa Ramos Prado Vasques, Mariane Rembold Petraglia, Antonio Petraglia

Abstract:

This paper investigates MIMO (Multiple-Input Multiple-Output) adaptive filtering techniques for the application of supervised source separation in the context of convolutive mixtures. From the observation that there is correlation among the signals of the different mixtures, an improvement in the NSAF (Normalized Subband Adaptive Filter) algorithm is proposed in order to accelerate its convergence rate. Simulation results with mixtures of speech signals in reverberant environments show the superior performance of the proposed algorithm with respect to the performances of the NLMS (Normalized Least-Mean-Square) and conventional NSAF, considering both the convergence speed and SIR (Signal-to-Interference Ratio) after convergence.
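The NLMS baseline against which the proposed subband algorithm is compared can be sketched for the simplest case, fullband single-channel system identification; the 3-tap unknown system and the step size below are hypothetical, not the authors' MIMO subband configuration.

```python
import random

def nlms_identify(x, d, taps, mu=0.5, eps=1e-6):
    """Normalized LMS: adapt weights w so that w * x tracks the desired d."""
    w = [0.0] * taps
    buf = [0.0] * taps
    for xn, dn in zip(x, d):
        buf = [xn] + buf[:-1]                 # shift new sample into tap buffer
        y = sum(wi * bi for wi, bi in zip(w, buf))
        e = dn - y                            # a-priori error
        norm = sum(b * b for b in buf) + eps  # input-power normalization
        w = [wi + mu * e * bi / norm for wi, bi in zip(w, buf)]
    return w

random.seed(0)
h = [0.6, -0.3, 0.1]                          # hypothetical unknown system
x = [random.gauss(0, 1) for _ in range(2000)]
# Desired signal: the input filtered by the unknown system (noiseless).
d = [sum(h[k] * x[n - k] for k in range(len(h)) if n - k >= 0)
     for n in range(len(x))]
w = nlms_identify(x, d, taps=3)               # converges to h
```

The NSAF variant improves on this recursion by splitting the input into subbands and normalizing each subband separately, which is what accelerates convergence for correlated (e.g. speech) inputs.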

Keywords: adaptive filtering, multi-rate processing, normalized subband adaptive filter, source separation

Procedia PDF Downloads 410
7301 Stand Alone Multiple Trough Solar Desalination with Heat Storage

Authors: Abderrahmane Diaf, Kamel Benabdellaziz

Abstract:

Remote arid areas in the vast expanses of the African deserts hold huge subterranean reserves of brackish water awaiting economic development. This work presents design guidelines as well as initial performance data for new autonomous solar desalination equipment which could help local communities produce their own fresh water using solar energy only and, why not, contribute to transforming desert lands into lush gardens. The output of solar distillation equipment is typically low, averaging around 3 l/m2/day. This new design, with an integrated, water-based, environmentally friendly solar heat storage system, produced 5 l/m2/day in early spring weather. Equipment output during summer exceeded 9 l/m2/day.

Keywords: multiple trough distillation, solar desalination, solar distillation with heat storage, water based heat storage system

Procedia PDF Downloads 416
7300 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion

Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut

Abstract:

This paper considers a hub location problem in which the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations in the node clusters and allocating non-hub nodes to hubs such that the total cost, comprising transportation cost, the opening cost of hubs, and a penalty cost for exceeding the capacity level at hubs, is minimized. A mixed integer linear programming model is developed by introducing additional constraints to the traditional model of the capacitated multiple allocation hub location problem, and is empirically tested.
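The cost structure of the model (transportation cost + hub opening cost + capacity-overflow penalty) can be illustrated on a toy instance by brute force. This sketch simplifies the paper's multiple-allocation MILP to single cheapest-hub allocation and a linear overflow penalty; all numbers are hypothetical.

```python
from itertools import combinations

def solve_hub(nodes, cost, open_cost, cap, demand, p, penalty):
    """Brute-force p-hub median on a tiny instance with capacity penalties.
    cost[i][h] is the transport cost of serving node i from hub h."""
    best = (float("inf"), None)
    for hubs in combinations(nodes, p):
        total = sum(open_cost[h] for h in hubs)       # hub opening cost
        load = {h: 0.0 for h in hubs}
        for i in nodes:
            hub = min(hubs, key=lambda h: cost[i][h]) # cheapest-hub allocation
            total += cost[i][hub]
            load[hub] += demand[i]
        # Linear penalty for exceeding a hub's predetermined capacity level.
        total += sum(penalty * max(0.0, load[h] - cap[h]) for h in hubs)
        if total < best[0]:
            best = (total, hubs)
    return best

nodes = [0, 1, 2, 3]
cost = [[0, 2, 9, 9],
        [2, 0, 8, 9],
        [9, 8, 0, 3],
        [9, 9, 3, 0]]
open_cost = {0: 5, 1: 5, 2: 5, 3: 5}
cap = {i: 10.0 for i in nodes}
demand = {i: 4.0 for i in nodes}
total, hubs = solve_hub(nodes, cost, open_cost, cap, demand, p=2, penalty=100.0)
```

On this instance the optimum opens one hub in each of the two natural clusters ({0,1} and {2,3}); the MILP of the paper solves the same trade-off exactly at realistic scale.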

Keywords: hub location problem, p-hub median problem, clustering, congestion

Procedia PDF Downloads 471
7299 Failure Mode Analysis of a Multiple Layer Explosion Bonded Cryogenic Transition Joint

Authors: Richard Colwell, Thomas Englert

Abstract:

In cryogenic liquefaction processes, brazed aluminum core heat exchangers are used to minimize the surface area/volume ratio of the exchanger. Aluminum alloy (5083-H321; UNS A95083) piping must transition to higher-melting-point 304L stainless steel piping outside of the heat exchanger kettle or cold box for safety reasons. Since aluminum alloys and austenitic stainless steel cannot be directly welded together, a transition joint consisting of 5 layers of different, explosively bonded metals is used. Failures of two of these joints resulted in process shut-down and loss of revenue. Failure analyses, finite element analysis (FEA), and mock-up testing were performed by multiple teams to gain further understanding of the failure mechanisms involved.

Keywords: explosion bonding, intermetallic compound, thermal strain, titanium-nickel Interface

Procedia PDF Downloads 188
7298 The Level of Administrative Creativity and Its Obstacles From the Point of View of Workers in Youth Centers in Jordan

Authors: Basheer Ahmad Al-Alwan

Abstract:

This study aimed to assess the extent of administrative creativity and identify its barriers from the perspective of employees working in youth centers in Jordan. The sample comprised 156 individuals employed in youth centers within the Hashemite Kingdom of Jordan. Data collection involved the utilization of two measures: the administrative creativity scale and the obstacles to administrative work scale. Correlation and stepwise multiple regression analyses were conducted. The findings revealed a high level of administrative creativity, as indicated by a mean score of 3.82 and a standard deviation of 0.51. Furthermore, statistically significant gender-based differences in administrative creativity were observed, favoring males, with a mean score of 3.32 for males compared to 2.91 for females. The results also demonstrated statistically significant differences in the level of administrative creativity based on experience, with the highest mean score observed for individuals with 5 to less than 10 years of experience. Regarding the obstacles to administrative creativity, the findings revealed an average level, with a mean score of 2.86 and a standard deviation of 0.791. Based on these results, the study recommends the promotion of a culture of creativity among employees and the provision of a broader scope of authority to foster an environment conducive to administrative creativity. Additionally, it suggests offering training courses encompassing the annual plan for these centers and minimizing obstacles that hinder the creative process among employees in Jordanian youth centers.

Keywords: administrative creativity, obstacles, workers in youth centers, Jordan

Procedia PDF Downloads 62
7297 Assessing the Effects of Sub-Concussive Head Impacts on Clinical Measures of Neurologic Function

Authors: Gianluca Del Rossi

Abstract:

Sub-concussive impacts occur frequently in collision sports such as American tackle football. Sub-concussive level impacts are defined as hits to the head that do not result in the clinical manifestation of concussion injury. Presently, there is limited information known about the short-term effects of repeated sub-concussive blows to the head. Therefore, the purpose of this investigation was to determine if standard clinical measures could detect acute impairments in neurologic function resulting from the accumulation of sub-concussive impacts throughout a season of high school American tackle football. Simple reaction time using the ruler-drop test, and oculomotor performance using the King-Devick (KD) test, were assessed in 15 athletes prior to the start of the athletic season, then repeated each week of the season, and once following its completion. The mean reaction times and fastest KD scores that were recorded or calculated from each study participant and from each test session were analyzed to assess for change in reaction time and oculomotor performance over the course of the American tackle football season. Analyses of KD data revealed improvements in oculomotor performance from baseline measurements (i.e., decreased time), with most weekly comparisons to baseline being significantly different. Statistical tests performed on the mean reaction times obtained via the ruler-drop test throughout the season revealed statistically significant declines (i.e., increased time) between baseline and weeks 3, 4, 10, and 12 of the athletic season. The inconsistent and contrasting findings between KD data and reaction time demonstrate the need to identify more robust clinical measures to definitively assess if repeated sub-concussive impacts to the head are acutely detrimental to patients.
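The ruler-drop test mentioned above converts the distance a free-falling ruler travels before being caught into a reaction time through the kinematic relation d = ½gt², i.e. t = √(2d/g). A minimal sketch (the catch distance is illustrative):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def ruler_drop_time(distance_m):
    """Reaction time implied by the free-fall distance of a caught ruler:
    d = 1/2 * g * t^2  =>  t = sqrt(2 * d / g)."""
    return math.sqrt(2.0 * distance_m / G)

# A catch at the 20 cm mark implies roughly a 0.2 s reaction time.
t = ruler_drop_time(0.20)
```

This direct physical mapping is what makes the ruler-drop test a cheap sideline proxy for simple reaction time, in contrast to the timed reading task of the King-Devick test.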

Keywords: head injury, mTBI and sport, subclinical head trauma, sub-concussive impacts

Procedia PDF Downloads 155
7296 Defining Factors Affecting the Rate of Car E-Customers' Satisfaction – A Case Study of Iran Khodro Co.

Authors: Majid Mohammadi, Mohammad Yosef Zadeh, Vahid Naderi Darshori

Abstract:

The main purpose of this research is to consolidate the customer satisfaction literature into an index suited to online content in the carmaker industry. The study measures online customer satisfaction, drawing on similar studies and on a model of online satisfaction that it attempts to complete. The statistical population of the research consists of online customers of the carmaker Iran Khodro who have bought the company's products in the last six months. One of the innovative measures in this study is that customer reviews are obtained through an Internet site. The reliability of the data collected in this study was confirmed by Cronbach's alpha; a coefficient of 0.828 was calculated for the questionnaire. To test the hypotheses, the Pearson correlation coefficient was used. To verify the initial theoretical model, we used regression analyses and structural equation weights, and the results, with little change to the basic research model, were improved and completed. Ultimately, perceived value was found to have the most direct effect on online car customers' satisfaction.
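The Cronbach's alpha coefficient used above to confirm questionnaire reliability is computed from item variances and the variance of respondents' total scores; a minimal sketch on hypothetical Likert responses (not the Iran Khodro survey data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire: items is a list of item-score
    columns, one list of respondent scores per item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(var(col) for col in items)               # sum of item variances
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent totals
    return k / (k - 1) * (1 - item_var / var(totals))

# Hypothetical 5-point Likert responses: 3 items, 5 respondents.
items = [[3, 4, 5, 2, 4],
         [3, 5, 4, 2, 5],
         [4, 4, 5, 3, 4]]
alpha = cronbach_alpha(items)
```

Values above roughly 0.7 are conventionally taken as acceptable internal consistency, which is why the reported coefficient of 0.828 supports the questionnaire's reliability.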

Keywords: customer satisfaction, online satisfaction, online customer, car

Procedia PDF Downloads 387
7295 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status at the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a phenomenon well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that seek homogeneity in population and household counts. Compared with the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and township units respectively on mortality data and examines the spatial characteristics of the outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) at the township level ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area level (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment).
The management and analysis of the statistical data referring to the TGSC in this research are strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of processing death certificates, geocoding street addresses, assuring the quality of the geocoded results, automatically calculating statistical measures, encoding the measures in a standardized form, and geo-visualizing the statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of the mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways, helping to avoid wrong decision making.
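The age-standardized death rate (ASDR) reported above is a directly standardized rate: each age band's death rate is weighted by that band's share of a standard population. A minimal sketch with three hypothetical age bands (not the Taitung data):

```python
def asdr_per_100k(deaths, population, std_weights):
    """Direct age standardization: weight each age-specific death rate by the
    corresponding standard-population share; result is per 100,000 persons."""
    assert abs(sum(std_weights) - 1.0) < 1e-9   # shares must sum to 1
    rates = [d / p for d, p in zip(deaths, population)]
    return 1e5 * sum(r * w for r, w in zip(rates, std_weights))

# Hypothetical three-band example (young / middle / old):
deaths     = [2, 10, 60]
population = [4000, 5000, 1000]
weights    = [0.4, 0.4, 0.2]    # standard population shares
asdr = asdr_per_100k(deaths, population, weights)
```

Standardizing in this way removes differences in age structure between spatial units, which is what makes ASDR values comparable across townships or TGSC dissemination areas.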

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 236
7294 Applying Multiple Kinect on the Development of a Rapid 3D Mannequin Scan Platform

Authors: Shih-Wen Hsiao, Yi-Cheng Tsao

Abstract:

In the fields of reverse engineering and the creative industries, applying a 3D scanning process to obtain the geometric forms of objects is a mature and common technique. For instance, organic objects such as faces and non-organic objects such as products can be scanned to acquire geometric information for further application. However, although the data resolution of 3D scanning devices keeps increasing and complementary applications are ever more abundant, the penetration of 3D scanning among the public is still limited by the relatively high price of the devices. On the other hand, Kinect, released by Microsoft, is known for its powerful functions, considerably low price, and complete technology and database support. Therefore, related studies can be conducted with Kinect at acceptable cost and data precision. Because Kinect relies on an optical mechanism to extract depth information, limitations arise from the straight-line path of light. Thus, when a single Kinect is applied to 3D scanning, various angles are required sequentially to obtain the complete 3D information of the object, followed by an integration process that combines the 3D data from the different angles by certain algorithms. This sequential scanning process is time-consuming, and the complex integration process often encounters technical problems. Therefore, this paper applies multiple Kinects simultaneously to the development of a rapid 3D mannequin scan platform and proposes suggestions on the number and angles of the Kinects. A method of establishing the coordinate system based on the relation between the mannequin and the specifications of Kinect is proposed, and a suggestion on the angles and number of Kinects is described. An experiment applying multiple Kinects to the scanning of a 3D mannequin was constructed with the Microsoft API, and the results show that the scanning time and the technical threshold can be reduced in the fashion and garment design industries.
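Merging the views of several fixed Kinects presupposes a known rigid pose for each sensor relative to the shared mannequin frame. A minimal sketch of mapping one sensor's depth points into that common frame (the poses and points are hypothetical, and the paper's own calibration procedure is not reproduced here):

```python
import math

def to_common_frame(points, yaw_deg, tx, tz):
    """Map a depth sensor's (x, y, z) points into the shared mannequin frame
    using the sensor's known pose: a yaw rotation about the vertical (y) axis
    followed by a translation in the floor plane."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    out = []
    for x, y, z in points:
        out.append((c * x + s * z + tx,   # rotated + translated x
                    y,                    # height is unchanged by yaw
                    -s * x + c * z + tz)) # rotated + translated z
    return out

# Two hypothetical sensors facing the mannequin from opposite sides,
# each observing the same surface point 2 m in front of it:
front = [(0.1, 1.5, 2.0)]                         # front Kinect, already in frame
back = to_common_frame([(-0.1, 1.5, 2.0)], 180.0, 0.0, 4.0)
merged = front + back                             # both map to the same point
```

With all sensors capturing simultaneously, this per-sensor transform replaces the slow sequential scan-then-register loop required by a single Kinect.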

Keywords: 3D scan, depth sensor, fashion and garment design, mannequin, multiple Kinect sensor

Procedia PDF Downloads 347
7293 Non-Methane Hydrocarbons Emission during the Photocopying Process

Authors: Kiurski S. Jelena, Aksentijević M. Snežana, Kecić S. Vesna, Oros B. Ivana

Abstract:

The proliferation of electronic equipment in photocopying environments has not only improved work efficiency, but also changed indoor air quality. Considering the number of photocopiers employed, indoor air quality might be worse than in general office environments. Determining the contribution of any type of equipment to indoor air pollution is a complex matter. Non-methane hydrocarbons are known to play an important role in air quality due to their high reactivity. The presence of hazardous pollutants in indoor air was detected in a photocopying shop in Novi Sad, Serbia. Air samples were collected and analyzed for five days, during 8-hr working time in three time intervals, at three different sampling points. Using a multiple linear regression model and the software package STATISTICA 10, the concentrations of occupational hazards and microclimate parameters were mutually correlated. Based on the obtained multiple coefficients of determination (0.3751, 0.2389, and 0.1975), a weak positive correlation between the observed variables was determined. Small values of the F statistic indicated that there was no statistically significant difference between the concentration levels of non-methane hydrocarbons and the microclimate parameters. The results showed that each variable could be represented by the general regression model: y = b0 + b1x1 + b2x2. The obtained regression equations make it possible to measure the quantitative agreement between the variations of the variables and thus gain more accurate knowledge of their mutual relations.
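The general two-predictor model y = b0 + b1x1 + b2x2 is fitted by ordinary least squares; a minimal sketch solving the normal equations on hypothetical exact data (not the measured shop concentrations):

```python
def fit_two_var_regression(x1, x2, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 via the normal equations
    X'X b = X'y for the design matrix X = [1, x1, x2]."""
    n = len(y)
    cols = [[1.0] * n, x1, x2]
    # Build X'X and X'y.
    A = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    v = [sum(a * b for a, b in zip(ci, y)) for ci in cols]
    # Solve the 3x3 system by Gauss-Jordan elimination.
    for i in range(3):
        piv = A[i][i]
        A[i] = [a / piv for a in A[i]]
        v[i] /= piv
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [a - f * b for a, b in zip(A[j], A[i])]
                v[j] -= f * v[i]
    return v  # [b0, b1, b2]

# Hypothetical data generated exactly as y = 1 + 2*x1 + 3*x2:
x1 = [0.0, 1.0, 2.0, 3.0]
x2 = [1.0, 0.0, 2.0, 1.0]
y  = [4.0, 3.0, 11.0, 10.0]
b0, b1, b2 = fit_two_var_regression(x1, x2, y)
```

A statistics package such as STATISTICA additionally reports R² and the F statistic for each fitted equation, which are the quantities the abstract uses to judge the strength and significance of the correlations.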

Keywords: non-methane hydrocarbons, photocopying process, multiple regression analysis, indoor air quality, pollutant emission

Procedia PDF Downloads 356
7292 Performance Comparison of Joint Diagonalization Structure (JDS) Method and Wideband MUSIC Method

Authors: Sandeep Santosh, O. P. Sahu

Abstract:

We simulate an efficient multiple wideband and non-stationary source localization algorithm that exploits both the non-stationarity of the signals and the array geometric information. This algorithm is based on the joint diagonalization structure (JDS) of a set of short-time power spectrum matrices at different time instants in each frequency bin. JDS can be used for quick and accurate localization of multiple non-stationary sources. The JDS algorithm is a one-stage process, i.e., it directly searches for the directions of arrival (DOAs) over the continuous location parameter space. The JDS method requires that the number of sensors is not less than the number of sources. The simulation results show that the JDS method can localize two sources when their angular separation is at least 7 degrees, whereas wideband MUSIC requires a separation of at least 18 degrees.
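For reference, the MUSIC comparison method works by projecting array steering vectors onto the noise subspace of the data covariance matrix; its narrowband core can be sketched as follows. The array geometry, source angles, and noise level here are hypothetical, and the wideband/non-stationary extensions of the paper are not reproduced.

```python
import numpy as np

def music_spectrum(X, n_sources, grid=None):
    """Narrowband MUSIC pseudospectrum for a half-wavelength-spaced ULA.
    X is an (M sensors, N snapshots) complex data matrix."""
    M = X.shape[0]
    R = X @ X.conj().T / X.shape[1]            # sample covariance matrix
    _, vecs = np.linalg.eigh(R)                # eigenvalues in ascending order
    En = vecs[:, : M - n_sources]              # noise-subspace eigenvectors
    if grid is None:
        grid = np.arange(-90.0, 91.0)          # 1-degree scan grid
    m = np.arange(M)
    spec = np.empty(len(grid))
    for k, th in enumerate(grid):
        a = np.exp(1j * np.pi * m * np.sin(np.radians(th)))  # steering vector
        spec[k] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2  # peaks at DOAs
    return grid, spec

# Hypothetical scenario: 8-sensor ULA, two sources at -20 and +10 degrees.
rng = np.random.default_rng(0)
M, N = 8, 200
doas = [-20.0, 10.0]
m = np.arange(M)
A = np.exp(1j * np.pi * np.outer(m, np.sin(np.radians(doas))))
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
noise = 0.01 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = A @ S + noise
grid, spec = music_spectrum(X, n_sources=2)
# Take the two strongest local maxima of the pseudospectrum as DOA estimates.
peaks = [i for i in range(1, len(spec) - 1)
         if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
peaks.sort(key=lambda i: spec[i], reverse=True)
est = sorted(grid[i] for i in peaks[:2])
```

The JDS method instead jointly diagonalizes short-time power spectrum matrices across time instants, which is what lets it resolve the 7-degree separation at which this subspace scan fails.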

Keywords: joint diagonalization structure (JDS), wideband direction of arrival (DOA), wideband MUSIC

Procedia PDF Downloads 444