Search results for: Real-Time Voice Encryption
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7755

3795 Research on the Methodologies of the Opportune Innovation - A Case Study of BYD

Authors: Guangjie Liu

Abstract:

The main purpose of this paper is to investigate the methodologies BYD uses to implement opportune innovation. BYD is a Chinese company with IT component manufacturing, rechargeable battery, and automobile businesses. The paper examines the innovation methodology, together with the IPR management that BYD implements, to obtain rapid growth in technology development at a reasonable cost in money and time.

Keywords: Opportune innovation, vertical integration, unpatenting integration, patenting.

3794 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates, at one point, an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.

Keywords: Cold-start, expectation propagation, multi-armed bandits, Thompson sampling, variational inference.

3793 The Location of Park and Ride Facilities Using the Fuzzy Inference Model

Authors: Anna Lower, Michal Lower, Robert Masztalski, Agnieszka Szumilas

Abstract:

The paper presents a method in which expert knowledge is applied to a fuzzy inference model. Even a less experienced person, e.g. an urban planner or official, could benefit from the use of such a system. The analysis result is obtained in a very short time, so a large number of proposed locations can also be verified quickly. The proposed method is intended for testing locations of car parks in a city. The paper shows selected examples of locations of P&R facilities in cities planning to introduce P&R. Analyses of existing facilities are also presented and confronted with the opinions of their users, with particular emphasis on unpopular locations. The results of the analyses are compared to an expert analysis of P&R facility locations that was outsourced by the city and to opinions about existing facilities expressed by users on social networking sites. The obtained results are consistent with actual user feedback. The proposed method proves to be effective and does not require the involvement of a large expert team or large financial contributions for complicated research. The method also provides an opportunity to show alternative locations of P&R facilities. Although the results of the method are approximate, they are no worse than the results of analyses by employed experts. The advantage of this method is its ease of use, which simplifies professional expert analysis. The ability to analyze a large number of alternative locations gives a broader view of the problem. It is valuable that the arduous analysis by a team of people can be replaced by the model's calculation. According to the authors, the proposed method is also suitable for implementation on a GIS platform.
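
As a rough illustration of how expert rules can drive a fuzzy inference score for a candidate P&R site, the sketch below implements a minimal Mamdani-style system in Python. The two inputs (distance to a rail stop and congestion on the access road), the membership functions, and the rule base are illustrative assumptions and do not reproduce the criteria used in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def suitability(dist_km, congestion):
    # Fuzzify the two (assumed) inputs into membership grades in [0, 1]
    dist_near, dist_far = tri(dist_km, -0.1, 0.0, 1.0), tri(dist_km, 0.5, 2.0, 2.1)
    cong_low, cong_high = tri(congestion, -0.1, 0.0, 0.6), tri(congestion, 0.4, 1.0, 1.1)

    # Illustrative rule base (min for AND, max for OR)
    r_good = min(dist_near, cong_high)   # near a stop AND congested access road -> good site
    r_poor = max(dist_far, cong_low)     # far from a stop OR little congestion to relieve -> poor site

    # Output universe with "poor" and "good" sets, centroid defuzzification
    y = np.linspace(0.0, 1.0, 101)
    mu = np.maximum(np.minimum(r_poor, tri(y, -0.1, 0.0, 0.6)),
                    np.minimum(r_good, tri(y, 0.4, 1.0, 1.1)))
    return float(np.sum(y * mu) / (np.sum(mu) + 1e-9))

print(suitability(dist_km=0.3, congestion=0.8))  # high suitability expected
print(suitability(dist_km=1.8, congestion=0.2))  # low suitability expected
```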

Keywords: Fuzzy logic inference, P&R facilities, P&R location.

3792 Elasticity Model for Easing Peak Hour Demand for Metrorail Transport System

Authors: P. K. Sarkar, Amit Kumar Jain

Abstract:

The demand for urban transportation is characterised by large-scale temporal and spatial variations, which cause heavy congestion inside metro trains during peak hours near the Central Business District (CBD) of the city. The conventional approach to addressing peak-hour congestion in metro trains has been to increase supply by introducing more trains, increasing train length, and optimising the timetable to increase the capacity of the system. However, supply-side measures are limited by the design capacity of the system, beyond which any addition in capacity requires huge capital investment. Demand-side interventions are essentially required to actually spread the demand across time and space. In this study, an attempt has been made to identify the potential Transport Demand Management tools applicable to urban rail transportation systems, with a special focus on differential pricing. A conceptual price elasticity model has been developed to analyse the effect of various combinations of peak and non-peak-hour fares on demand. The elasticity values for peak hour, non-peak hour, and cross elasticity have been assumed from the relevant literature available in the field. The conceptual price elasticity model so developed is based on assumptions which need to be validated with actual values of elasticities for different segments of passengers. Once validated, the model can be used to determine peak and non-peak-hour fares with the objective of increasing overall ridership and revenue, levelling demand, and optimally utilising assets.
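
A minimal numerical sketch of the constant-elasticity reasoning described above is given below: demand in each period responds to its own fare change and, through a cross elasticity, to the fare change of the other period. All elasticity values, baseline riderships, and fare changes are illustrative placeholders, not values from the study.

```python
def new_demand(q0, e_own, own_fare_change, e_cross, other_fare_change):
    """Percentage-change form: dQ/Q = e_own * dP_own/P_own + e_cross * dP_other/P_other."""
    return q0 * (1.0 + e_own * own_fare_change + e_cross * other_fare_change)

# Baseline hourly ridership and assumed elasticities (illustrative only)
q_peak, q_off = 60000.0, 25000.0
e_peak_own, e_off_own = -0.30, -0.45     # own-price elasticities
e_cross = 0.10                           # cross elasticity between the two periods

# Scenario: raise peak fares by 20%, cut non-peak fares by 15%
d_peak_fare, d_off_fare = +0.20, -0.15

q_peak_new = new_demand(q_peak, e_peak_own, d_peak_fare, e_cross, d_off_fare)
q_off_new = new_demand(q_off, e_off_own, d_off_fare, e_cross, d_peak_fare)

print(f"peak:     {q_peak:.0f} -> {q_peak_new:.0f}")
print(f"non-peak: {q_off:.0f} -> {q_off_new:.0f}")
print(f"total:    {q_peak + q_off:.0f} -> {q_peak_new + q_off_new:.0f}")
```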

Keywords: Congestion, differential pricing, elasticity, transport demand management, urban transportation.

3791 Assessing Storage Stability and Mercury Reduction of Freeze-Dried Pseudomonas putida within Different Types of Lyoprotectant

Authors: A. A. M. Azoddein, Y. Nuratri, A. B. Bustary, F. A. M. Azli, S. C. Sayuti

Abstract:

Pseudomonas putida is a potential strain for biological treatment to remove mercury contained in the effluent of the petrochemical industry, due to its mercury reductase enzyme that is able to reduce ionic mercury to elemental mercury. Freeze-dried P. putida allows easy, inexpensive shipping and handling and high stability of the product. This study aimed to freeze-dry P. putida cells with the addition of a lyoprotectant. The lyoprotectant was added into the cell suspension prior to freezing. The dried P. putida obtained was then mixed with synthetic mercury. The viability of the recovered P. putida after freeze-drying was significantly influenced by the type of lyoprotectant. Among the lyoprotectants, Tween 80/sucrose was found to be the best. Sucrose was able to recover more than 78% (6.2E+09 CFU/ml) of the original cells (7.90E+09 CFU/ml) after freeze-drying and to retain 5.40E+05 viable cells after 4 weeks of storage at 4 °C without vacuum. Polyethylene glycol (PEG) pre-treated and broth pre-treated cells recovered more than 64% (5.0E+09 CFU/ml) and more than 0.1% (5.60E+07 CFU/ml), respectively, after freeze-drying. Freeze-dried P. putida cells in PEG and broth could not survive 4 weeks of storage. Freeze-drying also did not substantially change the growth pattern of P. putida, but a one-hour extension of the lag time was found after 3 weeks of storage. Additional time is required for freeze-dried P. putida cells to recover before introducing them to more demanding conditions such as mercury solution. The maximum mercury reduction of PEG pre-treated freeze-dried cells after freeze-drying and after 3 weeks of storage was 56.78% and 17.91%, respectively. The maximum mercury reduction of Tween 80/sucrose pre-treated freeze-dried cells after freeze-drying and after 3 weeks of storage was 26.35% and 25.03%, respectively. Freeze-dried P. putida showed lower mercury reduction compared to fresh P. putida grown on agar. Results from this study may be beneficial as an initial reference before commercializing freeze-dried P. putida.

Keywords: Pseudomonas putida, freeze-dry, PEG, Tween 80/sucrose, mercury, cell viability.

3790 A Community Compromised Approach to Combinatorial Coalition Problem

Authors: Laor Boongasame, Veera Boonjing, Ho-fung Leung

Abstract:

A buyer coalition with a combination of items is a group of buyers joining together to purchase a combination of items at a larger discount. The primary aim of existing research on buyer coalitions with combinations of items is to generate a large total discount. However, this aim is hard to achieve because such research is based on the assumption that each buyer completely knows the other buyers' information, or that at least one buyer knows the other buyers' information in a coalition through an exchange of information. These assumptions contrast with the real-world environment, where buyers join a coalition with incomplete information, i.e., they are concerned only with their expected discounts. Therefore, this paper proposes a new buyer community coalition formation scheme with a combination of items, called the Community Compromised Combinatorial Coalition scheme, for such an environment of incomplete information. In order to generate a larger total discount, after buyers who want to join a coalition propose their minimum required savings, a coalition structure that gives the maximum total retail price is formed. Then, the total discount of the coalition is divided among the buyers in the coalition depending on their minimum required savings, and the division is Pareto optimal. In a mathematical analysis, we compare concepts of this scheme with concepts of the existing buyer coalition scheme. Our mathematical analysis results show that the total discount of the coalition in this scheme is larger than that in the existing buyer coalition scheme.

Keywords: Group decision and negotiations, group buying, game theory, combinatorial coalition formation, Pareto optimality.

3789 Statistical Modeling of Accelerated Pavement Failure Using Response Surface Methodology

Authors: Anshu Manik, Kasthurirangan Gopalakrishnan, Siddhartha K. Khaitan

Abstract:

Rutting is one of the major load-related distresses in airport flexible pavements. Rutting in paving materials develops gradually with an increasing number of load applications, usually appearing as longitudinal depressions in the wheel paths, and it may be accompanied by small upheavals to the sides. Significant research has been conducted to determine the factors which affect rutting and how they can be controlled. Using experimental design concepts, a series of tests can be conducted while varying the levels of the different parameters which could be the cause of rutting in airport flexible pavements. If a proper experimental design is used, the results obtained from these tests can give a better insight into the causes of rutting and into the presence of interactions and synergisms among the system variables which influence rutting. Although, traditionally, laboratory experiments are conducted in a controlled fashion to understand the statistical interaction of variables in such situations, this study is an attempt to identify the critical system variables influencing airport flexible pavement rut depth from a statistical DoE perspective using real field data from a full-scale test facility. The test results strongly indicate that the response (rut depth) contains too much noise to allow determination of a good model. From a statistical DoE perspective, two major changes are proposed for this experiment: (1) actual replication of the tests is definitely required, and (2) nuisance variables need to be identified and blocked properly. Further investigation is necessary to determine possible sources of noise in the experiment.

Keywords: Airport Pavement, Design of Experiments, Rutting, NAPTF.

3788 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study

Authors: Atif Zafar, Fan Haijun

Abstract:

A sequence of different reservoir engineering methods and tools for reservoir characterization and field development is presented in this paper. Real data from the Jin Gas Field of the L-Basin of Pakistan are used. The basic concept behind this work is to highlight the importance of well test analysis in a broader sense (i.e., reservoir characterization and field development) rather than merely determining permeability and skin parameters. Normally, in reservoir characterization we rely on well test analysis to some extent, but for field development planning, well test analysis has become a forgotten tool, specifically for locating new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development, when the location of a new development well was marked only on the basis of G&G (geological and geophysical) data. The seismic interpretation could not detect one of the boundaries (fault, sub-seismic fault, heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the model from the well test analysis played a crucial role in proposing the location of the second well of the newly discovered field. The results from different methods of well test analysis of the Jin Gas Field are also integrated with, and supported by, other reservoir engineering tools, i.e., the material balance method and the volumetric method. In this way, a comprehensive workflow and algorithm is obtained for integrating the well test analyses with geological and geophysical analyses for reservoir characterization and field development. On the basis of this workflow and algorithm, it was shown that the proposed location of the new development well was not justified and that it should be placed elsewhere than in the southern direction.

Keywords: Field development, reservoir characterization, reservoir engineering, well test analysis.

3787 Remote Employment: Advantages and Challenges for Egypt's Labor Force (After the 25th January Revolution)

Authors: Aya Maher

Abstract:

The growing problem of youth unemployment in Egypt after the 25th January Revolution has directed the attention of some human resource experts towards considering remote employment as a partial remedy for unemployed youth, instead of the unavailable traditional jobs. This trend will also help with congested offices and the unsolved traffic problem in Cairo and spread a flexible work culture; despite this, the main issue remains unresolved for organizations dealing with the system's challenges. In the past few years in developed countries there has been a growing trend for many companies to shift from traditional office employment to remote employment, for many reasons: growing technological advances that help some employees do their work from home on a part-time basis, the need to achieve an employee's work-life balance in the middle of an unbalanced, complicated life, and top management's focus on employees' productivity rather than the time they spend at work. The objective of this paper is to study and analyze the advantages and challenges that Egypt's labor force will face in the implementation of remote or virtual employment in both government and private organizations after the 25th January Revolution. Therefore, the research question is: what are the advantages and different challenges that Egyptian organizations might face in implementing a remote employment system, and how can they manage these challenges so that the system works effectively? The study is divided into six main parts: the introduction, the objective and importance of the study, the research problem, the methodology, the experience of some countries that implemented remote employment, the advantages and challenges of implementing remote employment in Egypt, and then the conclusion, which discusses the results and recommendations of the study.

Keywords: 25th January Revolution, Egypt, Remote Employment, Telework, Work From Home (WFH).

3786 Three Computational Mathematics Techniques: Comparative Determination of Area under Curve

Authors: Khalid Pervaiz Akhter, Mahmood Ahmad, Ghulam Murtaza, Ishrat Shafi, Zafar Javed

Abstract:

The objective of this manuscript is to find the area under the plasma concentration-time curve (AUC) for multiple doses of salbutamol sulphate sustained-release tablets (Ventolin® oral tablets SR 8 mg, GSK, Pakistan) in a group of 18 healthy adults by using computational mathematics techniques. Following the administration of 4 doses of Ventolin® tablets 12-hourly to 24 healthy human subjects and bioanalysis of the obtained plasma samples, the plasma drug concentration-time profile was constructed. AUC, an important pharmacokinetic parameter, was measured using the integrated equation for multiple oral dose regimens. The approximated AUC was also calculated using computational mathematics techniques such as the repeated rectangular, repeated trapezium, and repeated Simpson's rules, and compared with the exact value of AUC calculated from the integrated equation for multiple oral dose regimens, in order to find the computational mathematics method that gives AUC values closest to the exact ones. The exact values of AUC for four consecutive doses of Ventolin® oral tablets were 150.5819473, 157.8131756, 164.4178231 and 162.78 ng.h/ml, while the closest approximated AUC values were 149.245962, 157.336171, 164.2585768 and 162.289224 ng.h/ml, respectively, as found by the repeated rectangular rule. The errors in the approximated values of AUC were negligible. It is concluded that all computational tools approximated the values of AUC accurately, but the repeated rectangular rule gives slightly better approximations of AUC than the repeated trapezium and repeated Simpson's rules.
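
The sketch below illustrates the three quadrature rules compared in the study on a synthetic single-dose concentration-time profile (a one-compartment Bateman curve), together with the closed-form AUC of that curve for reference. The curve and its parameters are assumptions for illustration only; they are not the Ventolin® data reported above.

```python
import numpy as np

ka, ke, scale = 1.2, 0.25, 10.0                     # absorption (1/h), elimination (1/h), scale (ng/ml)
conc = lambda t: scale * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 12.0, 25)                      # uniform grid: 24 sub-intervals (even, as Simpson needs)
c = conc(t)
h = t[1] - t[0]

auc_rect = np.sum(c[:-1]) * h                                    # repeated (left) rectangular rule
auc_trap = np.sum((c[:-1] + c[1:]) / 2.0) * h                    # repeated trapezium rule
auc_simp = h / 3.0 * (c[0] + c[-1]
                      + 4 * c[1:-1:2].sum() + 2 * c[2:-2:2].sum())  # repeated Simpson's 1/3 rule

# Closed-form AUC of the Bateman curve over [0, 12] h, used as the "exact" reference here
auc_exact = scale * ((1 - np.exp(-ke * 12)) / ke - (1 - np.exp(-ka * 12)) / ka)

for name, val in [("rectangular", auc_rect), ("trapezium", auc_trap),
                  ("Simpson", auc_simp), ("exact", auc_exact)]:
    print(f"{name:12s} {val:8.4f} ng.h/ml")
```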

Keywords: Salbutamol sulphate, Area under curve (AUC), repeated rectangular rule, repeated trapezium rule, repeated Simpson's rule.

3785 A Comparative Study of Rigid and Modified Simplex Methods for Optimal Parameter Settings of ACO for Noisy Non-Linear Surfaces

Authors: Seksan Chunothaisawat, Pongchanun Luangpaiboon

Abstract:

There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as a sequential process that intelligently performs exploration and exploitation, adopted from natural intelligence and strong inspiration, to form several iterative searches. The aim is to effectively determine near-optimal solutions in a solution space. In this work, a type of metaheuristic called Ant Colony Optimisation (ACO), inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Considering a solution space in a specified region of each model, sub-solutions may contain a global optimum or multiple local optima. Moreover, the algorithm has several common parameters (the number of ants, moves, and iterations) which act as the algorithm's drivers. A series of computational experiments for initialising these parameters was conducted using the Rigid Simplex (RS) and Modified Simplex (MSM) methods. Experimental results were analysed in terms of the best-so-far solutions, mean, and standard deviation. Finally, a recommendation of proper level settings of the ACO parameters is stated for all eight functions. These parameter settings can be applied as a guideline for future uses of ACO, in order to promote ease of use of ACO in real industrial processes. It was found that the results obtained from MSM were quite similar to those obtained from RS. However, when results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.

Keywords: Ant colony optimisation, metaheuristics, modified simplex, non-linear, rigid simplex.

3784 Utilizing Ontologies Using Ontology Editor for Creating Initial Unified Modeling Language (UML) Object Model

Authors: Waralak Vongdoiwang Siricharoen

Abstract:

One problem in object-oriented software development is the difficulty of finding appropriate and suitable objects with which to start the system. In this work, ontologies play the role of supporting object discovery at the beginning of object-oriented software development. Many studies try to demonstrate that there is great potential between object models and ontologies. Constructing an ontology from an object model, called ontology engineering, can be done; on the other hand, this research aims to support the idea that building an object model from an ontology is also promising and practical. Ontology classes are available online for many specific areas and can be found with a semantic search engine. There are also many supporting tools; the ones used in this research are the Protégé ontology editor and Visual Paradigm, and putting them together gives a great outcome. This research shows how the approach works efficiently with a real case study using ontology classes in the travel/tourism domain. Classes, properties, and relationships from more than two ontologies need to be combined in order to generate the object model. This paper presents a simple methodology framework which explains the process of discovering objects. The results show that this framework has great value, while there is room for expansion. Reusing existing ontologies offers a much cheaper alternative than building new ones from scratch. More ontologies are becoming available on the web, and online ontology libraries for storing and indexing ontologies are increasing in number and demand. Semantic and ontology search engines have also started to appear, to facilitate search and retrieval of online ontologies.

Keywords: Software Developing, Ontology, Ontology Library, Artificial Intelligent, Protégé, Object Model.

3783 Compliance Modelling and Optimization of Kerf during WEDM of Al7075/SiCP Metal Matrix Composite

Authors: Thella Babu Rao, A. Gopala Krishna

Abstract:

This investigation presents the formulation of kerf (width of slit) and the optimal control parameter settings of wire electrical discharge machining (WEDM) that result in the minimum possible kerf while machining Al7075/SiCp MMCs. WEDM has proved its efficiency and effectiveness in cutting hard ceramic-reinforced MMCs within a permissible budget. Among the distinct performance measures of the WEDM process, kerf is an important characteristic which determines the dimensional accuracy of the machined component when producing high-precision components. The lack of available machinability information for such advanced MMCs results in more experimentation in the manufacturing industries. Therefore, extensive experimental investigations are essential to provide a database of the effects of the various control parameters on kerf while machining such advanced MMCs by WEDM. The literature reveals the significance of some of the electrical parameters which are prominent for kerf when machining distinct conventional materials. However, in this work the significance of reinforcement particulate size and volume fraction on kerf is highlighted for machining MMCs, along with the machining parameters of pulse-on time, pulse-off time, and wire tension. Usually, the dimensional tolerances of machined components are decided at the design stage, and a machinist pays attention to producing the required dimensional tolerances by setting appropriate machining control variables. However, it is highly difficult to determine the optimal machining settings for such advanced materials on the shop floor. Therefore, in view of the precision of the cut, kerf (cutting width) is considered as the measure of performance for the model. It was found from the literature that machining conditions with higher fractions of large-size SiCp result in less kerf, whereas high values of pulse-on time result in a high kerf. A response surface model is used to predict the relative significance of the various control variables on kerf. Consequently, a powerful artificial intelligence technique, the genetic algorithm (GA), is used to determine the best combination of control variable settings. In the next step, a confirmation test was conducted for the optimal parameter settings, and good agreement was found between the GA-predicted kerf and the measured kerf. Hence, the effectiveness and accuracy of the developed model and program in analyzing kerf and determining its optimal process parameters are clearly demonstrated. The results obtained in this work show that the resulting optimized parameters are capable of machining Al7075/SiCp MMCs more efficiently and with better dimensional accuracy.
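
A compact sketch of the response-surface-plus-evolutionary-search workflow is shown below: a second-order response surface is fitted to (synthetic) kerf measurements and then minimised over the factor bounds. The design matrix, the underlying response, and the factor ranges are placeholders, and SciPy's differential evolution is used here as a stand-in for the genetic algorithm employed in the paper.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Factors (all ranges are illustrative): pulse-on time (us), pulse-off time (us),
# wire tension (N), SiCp volume fraction (%), SiCp particle size (um)
X = rng.uniform([100, 20, 5, 5, 10], [130, 60, 15, 25, 40], size=(40, 5))

def true_kerf(x):
    """Hypothetical underlying response used only to generate synthetic data."""
    ton, toff, wt, vf, ps = np.atleast_2d(x).T
    return (0.25 + 0.002 * ton - 0.0008 * toff - 0.003 * wt
            - 0.001 * vf - 0.0005 * ps + 0.00001 * ton * vf)

y = true_kerf(X) + rng.normal(0, 0.005, size=len(X))   # "measured" kerf (mm)

# Second-order response surface model
poly = PolynomialFeatures(degree=2, include_bias=False)
rsm = LinearRegression().fit(poly.fit_transform(X), y)
predict_kerf = lambda x: rsm.predict(poly.transform(np.atleast_2d(x)))[0]

# Evolutionary search (stand-in for the GA) for the setting that minimises predicted kerf
bounds = [(100, 130), (20, 60), (5, 15), (5, 25), (10, 40)]
result = differential_evolution(predict_kerf, bounds, seed=1)
print("optimal settings:", np.round(result.x, 2), "predicted kerf (mm):", round(float(result.fun), 4))
```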

Keywords: Al7075/SiCp MMC, kerf, WEDM, optimization.

3782 Influence of Paralleled Capacitance Effect in Well-defined Multiple Value Logical Level System with Active Load

Authors: Chih Chin Yang, Yen Chun Lin, Hsiao Hsuan Cheng

Abstract:

Three similar negative differential resistance (NDR) profiles with both a high peak-to-valley current density ratio (PVCDR) and a high peak current density (PCD) in a unity resonant tunneling electronic circuit (RTEC) element are developed in this paper. The PCD values and valley current density (VCD) values of the three NDR curves are all about 3.5 A and 0.8 A, respectively. The PV values of the NDR curves are 0.40 V, 0.82 V, and 1.35 V, and the VV values are 0.61 V, 1.07 V, and 1.69 V, respectively. All PVCDR values reach about 4.4 in the three NDR curves. The PCD value of 3.5 A in the triple-PVCDR RTEC element is better than that of other resonant tunneling device (RTD) elements. The high PVCDR value follows from the low VCD value of about 0.8 A, which is achieved by suitable selection of resistors in the triple-PVCDR RTEC element. The low PV value, below 1.35 V, gives low power dissipation in the triple-PVCDR RTEC element. The designed multiple-value logical level (MVLL) system using the triple-PVCDR RTEC element provides equidistant logical levels. The logical levels of the MVLL system are about 0.2 V, 0.8 V, 1.5 V, and 2.2 V from low voltage to high voltage, and then 2.2 V, 1.3 V, 0.8 V, and 0.2 V from high voltage back to low voltage, within a half cycle of a sinusoid wave. The output levels of the four-level MVLL system are 0.3 V, 1.1 V, 1.7 V, and 2.6 V, which satisfies the NMP condition of a traditional two-bit system. The remarkable logical characteristic of the improved MVLL system with a paralleled capacitor is its four significant stable logical levels of about 220 mV, 223 mV, 228 mV, and 230 mV. The stability and articulation of the logical levels of the improved MVLL system are outstanding. The average holding time of the improved MVLL system is approximately 0.14 μs, fourfold that of the basic MVLL system. The function of the additional capacitor in the improved MVLL system is thus successfully demonstrated.

Keywords: Capacitance, logical level, constant current source.

3781 A Review on Recycled Use of Solid Wastes in Building Materials

Authors: Oriyomi M. Okeyinka, David A. Oloke, Jamal M. Khatib

Abstract:

Large quantities of solid waste are generated worldwide from sources such as household, domestic, industrial, commercial, and construction-demolition activities, leading to environmental concerns. Utilization of these wastes in making building construction materials can reduce the magnitude of the associated problems. When these waste products are used in place of other conventional materials, natural resources and energy are preserved, and expensive and/or potentially harmful waste disposal is avoided. Recycling, regarded as the third most preferred waste disposal option, with its numerous environmental benefits, stands as a viable option to offset the environmental impact associated with the construction industry. This paper reviews the results of laboratory tests and important research findings, and the potential of using these wastes in building construction materials, with a focus on sustainable development. Research gaps have also been identified, including the need to develop a standard mix design for solid-waste-based building materials; the need to develop energy-efficient methods of processing solid waste for use in concrete; the need to study the actual behaviour or performance of such building materials in practical applications; and the limited real-life application of such building materials. A study is proposed to develop an environmentally friendly, lightweight building block from recycled waste paper, without the use of cement, and with properties suitable for use as a walling unit. This proposed research intends to incorporate laboratory experimentation and modelling to address the identified research gaps.

Keywords: Recycling, solid waste, construction, building materials.

3780 Teaching Ethical Behaviour: Conversational Analysis in Perspective

Authors: Nikhil Kewal Krishna Mehta

Abstract:

In the past, researchers have questioned the effectiveness of ethics training in higher education. There are also observations supporting the view that the ethical behaviour (range of actions) and ethical decision-making models used in the past rely on vignettes to explain ethical behaviour. The understanding remains that these vignettes play a limited role in determining individual intentions, not actions, and some authors have also agreed that there can be differences between one's intentions and actions. This paper attempts to fill those gaps by evaluating real actions rather than intentions. In a way, this study suggests the use of an experiential methodology to explore Berlo's model of communication as an action, along with the orchestration of various principles. To this end, an attempt was made to use conversational analysis to evaluate ethical decision-making behaviour among students and middle-level managers. The process was repeated six times with sets averaging 15 participants. Similarities were observed in the behaviour of students and middle-level managers, which suggests that both groups of individuals have no cognizance of their actual actions. The deliberations derived from the conversations were taken a step further into meta-ethical evaluations to portray a clear picture of ethical behaviour among participants. This study provides insights for understanding demonstrated unconscious human behaviour, which may fortuitously be termed both ethical and unethical.

Keywords: Berlo’s action model of communication, Conversational Analysis, Ethical behaviour, Ethical decision making, experiential learning, Intentions and Actions.

3779 Detection of Action Potentials in the Presence of Noise Using Phase-Space Techniques

Authors: Christopher Paterson, Richard Curry, Alan Purvis, Simon Johnson

Abstract:

Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthesis devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation, and classification of action potentials (APs) from noisy data trains. Current techniques in the field of 'unsupervised, no-prior-knowledge' biosignal processing include energy operators, wavelet detection, and adaptive thresholding. These tend to be biased towards larger AP waveforms; APs may be missed due to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to suffer from large computational expense. A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, which uses a delayed copy of the AP to highlight discontinuities relative to background noise. This idea has been used to create algorithms that are computationally inexpensive and address the above problems. Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an autoregressive moving average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the AP classifications, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratios (SNR).
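
A minimal sketch of the delayed-copy phase-space idea follows: the trace is embedded as (x[n], x[n-d]) and samples whose trajectory radius leaves the background-noise cloud are flagged. The synthetic spike train, the delay of 5 samples, and the 5-sigma threshold are assumptions for illustration, not the parameters of the proposed algorithms.

```python
import numpy as np

rng = np.random.default_rng(42)
fs, dur = 20000, 1.0                       # sampling rate (Hz), duration (s)
n = int(fs * dur)
x = rng.normal(0, 1.0, n)                  # background noise

# Inject 40 synthetic action potentials (damped half-sine, amplitude well above noise)
spike = np.exp(-np.linspace(0, 4, 30)) * np.sin(np.linspace(0, np.pi, 30)) * 12
for t0 in rng.integers(100, n - 100, 40):
    x[t0:t0 + 30] += spike

d = 5                                      # delay in samples
radius = np.sqrt(x[d:] ** 2 + x[:-d] ** 2) # trajectory radius in the (x[n], x[n-d]) phase plane

# Robust noise statistics (MAD) and an assumed 5-sigma detection threshold
sigma = np.median(np.abs(radius - np.median(radius))) / 0.6745
threshold = np.median(radius) + 5 * sigma
detections = np.flatnonzero(radius > threshold) + d

print(f"{len(detections)} supra-threshold samples, first few at {detections[:5]}")
```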

Keywords: Action potential detection, low SNR, phase-space diagrams/trajectories, unsupervised/no-prior knowledge.

3778 Encrypter Information Software Using Chaotic Generators

Authors: Cardoza-Avendaño L., López-Gutiérrez R.M., Inzunza-González E., Cruz-Hernández C., García-Guerrero E., Spirin V., Serrano H.

Abstract:

This document presents software that displays different chaotic generators, in both continuous and discrete time. The software provides the option to obtain the different signals using different parameters and initial condition values. The program then shows the critical parameters for each model. All of these models are capable of encrypting information, and the software demonstrates this as well.

Keywords: cryptography, chaotic attractors, software.

3777 The Survey Research and Evaluation of Green Residential Building Based on the Improved Group Analytical Hierarchy Process Method in Yinchuan

Authors: Yun-na Wu, Zhen Wang

Abstract:

Due to the economic downturn and the deterioration of the living environment, the development of residential buildings, which are high-energy-consuming buildings, is gradually changing from “extensive” to green building in China. The evaluation system for green buildings is therefore continuously being improved, but the current evaluation work has the following problems: (1) there are differences between the cost of the actual investment and the purchasing power of residents, and the construction target of green residential buildings is single and lacks multi-objective performance development; (2) green building evaluation lacks regional characteristics and cannot reflect the demands of residents in different regions; (3) in the process of determining the criteria weights, the experts’ judgment matrix has difficulty meeting the requirement of consistency. Therefore, to solve these problems, questionnaires about green residential buildings were distributed in the Ningxia area, and their results provide feedback on the purchasing power of residents and the acceptance of green building costs. Secondly, combined with the geographical features of the Ningxia minority areas, the evaluation criteria system for green residential buildings is constructed. Finally, using the improved group AHP method and the grey clustering method, the criteria weights are determined, and a real case located in the Xing Qing district, Ningxia, is evaluated. It can be concluded that the professional evaluation of this project and its good social recognition are basically consistent.
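
For orientation, the sketch below shows the ordinary AHP step on which the improved group method builds: criterion weights are taken from the principal eigenvector of a pairwise judgment matrix, and the consistency ratio is checked. The 4x4 judgment matrix is a made-up example, and the grey clustering stage is not shown.

```python
import numpy as np

# Example pairwise judgment matrix for four criteria (Saaty 1-9 scale, reciprocal)
A = np.array([[1,   3,   5,   2],
              [1/3, 1,   3,   1/2],
              [1/5, 1/3, 1,   1/4],
              [1/2, 2,   4,   1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # criterion weights (principal eigenvector, normalised)

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)                # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
cr = ci / ri                                   # consistency ratio, acceptable when < 0.1

print("weights:", np.round(w, 3), "CR:", round(float(cr), 3))
```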

Keywords: Evaluation, green residential building, grey clustering method, group AHP.

3776 Degradation of Amitriptyline Hydrochloride, Methyl Salicylate and 2-Phenoxyethanol in Water Systems by the Combination UV/Cl2

Authors: F. Javier Benitez, Francisco J. Real, Juan Luis Acero, Francisco Casas

Abstract:

Three emerging contaminants (amitriptyline hydrochloride, methyl salicylate and 2-phenoxyethanol) frequently found in waste-waters were selected to be individually degraded in ultra-pure water by the combined advanced oxidation process constituted by UV radiation and chlorine. The influence of pH, initial chlorine concentration and nature of the contaminants was firstly explored. The trend for the reactivity of the selected compounds was deduced: amitriptyline hydrochloride > methyl salicylate > 2-phenoxyethanol. A later kinetic study was carried out and focused on the specific evaluation of the first-order rate constants and the determination of the partial contribution to the global reaction of the direct photochemical pathway and the radical pathway. A comparison between the rate constant values among photochemical experiments without and with the presence of Cl2 reveals a clear increase in the oxidation efficiency of the combined process with respect to the photochemical reaction alone. In a second stage, the simultaneous oxidation of mixtures of the selected contaminants in several types of water (ultrapure water, surface water from a reservoir, and two secondary effluents) was also performed by the same combination UV/Cl2 under more realistic operating conditions. The efficiency of this combined system UV/Cl2 was compared to other oxidants such as the UV/S2O82- and UV/H2O2 AOPs. Results confirmed that the UV/Cl2 system provides higher elimination efficiencies among the AOPs tested.
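
As an illustration of how the first-order rate constants mentioned above can be extracted from a degradation time course, the sketch below fits ln(C/C0) = -k_obs * t to sampled concentrations. The time points and concentrations are synthetic placeholders, not measurements from this work.

```python
import numpy as np

t = np.array([0, 2, 5, 10, 15, 20, 30], dtype=float)        # reaction time (min)
c = np.array([10.0, 8.3, 6.4, 4.0, 2.6, 1.7, 0.7])          # contaminant concentration (uM)

# Pseudo-first-order decay: ln(C/C0) = -k_obs * t, so k_obs is minus the fitted slope
slope, intercept = np.polyfit(t, np.log(c / c[0]), 1)
k_obs = -slope                                               # 1/min
half_life = np.log(2) / k_obs

print(f"k_obs = {k_obs:.3f} 1/min, half-life = {half_life:.1f} min")
```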

Keywords: Emerging contaminants, amitriptyline, methyl salicylate, 2-phenoxyethanol, chlorination, photolysis, rate constants, UV/chlorine advanced oxidation process.

3775 Relationship between Trauma and Acute Scrotum: Test Torsion and Epididymal Appendix Torsion

Authors: Saimir Heta, Kastriot Haxhirexha, Virtut Velmishi, Nevila Alliu, Ilma Robo

Abstract:

Background: Testicular torsion can occur at any age. The possibility of saving the testicle depends on the fastest possible surgical intervention, which is indicated by the presence of acute pain even at rest. The time element is most important for establishing the diagnosis and proceeding with surgical intervention. Testicular damage is a consequence that mainly depends on the time elapsed since the onset of symptoms; once the symptoms are diagnosed, the earliest action to be performed is surgical intervention. Sometimes medical tests are needed to confirm a diagnosis or to help identify another cause for the symptoms, for example a urine test, used to check for infection, together with a scrotal ultrasound examination. Assessment of blood flow in the longitudinal supply vessels of the testicles is indicated: the sign that indicates testicular torsion is a reduction in blood flow, and this is the element distinguished on ultrasound examination. Surgery may be needed to determine whether the patient's symptoms are caused by torsion of the testis or by another condition. Discussion: As an emergency surgical intervention, the outcome of testicular torsion depends very much on the duration of the torsion, as the survival of the testicle depends on the fastest possible surgical intervention. From previous clinical experience, it is noted that in every pediatric case diagnosed with testicular torsion there is a link with the personal history, in which the patient reports a previous episode of testicular trauma; the literature supports this fact very logically. Conclusions: Salvage without testicular atrophy depends closely on establishing the diagnosis of testicular torsion as soon as possible. Following the logic above, the diagnosis of torsion should be made as soon as possible to avoid consequences that would be unfavorable for the patient.

Keywords: Acute scrotum, testicular torsion, newborns, infants, clinical presentation.

3774 Measurements of MRI R2* Relaxation Rate in Liver and Muscle: Animal Model

Authors: Chiung-Yun Chang, Po-Chou Chen, Jiun-Shiang Tzeng, Ka-Wai Mac, Chia-Chi Hsiao, Jo-Chi Jao

Abstract:

This study aimed to measure the effective transverse relaxation rate (R2*) in the liver and muscle of normal New Zealand White (NZW) rabbits. The R2* relaxation rate has been widely used in various hepatic diseases to assess iron overload by quantifying the iron content of the liver. R2* is defined as the reciprocal of the T2* relaxation time and mainly depends on the constituents of the tissue; different tissues have different R2* relaxation rates. The signal intensity decay in magnetic resonance imaging (MRI) may be characterized by R2* relaxation rates. In this study, a 1.5 T GE Signa HDxt whole-body MR scanner equipped with an 8-channel high-resolution knee coil was used to observe R2* values in NZW rabbit liver and muscle. Eight healthy NZW rabbits weighing 2–2.5 kg were recruited. After anesthesia using a Zoletil 50 and Rompun 2% mixture, the abdomen of each rabbit was landmarked at the center of the knee coil to perform a 3-plane localizer scan using a fast spoiled gradient echo (FSPGR) pulse sequence. Afterwards, multi-planar fast gradient echo (MFGR) scans were performed with 8 different echo times (TEs) to acquire images for R2* measurements. Regions of interest (ROIs) in the liver and muscle were measured using an Advantage workstation. Finally, R2* was obtained by a linear regression of ln(SI) on TE. The results showed that the longer the echo time, the smaller the signal intensity. The R2* values of liver and muscle were 44.8 ± 10.9 s-1 and 37.4 ± 9.5 s-1, respectively, implying that the iron concentration of the liver is higher than that of muscle. In conclusion, the more iron a tissue contains, the higher its R2*. The correlations between R2* and iron content in NZW rabbits might be valuable for further exploration.
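
The regression step described above can be sketched in a few lines: fit ln(SI) against TE for an ROI and take the negative slope as R2*. The echo times and signal intensities below are synthetic placeholders rather than the rabbit data.

```python
import numpy as np

# Synthetic mono-exponential decay: SI(TE) = S0 * exp(-R2* * TE) plus a little noise
te = np.array([2.1, 4.2, 6.3, 8.4, 10.5, 12.6, 14.7, 16.8]) / 1000.0   # echo times (s)
si = 950.0 * np.exp(-45.0 * te) + np.random.default_rng(1).normal(0, 5, te.size)  # mean ROI signal

# Linear regression of ln(SI) on TE: the slope is -R2*
slope, intercept = np.polyfit(te, np.log(si), 1)
r2star = -slope            # s^-1
t2star = 1.0 / r2star      # s

print(f"R2* = {r2star:.1f} 1/s, T2* = {t2star * 1000:.1f} ms")
```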

Keywords: Liver, MRI, multi-planar fast gradient echo, muscle, R2* relaxation rate.

3773 The Application of Hadamard Matrixes in the SNR Enhancement of Optical Time-Domain Reflectometry (OTDR)

Authors: Mingyu Zhong, Yi Xie

Abstract:

Results in one field necessarily give insight into others, and all have much potential for scientific and technological application. The Hadamard-transform technique, which has previously been applied to spectrometry, also has its use in the SNR enhancement of OTDR. In this report, a new set of codes (Simplex codes) is discussed, and the origin of the additional SNR gain is explained.
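
The multiplexing advantage behind Hadamard and Simplex coding can be illustrated with a small simulation: probing a trace with the rows of an S-matrix and then decoding averages the additive noise down relative to direct sampling. The trace, noise level, and code order below are assumptions, and the OTDR convolution model itself is not included.

```python
import numpy as np
from scipy.linalg import hadamard

n = 63                                    # S-matrix order (2^k - 1)
H = hadamard(n + 1)                       # Sylvester Hadamard matrix with +/-1 entries
S = (1 - H[1:, 1:]) // 2                  # Simplex (S-) matrix of 0/1 entries

rng = np.random.default_rng(0)
h_true = np.exp(-np.arange(n) / 20.0)     # "true" trace to be recovered
sigma = 0.5                               # additive noise level per shot

# Direct measurement: each sample observed once
h_direct = h_true + rng.normal(0, sigma, n)

# Coded measurement: each shot probes the combination selected by one S-matrix row
y = S @ h_true + rng.normal(0, sigma, n)
h_coded = np.linalg.solve(S.astype(float), y)   # decode

print("direct RMS error:", np.sqrt(np.mean((h_direct - h_true) ** 2)).round(3))
print("coded  RMS error:", np.sqrt(np.mean((h_coded - h_true) ** 2)).round(3))
```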

Keywords: Hadamard-transform, matrixes, averaging, optical time-domain reflectometry (OTDR).

3772 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm

Authors: B. Nassar, W. Hussein, M. Mokhtar

Abstract:

The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e. a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important for responding to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the above problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote-sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between these operating conditions. Furthermore, the algorithm provides competent information for prediction, as well as adding more insight and physical interpretation to the ADCS operation.
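
A minimal sketch of such a PCA-based monitor is given below: PCA is fitted on nominal data, and samples are flagged when their reconstruction error (the Q/SPE statistic) exceeds a percentile threshold learned from the nominal set. The synthetic eight-channel telemetry stands in for the ADCS parameters, and the threshold choice is an assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)

# Nominal telemetry: 8 correlated channels driven by 3 latent modes
latent = rng.normal(size=(2000, 3))
mixing = rng.normal(size=(3, 8))
X_train = latent @ mixing + 0.1 * rng.normal(size=(2000, 8))

pca = PCA(n_components=3).fit(X_train)

def spe(X):
    """Squared prediction error (reconstruction residual) per sample."""
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

threshold = np.percentile(spe(X_train), 99)          # assumed 99th-percentile control limit

# Test set: nominal samples followed by a block with a faulted channel
X_test = rng.normal(size=(300, 3)) @ mixing + 0.1 * rng.normal(size=(300, 8))
X_test[200:, 4] += 3.0                               # injected fault on channel 4

alarms = spe(X_test) > threshold
print(f"alarm rate before fault: {alarms[:200].mean():.2%}, after: {alarms[200:].mean():.2%}")
```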

Keywords: Space telemetry monitoring, multivariate analysis, PCA algorithm, space operations.

3771 A Novel Approach to Allocate Channels Dynamically in Wireless Mesh Networks

Authors: Y. Harold Robinson, M. Rajaram

Abstract:

Wireless mesh networking is rapidly gaining popularity with a variety of users: from municipalities to enterprises, from telecom service providers to public safety and military organizations. This increasing popularity is based on two basic facts: ease of deployment and an increase in network capacity, expressed in bandwidth per footage; WMNs do not rely on any fixed infrastructure. Many efforts have been devoted to maximizing the throughput of multi-channel multi-radio wireless mesh networks, and current approaches are based purely on either static or dynamic channel allocation. In this paper, we use a hybrid multi-channel multi-radio wireless mesh networking architecture, where static and dynamic interfaces are built into the nodes. The Dynamic Adaptive Channel Allocation protocol (DACA) considers optimization of both throughput and delay in the channel allocation. Channel assignment is made codependent with the routing problem in the wireless mesh network and is based on the traffic flow on every link. Temporal and spatial relationships require recomputing the channel assignment every time the traffic pattern in the mesh network changes, after which the channel assignment algorithms assign channels in the network. In this paper, a computed path which captures the available path bandwidth is the proposed metric, and an efficient routing protocol is based on this new path, which provides both static and dynamic links. The consistency property guarantees that each node makes an appropriate packet-forwarding decision and balances the control usage of the network, so that a data packet will traverse the right path.

Keywords: Wireless mesh network, spatial time division multiple access, hybrid topology, timeslot allocation.

3770 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro Grids

Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone

Abstract:

Load forecasting plays a key role in making today's and future smart energy grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute demand response strategies more effectively, which enables several features such as higher sustainability, better quality of service, and affordable electricity tariffs. While load forecasting is commonly applied at larger geographic scales, it is also effective at the scale of smart micro grids, wherein the lower available grid flexibility makes accurate prediction more critical in demand response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a smart micro grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on load forecasting is also evaluated. A new definition of Gain is introduced in this paper, which innovatively serves as an indicator of short-term prediction capability and time-span consistency. Two models, 24-hour-ahead and 1-hour-ahead forecasting, are built to comprehensively compare these three techniques.
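
As a sketch of the simplest of the three techniques, the code below builds a 1-hour-ahead linear regression forecaster on lagged-load features and scores it with the mean absolute error. The synthetic household profile, the chosen lags, and the omission of weather inputs are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
hours = np.arange(24 * 120)                                  # 120 days of hourly data
load = (2.0 + 1.2 * np.sin(2 * np.pi * hours / 24 - 1.5)     # daily cycle
        + 0.4 * np.sin(2 * np.pi * hours / (24 * 7))         # weekly cycle
        + 0.2 * rng.normal(size=hours.size))                 # noise, kW

lags = [1, 2, 24, 168]                                       # previous hours, previous day, previous week
max_lag = max(lags)
X = np.column_stack([load[max_lag - l:-l] for l in lags])    # lagged-load feature matrix
y = load[max_lag:]                                           # 1-hour-ahead target

split = len(y) - 24 * 14                                     # hold out the last two weeks
model = LinearRegression().fit(X[:split], y[:split])
mae = np.mean(np.abs(model.predict(X[split:]) - y[split:]))
print(f"1-hour-ahead MAE: {mae:.3f} kW")
```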

Keywords: Short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, Gain.

3769 Hybrid Equity Warrants Pricing Formulation under Stochastic Dynamics

Authors: Teh Raihana Nazirah Roslan, Siti Zulaiha Ibrahim, Sharmila Karim

Abstract:

A warrant is a financial contract that confers the right, but not the obligation, to buy or sell a security at a certain price before expiration. The standard procedure of valuing equity warrants with call option pricing models such as the Black–Scholes model has been shown to contain many flaws, such as the assumptions of a constant interest rate and constant volatility. In fact, existing alternative models were found to focus more on demonstrating pricing techniques than on empirical testing. Therefore, a mathematical model for pricing and analyzing equity warrants which comprises stochastic interest rates and stochastic volatility is essential to incorporate the dynamic relationships between the identified variables and to reflect the real market. Here, the aim is to develop dynamic pricing formulations for hybrid equity warrants by incorporating stochastic interest rates from the Cox-Ingersoll-Ross (CIR) model, along with stochastic volatility from the Heston model. The development of the model involves the derivation of the stochastic differential equations that govern the model dynamics. The resulting equations, which involve a Cauchy problem and heat equations, are then solved using partial differential equation approaches. The analytical pricing formulas obtained in this study comply with the form of the analytical expressions embedded in the Black-Scholes model and other existing pricing models for equity warrants. This facilitates the practicality of the proposed formula for comparison purposes and further empirical study.
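
The hybrid dynamics behind the model can be sketched by Monte Carlo simulation: the Heston variance and the CIR short rate are stepped with a full-truncation Euler scheme, and a call-style warrant payoff is discounted along each path. This is a simulation sketch under assumed parameter values, not the analytical formulas derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, K, T = 100.0, 105.0, 1.0                                  # spot, strike, maturity (yr)
kappa_v, theta_v, xi_v, v0, rho = 2.0, 0.04, 0.3, 0.04, -0.6  # Heston parameters (assumed)
kappa_r, theta_r, xi_r, r0 = 1.5, 0.03, 0.1, 0.03             # CIR parameters (assumed)

n_paths, n_steps = 50_000, 252
dt = T / n_steps
S = np.full(n_paths, S0)
v = np.full(n_paths, v0)
r = np.full(n_paths, r0)
int_r = np.zeros(n_paths)                                     # integral of r dt for discounting

for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
    z3 = rng.standard_normal(n_paths)                         # rate shocks, assumed independent

    vp = np.maximum(v, 0.0)                                   # full-truncation Euler scheme
    rp = np.maximum(r, 0.0)
    S *= np.exp((rp - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
    v += kappa_v * (theta_v - vp) * dt + xi_v * np.sqrt(vp * dt) * z2
    r += kappa_r * (theta_r - rp) * dt + xi_r * np.sqrt(rp * dt) * z3
    int_r += rp * dt

payoff = np.maximum(S - K, 0.0)                               # call-style warrant payoff
price = np.mean(np.exp(-int_r) * payoff)
print(f"simulated warrant value: {price:.3f}")
```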

Keywords: Cox-Ingersoll-Ross model, equity warrants, Heston model, hybrid models, stochastic.

3768 Electrical Characterization and Reliability Analysis of HfO2-TiO2-Al MOSCAPs

Authors: Shibesh Dutta, Sivaramakrishnan R., Sundar Gopalan, Balakrishnan Shankar

Abstract:

MOSCAPs of various combinations of Hafnium oxide and Titanium oxide of varying thickness with Aluminum as gate electrode have been fabricated and electrically characterized. The effects of voltage stress on the I-V characteristics for prolonged time durations have been studied and compared. Results showed hard breakdown and negligible degradation of reliability under stress.

Keywords: breakdown, MOSCAP, voltage stress.

3767 Development of a Catchment Water Quality Model for Continuous Simulations of Pollutants Build-up and Wash-off

Authors: Iqbal Hossain, Dr. Monzur Imteaz, Dr. Shirley Gato-Trinidad, Prof. Abdallah Shanableh

Abstract:

Estimation of runoff water quality parameters is required to determine appropriate water quality management options. Various models are used to estimate runoff water quality parameters; however, most models provide event-based estimates of water quality parameters for specific sites. The work presented in this paper describes the development of a model that continuously simulates the accumulation and wash-off of water quality pollutants in a catchment. The model allows estimation of pollutant build-up during dry periods and pollutant wash-off during storm events. The model was developed by integrating two individual models: a rainfall-runoff model and a catchment water quality model. The rainfall-runoff model is based on the time-area runoff estimation method. The model allows users to estimate the time of concentration using a range of established methods, and to estimate the continuing runoff losses using any of the available estimation methods (i.e. constant, linearly varying, or exponentially varying). Pollutant build-up in a catchment was represented by one of three pre-defined functions: power, exponential, or saturation. Similarly, pollutant wash-off was represented by one of three functions: power, rating-curve, or exponential. The developed runoff water quality model was set up to simulate the build-up and wash-off of total suspended solids (TSS), total phosphorus (TP), and total nitrogen (TN). The application of the model was demonstrated using available runoff and TSS field data from road and roof surfaces in the Gold Coast, Australia. The model provided an excellent representation of the field data, demonstrating the simplicity yet effectiveness of the proposed model.
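
The three pre-defined build-up forms and the exponential wash-off option can be written down directly, as in the sketch below. All coefficient values are illustrative placeholders rather than parameters calibrated to the Gold Coast data.

```python
import numpy as np

def buildup_power(days, a=5.0, b=0.5):
    """Pollutant surface load (kg/ha) after 'days' of dry weather, power form."""
    return a * days ** b

def buildup_exponential(days, b_max=12.0, k=0.4):
    """Exponential (asymptotic) build-up towards a maximum surface load."""
    return b_max * (1.0 - np.exp(-k * days))

def buildup_saturation(days, b_max=12.0, k_half=2.0):
    """Saturation-type build-up (half of b_max reached after k_half days)."""
    return b_max * days / (k_half + days)

def washoff_exponential(available_mass, runoff_mm, c=0.18):
    """Exponential wash-off: fraction of the accumulated load removed by a storm."""
    removed = available_mass * (1.0 - np.exp(-c * runoff_mm))
    return removed, available_mass - removed

# Example: 7 dry days of TSS build-up followed by a 15 mm runoff event
mass = buildup_exponential(7)
removed, remaining = washoff_exponential(mass, 15.0)
print(f"built up {mass:.2f} kg/ha, washed off {removed:.2f}, remaining {remaining:.2f}")
```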

Keywords: Catchment, continuous pollutants build-up, pollutants wash-off, runoff, runoff water quality model.

3766 A Robust Method for Finding Nearest-Neighbor using Hexagon Cells

Authors: Ahmad Attiq Al-Ogaibi, Ahmad Sharieh, Moh’d Belal Al-Zoubi, R. Bremananth

Abstract:

In pattern clustering, nearest-neighbor point computation is a challenging issue for many applications in areas of research such as remote sensing, computer vision, pattern recognition, and statistical imaging. Nearest-neighbor computation is essential for providing sufficient classification among a volume of pixels (voxels) in order to localize active regions of interest (AROI). Furthermore, it is needed to compute spatial metric relationships over diverse imaging areas in pattern recognition applications. In this paper, we propose a new methodology for finding the nearest-neighbor point, based on constructing a virtual grid of hexagon cells and then locating every point within them. An algorithm is suggested for minimizing the computation and improving the turnaround time of the process. The nearest neighbors of a query point Φ are fetched by searching the hexagons holistically; the search is repeated until an AROI Φ is expected. If a point Υ is located, then the search proceeds through the nearest hexagons in a circular fashion. The first hexagon is considered level 0 (L0) and the surrounding hexagons level 1 (L1). If Υ is located in L1, then the search continues in the next level (L2) to ensure that Υ is the nearest neighbor of Φ. Based on the experimental results, we found that the proposed method has an advantage over traditional methods in terms of minimizing the time complexity required for searching the neighbors; in turn, the efficiency of classification is improved sufficiently.
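
A compact sketch of the hexagon-cell idea follows: points are hashed into a virtual grid of pointy-top hexagons (axial coordinates), and the query searches outward ring by ring (L0, L1, L2, ...), continuing one extra ring after the first hit, as described above. The cell size and sample points are assumptions for illustration.

```python
import math
from collections import defaultdict

CELL = 1.0  # hexagon size (centre-to-corner distance), assumed

def to_hex(x, y):
    """Map a Cartesian point to the axial (q, r) coordinates of its pointy-top hexagon."""
    q = (math.sqrt(3) / 3 * x - y / 3) / CELL
    r = (2 / 3 * y) / CELL
    cx, cz = q, r
    cy = -cx - cz
    rx, ry, rz = round(cx), round(cy), round(cz)          # cube rounding
    dx, dy, dz = abs(rx - cx), abs(ry - cy), abs(rz - cz)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return int(rx), int(rz)

def ring(center, level):
    """Axial coordinates of all hexagons exactly 'level' steps from center."""
    if level == 0:
        return [center]
    dirs = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]
    q, r = center[0] + dirs[4][0] * level, center[1] + dirs[4][1] * level
    cells = []
    for d in dirs:
        for _ in range(level):
            cells.append((q, r))
            q, r = q + d[0], r + d[1]
    return cells

def build_index(points):
    grid = defaultdict(list)
    for p in points:
        grid[to_hex(*p)].append(p)
    return grid

def nearest(grid, query, max_level=50):
    best, best_d, found_at = None, float("inf"), None
    centre = to_hex(*query)
    for level in range(max_level):
        for cell in ring(centre, level):
            for (px, py) in grid.get(cell, []):
                d = math.hypot(px - query[0], py - query[1])
                if d < best_d:
                    best, best_d = (px, py), d
        if best is not None:
            if found_at is None:
                found_at = level          # first hit at some level Lk: also scan Lk+1
            elif level > found_at:
                break
    return best, best_d

points = [(0.2, 0.1), (2.5, 1.7), (-1.3, 3.2), (4.0, -2.1)]
grid = build_index(points)
print(nearest(grid, (2.3, 1.5)))
```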

Keywords: Hexagon cells, k-nearest neighbors, nearest neighbor, pattern recognition, query pattern, virtual grid.
