Search results for: requirement analysis
26947 Urban Analysis of the Old City of Oran and Its Building after an Earthquake
Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir
Abstract:
The city of Oran, like any other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then attempts to deduce the differences between the old city before and after the earthquake. The analysis has as a specific objective to tap into the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions present on the site of the old citadel may contain elements of resistance to seismic effects. Observations of these structures in the city showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.
Keywords: earthquake, citadel, performance, traditional techniques, constructions
Procedia PDF Downloads 305
26946 Swot Analysis for Employment of Graduates of Physical Education and Sport Sciences in Iran
Authors: Mohammad Reza Boroumand Devlagh
Abstract:
The employment problem, especially for university graduates, is one of the most important challenges of the decade ahead. The purpose of this study is a SWOT analysis for the employment of graduates of physical education and sport sciences in Iran. The sample of this research consists of 115 participants (35.5 ± 8.0 years): physical education and sport sciences faculty members of higher education institutions, major sport managers, and graduates of physical education and sport sciences. The library method, interviews, and questionnaires were used to collect data. The questionnaires were made in four parts, Strengths, Weaknesses, Opportunities, and Threats, with a Cronbach's alpha coefficient of 0.94. After data collection, means, standard deviations (SD), and percentages were calculated using SPSS software. The Friedman test was used for the statistical analysis at P < 0.05. The results showed that the employment of graduates of physical education and sport sciences in Iran is located in the worst possible position (T-W area) in the Strategic Position and Action Evaluation Matrix (SPACEM): there are more weaknesses than strengths (2.02 < 2.5) in the internal evaluation and more threats than opportunities (2.36 < 2.5) in the external evaluation.
Keywords: employment, graduate, physical education and sport sciences, SWOT analysis
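The abstract reports questionnaire reliability as a Cronbach's alpha of 0.94. A minimal sketch of how that coefficient is computed from an item-score matrix follows; the response data here are randomly generated and purely hypothetical, not the study's survey.
```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (10 respondents x 4 items).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(10, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(10, 4)), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```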
Procedia PDF Downloads 539
26945 The Comparison of Joint Simulation and Estimation Methods for the Geometallurgical Modeling
Authors: Farzaneh Khorram
Abstract:
This paper endeavors to construct a block model to assess grinding energy consumption (CCE) and pinpoint blocks with the highest potential for energy usage during the grinding process within a specified region. Leveraging geostatistical techniques, particularly joint estimation or simulation based on geometallurgical data from various mineral processing stages, our objective is to forecast CCE across the study area. The dataset encompasses variables obtained from 2754 drill samples and a block model comprising 4680 blocks. The initial analysis encompassed exploratory data examination, variography, multivariate analysis, and the delineation of geological and structural units. Subsequent analysis involved the assessment of contacts between these units and the estimation of CCE via cokriging, considering its correlation with SPI. The selection of blocks exhibiting maximum CCE holds paramount importance for cost estimation, production planning, and risk mitigation. The study conducted exploratory data analysis on lithology, rock type, and failure variables, revealing seamless boundaries between geometallurgical units. Simulation methods, such as plurigaussian and turning bands, demonstrated more realistic outcomes compared to cokriging, owing to the inherent characteristics of geometallurgical data and the limitations of kriging methods.
Keywords: geometallurgy, multivariate analysis, plurigaussian, turning band method, cokriging
Procedia PDF Downloads 70
26944 Failure Cases Analysis in Petrochemical Industry
Authors: S. W. Liu, J. H. Lv, W. Z. Wang
Abstract:
In recent years, failure accidents in the petrochemical industry have been frequent and have posed great safety problems for personnel and property. The improvement of petrochemical safety is urgently required in order to prevent the recurrence of severe accidents. This study focuses on surveying the failure cases that occurred in the petrochemical field, which were extracted from engineering failure journals, including Engineering Failure Analysis and Case Studies in Engineering Failure Analysis. The relation of failure mode, failure mechanism, type of component, and type of material was analyzed in this study. The analytical results showed that failures occurred most frequently in vessels and piping among petrochemical equipment. Moreover, equipment made of carbon steel and stainless steel accounts for the majority of failures compared to other materials. This may be related to the application of the equipment and the performance of the material. In addition, corrosion failures were the most numerous among the failures of petrochemical equipment, in which stress corrosion cracking accounts for a large proportion. This may have a lot to do with the service environment of petrochemical equipment. Therefore, it can be concluded that the corrosion prevention of petrochemical equipment is particularly important.
Keywords: cases analysis, corrosion, failure, petrochemical industry
Procedia PDF Downloads 307
26943 A Thorough Analysis on The Dialog Application Replika
Authors: Weeam Abdulrahman, Gawaher Al-Madwary, Fatima Al-Ammari, Razan Mohammad
Abstract:
This research discusses the AI features in Replika, a dialog application with customizable characters that provides the user with different ways of interacting and communicating with an AI. Spreading a survey with questions on how the AI worked was one approach to exposing the app to others to use, and we also made an analysis that provides the conclusion of our research; as a result, individuals will be able to try out the app. In the methodology, we explain each page that pops up on the screen while using Replika and specify each part and icon.
Keywords: Replika, AI, artificial intelligence, dialog app
Procedia PDF Downloads 177
26942 Comprehensive Experimental Study to Determine Energy Dissipation of Nappe Flows on Stepped Chutes
Authors: Abdollah Ghasempour, Mohammad Reza Kavianpour, Majid Galoie
Abstract:
This study investigated the fundamental parameters which have an effective role in the energy dissipation of nappe flows on stepped chutes, in order to estimate an empirical relationship using dimensional analysis. To achieve this goal, a comprehensive experimental study on several large-scale physical models with various step geometries, slopes, discharges, etc. was carried out. For all models, hydraulic parameters such as velocity, pressure, water depth, and flow regime were measured precisely. The effective parameters could then be determined by analysis of the experimental data. Finally, a dimensional analysis was performed in order to estimate an empirical relationship for the evaluation of the energy dissipation of nappe flows on stepped chutes. Because large-scale physical models were used in this study, the empirical relationship is in very good agreement with the experimental results.
Keywords: nappe flow, energy dissipation, stepped chute, dimensional analysis
Procedia PDF Downloads 361
26941 Behavioral Analysis of Stock Using Selective Indicators from Fundamental and Technical Analysis
Authors: Vish Putcha, Chandrasekhar Putcha, Siva Hari
Abstract:
In the current digital era of free trading and pandemic-driven remote work culture, markets worldwide gained momentum, making it easy for retail investors to trade from anywhere. The share of retail traders rose to 24% of the market from 15% at the pre-pandemic level. Most of them are young retail traders with a higher risk tolerance than the previous generation of retail traders. This trend boosted the growth of subscription-based market predictors and market data vendors. Young traders are betting on these predictors, assuming one of them is correct. However, 90% of retail traders are on the losing end. This paper presents multiple indicators and attempts to derive behavioral patterns from the underlying stocks. The two major classes of indicators that traders and investors follow are technical and fundamental. The famous investor Warren Buffett adheres to the "value investing" method, which is based on a stock's fundamental analysis. In this paper, we present multiple indicators from various methods to understand the behavior patterns of stocks. For this research, we picked five stocks with a market capitalization of more than $200M, listed on the exchange for more than 20 years, and from different industry sectors. To study the behavioral pattern of these five stocks over time, a total of 8 indicators were chosen from fundamental, technical, and financial indicators, such as Price to Earnings (P/E), Price to Book Value (P/B), Debt to Equity (D/E), Beta, Volatility, Relative Strength Index (RSI), Moving Averages, and Dividend Yield, followed by a detailed mathematical analysis. This is an interdisciplinary paper spanning the disciplines of engineering, accounting, and finance. The research takes a new approach to identifying clear indicators affecting stocks. Statistical analysis of the data will be performed in terms of the probabilistic distribution to determine the probability of the stock price going over a specific target value. The chi-square test will be used to determine the validity of the assumed distribution. Preliminary results indicate that this approach is working well. When the complete results are presented in the final paper, they will be beneficial to the community.
Keywords: stock pattern, stock market analysis, stock predictions, trading, investing, fundamental analysis, technical analysis, quantitative trading, financial analysis, behavioral analysis
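As a rough illustration of the statistical step described above (fitting a probabilistic distribution to prices, checking the assumed distribution with a chi-square test, and estimating the probability of exceeding a target value), the sketch below uses synthetic prices; the lognormal choice, the target value, and all numbers are assumptions for demonstration, not results from the paper.
```python
import numpy as np
from scipy import stats

# Hypothetical daily closing prices for one stock (not real market data).
rng = np.random.default_rng(42)
prices = rng.lognormal(mean=np.log(150), sigma=0.08, size=500)

# Fit an assumed probability distribution (here: lognormal) to the prices.
shape, loc, scale = stats.lognorm.fit(prices, floc=0)
fitted = stats.lognorm(shape, loc=loc, scale=scale)

# Chi-square goodness-of-fit test of the assumed distribution.
n_bins = 10
edges = np.quantile(prices, np.linspace(0, 1, n_bins + 1))
observed, _ = np.histogram(prices, bins=edges)
expected = len(prices) * np.diff(fitted.cdf(edges))
chi2_stat = ((observed - expected) ** 2 / expected).sum()
dof = n_bins - 1 - 2              # bins - 1 - number of fitted parameters
p_value = stats.chi2.sf(chi2_stat, dof)

# Probability that the price exceeds a specific target value.
target = 170.0
print(f"chi-square = {chi2_stat:.2f}, p = {p_value:.3f}")
print(f"P(price > {target}) = {fitted.sf(target):.3f}")
```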
Procedia PDF Downloads 85
26940 Quantitative Assessment of Soft Tissues by Statistical Analysis of Ultrasound Backscattered Signals
Authors: Da-Ming Huang, Ya-Ting Tsai, Shyh-Hau Wang
Abstract:
Ultrasound signals backscattered from soft tissues depend mainly on the size, density, distribution, and other elastic properties of the scatterers in the interrogated sample volume. The quantitative analysis of ultrasonic backscattering is frequently implemented using a statistical approach, because backscattered signals tend to behave as random variables. Thus, statistical analysis, such as Nakagami statistics, has been applied to characterize the density and distribution of the scatterers in a sample. Yet, the accuracy of the statistical analysis can be readily affected by the received signals, which depend on the nature of the incident ultrasound wave and the acoustical properties of the samples. Thus, in the present study, efforts were made to explore the effects of the ultrasound operational modes and the attenuation of biological tissue on the estimation of the corresponding Nakagami statistical parameter (m parameter). In vitro measurements were performed on healthy and pathological fibrotic porcine livers using different single-element ultrasound transducers and duty cycles of the incident tone burst, ranging respectively from 3.5 to 7.5 MHz and from 10 to 50%. The results demonstrated that the estimated m parameter tends to be sensitively affected by the ultrasound operational mode as well as by the tissue attenuation. Healthy and pathological tissues may be characterized quantitatively by the m parameter under fixed measurement conditions and proper calibration.
Keywords: ultrasound backscattering, statistical analysis, operational mode, attenuation
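A minimal sketch of how the Nakagami m parameter can be estimated from envelope samples using the moment-based relation m = (E[R²])² / Var(R²) is given below; the simulated envelope and parameter values are hypothetical stand-ins for the in vitro measurements.
```python
import numpy as np
from scipy import stats

# Hypothetical backscattered-envelope samples drawn from a Nakagami model.
rng = np.random.default_rng(1)
true_m, omega = 0.8, 1.0                      # shape and spread parameters
envelope = stats.nakagami.rvs(true_m, scale=np.sqrt(omega),
                              size=5000, random_state=rng)

# Moment-based estimate of the Nakagami m parameter:
#   m = (E[R^2])^2 / Var(R^2)
r2 = envelope ** 2
m_hat = r2.mean() ** 2 / r2.var(ddof=1)
print(f"true m = {true_m}, estimated m = {m_hat:.3f}")
```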
Procedia PDF Downloads 323
26939 Comparison of Extended Kalman Filter and Unscented Kalman Filter for Autonomous Orbit Determination of Lagrangian Navigation Constellation
Authors: Youtao Gao, Bingyu Jin, Tanran Zhao, Bo Xu
Abstract:
The history of satellite navigation can be dated back to the 1960s. From the U.S. Transit system and the Russian Tsikada system to the modern Global Positioning System (GPS) and the Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS), the performance of satellite navigation has been greatly improved. Nowadays, the navigation accuracy and coverage of these existing systems already fully fulfill the requirements of near-Earth users, but deep space targets are still beyond the reach of these systems. Due to the renewed interest in space exploration, a novel high-precision satellite navigation system is becoming even more important. The increasing demand for such a deep space navigation system has contributed to the emergence of a variety of new constellation architectures, such as the Lunar Global Positioning System. Apart from a Walker constellation, which is similar to the one adopted by GPS on Earth, a novel constellation architecture consisting of libration point satellites in the Earth-Moon system is also available to construct the lunar navigation system, which can accordingly be called the libration point satellite navigation system. The concept of using Earth-Moon libration point satellites for lunar navigation was first proposed by Farquhar and then followed by many other researchers. Moreover, due to the special characteristics of libration point orbits, an autonomous orbit determination technique called 'Liaison navigation' can be adopted by the libration point satellites. Using only scalar satellite-to-satellite tracking data, both the orbits of the user and the libration point satellites can be determined autonomously. In this way, the extensive Earth-based tracking measurements can be eliminated, and an autonomous satellite navigation system can be developed for future space exploration missions. The method of state estimation is a non-negligible factor that impacts the orbit determination accuracy, besides the type of orbit, the initial state accuracy, and the measurement accuracy. We apply the extended Kalman filter (EKF) and the unscented Kalman filter (UKF) to determine the orbits of Lagrangian navigation satellites, and the autonomous orbit determination errors are compared. The simulation results illustrate that the UKF can improve the accuracy and the z-axis convergence to some extent.
Keywords: extended Kalman filter, autonomous orbit determination, unscented Kalman filter, navigation constellation
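To make the EKF/UKF contrast concrete, the following toy sketch applies both filters to a deliberately simple scalar system with nonlinear dynamics and measurement; it is not the Lagrangian orbit determination problem, and the models and noise levels are assumptions chosen only to show first-order linearization versus sigma-point propagation.
```python
import numpy as np

def f(x):               # process model (nonlinear)
    return x + 0.5 * np.cos(x)

def h(x):               # measurement model (nonlinear)
    return x ** 2 / 5.0

Q, R = 0.01, 0.1        # process and measurement noise variances

def ekf_step(x, P, z):
    # Predict and update using Jacobians of f and h at the current estimate.
    F = 1.0 - 0.5 * np.sin(x)
    x_pred, P_pred = f(x), F * P * F + Q
    H = 2.0 * x_pred / 5.0
    S = H * P_pred * H + R
    K = P_pred * H / S
    return x_pred + K * (z - h(x_pred)), (1.0 - K * H) * P_pred

def ukf_step(x, P, z, kappa=2.0):
    n = 1
    c = n + kappa
    w = np.array([kappa / c, 0.5 / c, 0.5 / c])       # sigma-point weights
    def sigma(m, v):
        s = np.sqrt(c * v)
        return np.array([m, m + s, m - s])
    # Unscented transform through the process model.
    sp = f(sigma(x, P))
    x_pred = w @ sp
    P_pred = w @ (sp - x_pred) ** 2 + Q
    # Unscented transform through the measurement model.
    chi = sigma(x_pred, P_pred)
    sz = h(chi)
    z_pred = w @ sz
    Pzz = w @ (sz - z_pred) ** 2 + R
    Pxz = w @ ((chi - x_pred) * (sz - z_pred))
    K = Pxz / Pzz
    return x_pred + K * (z - z_pred), P_pred - K * Pzz * K

rng = np.random.default_rng(7)
x_true, x_e, P_e, x_u, P_u = 1.0, 0.5, 1.0, 0.5, 1.0
err_e, err_u = [], []
for _ in range(200):
    x_true = f(x_true) + rng.normal(0, np.sqrt(Q))
    z = h(x_true) + rng.normal(0, np.sqrt(R))
    x_e, P_e = ekf_step(x_e, P_e, z)
    x_u, P_u = ukf_step(x_u, P_u, z)
    err_e.append((x_e - x_true) ** 2)
    err_u.append((x_u - x_true) ** 2)
print(f"EKF RMSE = {np.sqrt(np.mean(err_e)):.4f}")
print(f"UKF RMSE = {np.sqrt(np.mean(err_u)):.4f}")
```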
Procedia PDF Downloads 285
26938 Dynamic Analysis of Differential Systems with Infinite Memory and Damping
Authors: Kun-Peng Jin, Jin Liang, Ti-Jun Xiao
Abstract:
In this work, we are concerned with the dynamic behavior of solutions to some coupled systems with infinite memory, which consist of two partial differential equations where only one partial differential equation has damping. Such coupled systems are good mathematical models to describe the deformation and stress characteristics of some viscoelastic materials affected by temperature change, external forces, and other factors. By using the theory of operator semigroups, we give well-posedness results for the Cauchy problem for these coupled systems. Then, with the help of some auxiliary functions and lemmas, which are specially designed for overcoming difficulties in the proof, we show that the solutions of the coupled systems decay to zero in a strong way under a few basic conditions. The results in this dynamic analysis of coupled systems are generalizations of many existing results.
Keywords: dynamic analysis, coupled system, infinite memory, damping
Procedia PDF Downloads 221
26937 To Investigate a Discharge Planning Connect with Long Term Care 2.0 Program in a Medical Center in Taiwan
Authors: Chan Hui-Ya, Ding Shin-Tan
Abstract:
Background and Aim: Discharge planning is considered helpful for reducing the hospital length of stay and the readmission rate, thereby increasing satisfaction with healthcare for patients and professionals. In order to decrease the waiting time for long-term care and boost the care quality of patients after discharge from the hospital, the Ministry of Health and Welfare in Taiwan initiated the program 'discharge planning connects with long-term care 2.0 services' in 2017. The purpose of this study is to investigate the outcome of the pilot of this program in a medical center. Methods: By purposive sampling, the study chose five wards in a medical center as pilot units. The researchers compared the beds of service, the numbers of cases transferred to the long-term care center, and the monthly transfer rates between the pilot units and the other units, and analyzed the basic data, the long-term care service needs, and the approved service items of the cases transferred to the long-term care center in the pilot units. Results: From June to September 2017, a total of 92 referrals were made, and 51 patients were enrolled in the pilot program. There is a significant difference in the transfer rate between the pilot units and the other units (χ = 702.6683, p < 0.001). Only 20 cases (a 39.2% success rate) were approved to receive some of the service items of long-term care in the pilot units. The most frequently approved item was the respite care service (n = 13; 65%), although it ranked third among the needs on the service lists during the service-linking process. Among the reasons given by patients who cancelled the request, 38.71% were related to services that could not match the patients' needs and expectations. Conclusion: The results indicate a requirement to modify the long-term care services to fit the needs of cases. The researchers suggest estimating the potential cases by screening data from hospital informatics systems and hiring more case managers according to the service time of potential cases. Meanwhile, strategies such as shortening the assessment scale and authorizing hospital case managers to approve some items of long-term care should be considered.
Keywords: discharge planning, long-term care, case manager, patient care
Procedia PDF Downloads 286
26936 Multi-Source Data Fusion for Urban Comprehensive Management
Authors: Bolin Hua
Abstract:
City governance involves various data, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated from various systems differ in form, and their sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources collected in different ways raise several issues which need to be resolved. Problems of data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. Metadata needs to be referenced and read when an application needs to access, manipulate, and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the processes of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search, and data delivery.
Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data
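As a very small illustration of the fusion idea (joining records from different source systems on a shared key, resolving update timestamps, and recording data lineage), a hedged pandas sketch follows; the tables, columns, and component identifiers are invented for demonstration.
```python
import pandas as pd

# Two hypothetical source systems describing the same city components.
housing = pd.DataFrame({
    "component_id": ["C-001", "C-002", "C-003"],
    "address": ["1 Elm St", "2 Oak St", "3 Pine St"],
    "updated": pd.to_datetime(["2024-01-10", "2024-02-01", "2024-01-20"]),
})
business = pd.DataFrame({
    "component_id": ["C-002", "C-003", "C-004"],
    "business_count": [12, 3, 7],
    "updated": pd.to_datetime(["2024-03-05", "2024-01-15", "2024-02-11"]),
})

# Simple fusion into one central table: outer-join on the shared key,
# keep the most recent update timestamp, and record the data lineage.
fused = housing.merge(business, on="component_id", how="outer",
                      suffixes=("_housing", "_business"))
fused["last_updated"] = fused[["updated_housing", "updated_business"]].max(axis=1)
fused["sources"] = [
    ",".join(s for s, ok in (("housing", h), ("business", b)) if ok)
    for h, b in zip(fused["updated_housing"].notna(), fused["updated_business"].notna())
]
print(fused[["component_id", "address", "business_count", "last_updated", "sources"]])
```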
Procedia PDF Downloads 393
26935 The Impact of Cognitive Load on Deceit Detection and Memory Recall in Children’s Interviews: A Meta-Analysis
Authors: Sevilay Çankaya
Abstract:
The detection of deception in children's interviews is essential for establishing the veracity of statements. A widely used method for deception detection is building cognitive load, which is the logic of the cognitive interview (CI), and its effectiveness for adults has been established. This meta-analysis delves into the effectiveness of inducing cognitive load as a means of enhancing veracity detection during interviews with children. Additionally, the effectiveness of cognitive load on the total number of events recalled by children is assessed as a second part of the analysis. The current meta-analysis includes ten effect sizes from a database search. For the effect size calculation, Hedges' g was used with a random-effects model in CMA version 2. A heterogeneity analysis was conducted to detect potential moderators. The overall result indicated that cognitive load had no significant effect on veracity outcomes (g = 0.052, 95% CI [-.006, 1.25]). However, a high level of heterogeneity was found (I² = 92%). Age, participants' characteristics, interview setting, and characteristics of the interviewer were coded as possible moderators to explain the variance. Age was a significant moderator (β = .021; p = .03, R² = 75%), but the analysis did not reveal statistically significant effects for the other potential moderators: participants' characteristics (Q = 0.106, df = 1, p = .744), interview setting (Q = 2.04, df = 1, p = .154), and characteristics of the interviewer (Q = 2.96, df = 1, p = .086). For the second outcome, the total number of events recalled, the overall effect was significant (g = 4.121, 95% CI [2.256, 5.985]); cognitive load was effective for the total number of recalled events when interviewing children. All in all, while age plays a crucial role in determining the impact of cognitive load on veracity, the surrounding context, interviewer attributes, and inherent participant traits may not significantly alter the relationship. These findings throw light on the need for more focused, age-specific methods when using cognitive load measures. With more studies in this field, it may be possible to improve the precision and dependability of deceit detection in children's interviews.
Keywords: deceit detection, cognitive load, memory recall, children interviews, meta-analysis
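A minimal sketch of the quantities named above, Hedges' g per study and DerSimonian-Laird random-effects pooling with Q and I², is given below; the study-level effects and variances are hypothetical, not the ten effect sizes in this meta-analysis.
```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g (bias-corrected standardized mean difference) and its variance."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    return j * d, j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))

# Example for one hypothetical study (cognitive-load group vs control group).
print("single study g, var:", hedges_g(6.1, 2.0, 30, 5.8, 2.2, 28))

# Hypothetical study-level effects and variances (not the ten real effect sizes).
g = np.array([0.10, -0.05, 0.30, 0.02, 0.15, -0.10, 0.25, 0.05, 0.00, 0.20])
v = np.array([0.04, 0.05, 0.06, 0.03, 0.04, 0.05, 0.07, 0.04, 0.03, 0.06])

# DerSimonian-Laird random-effects pooling with heterogeneity statistics Q and I^2.
w = 1 / v
q = np.sum(w * (g - np.sum(w * g) / w.sum()) ** 2)
df = len(g) - 1
tau2 = max(0.0, (q - df) / (w.sum() - (w**2).sum() / w.sum()))
w_re = 1 / (v + tau2)
g_pooled = np.sum(w_re * g) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
i2 = max(0.0, (q - df) / q) * 100
print(f"pooled g = {g_pooled:.3f}, 95% CI [{g_pooled - 1.96*se:.3f}, {g_pooled + 1.96*se:.3f}]")
print(f"Q = {q:.2f}, I^2 = {i2:.0f}%")
```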
Procedia PDF Downloads 55
26934 Youths’ Analysis and Evaluation of Characters’ Behavior: A Case Study of a Stage Play, Kaki, at Faculty of Liberal Arts, Prince of Songkhla University
Authors: Montri Meenium
Abstract:
The purpose of this research was to examine youths' analysis and evaluation of three protagonists, one female and two males, involved in a sexual relationship in the stage play 'Kaki' held by the Faculty of Liberal Arts, Prince of Songkla University. The interviews were conducted with 10 youths in the production team and 10 audience youths, totalling 20. The findings, which are presented in the form of a descriptive analysis, showed that all the 10 youths in the production team and the 10 audience youths did not accept the behaviour of the protagonists: the female who committed adultery and the males who were corrupted by power, had a sexual relationship with a married woman, and deceived people. The youths, however, understood that such behaviour resulted from being overpowered by human passion, especially infatuation, which was in accordance with the theme of the play. It was suggested that the story interweaves ideologies or points of view that defy morals and ethics, prompting questions to be asked. Hence, the stage play can be used as an instrument to develop critical thinking in youths.
Keywords: descriptive analysis, protagonists, youths, stage-play
Procedia PDF Downloads 253
26933 Identifying the Determinants of the Shariah Non-Compliance Risk via Principal Axis Factoring
Authors: Muhammad Arzim Naim, Saiful Azhar Rosly, Mohamad Sahari Nordin
Abstract:
The objective of this study is to investigate the factors affecting the rise of the Shariah non-compliance risk that can cause Islamic banks to succumb to monetary loss. Prior literature has never analyzed such risk in detail, despite much of it arguing about the validity of some Shariah-compliant products. The Shariah non-compliance risk in this context refers to the potential failure of the facility to withstand a court test, for instance, if the bank brings it to court to seek compensation from defaulting clients. The risk may also arise if the customers refuse to make the financing payments on the grounds of the validity of the contracts, for example, when a critical requirement of an Islamic contract, such as ownership, is relinquished; this risk may lead the bank to suffer a loss when the customer invalidates the contract through the court. The impact of the Shariah non-compliance risk on Islamic banks is similar to that of the legal risks faced by conventional banks: both result in monetary losses to the banks. In the conventional banking environment, losses can take the form of compensation paid to customers if they win the case, which can be a very large amount. However, for Islamic banks, the subsequent impact can be even greater because it affects their reputation: if the customers do not perceive them to be Shariah compliant, they will take their money and bank it elsewhere. This paper provides new insights into the risks faced by credit-intensive Islamic banks and extends the knowledge of the Shariah non-compliance risk by identifying the individual components that directly affect the risk, together with empirical evidence. Beyond the Islamic banking fraternity, regulators and policy makers should be able to use the findings in this paper to evaluate the components of the Shariah non-compliance risk and take the necessary actions. The paper is written based on Malaysia's Islamic banking practices, which may not be directly related to other jurisdictions. Even though the focus of this study is on the Bay Bithaman Ajil, popularly known as BBA (i.e., sale with deferred payments), financing modality, the results from this study may be applicable to other Islamic financing vehicles.
Keywords: Islamic banking, Islamic finance, Shariah non-compliance risk, Bay Bithaman Ajil (BBA), principal axis factoring
Procedia PDF Downloads 302
26932 Towards Efficient Reasoning about Families of Class Diagrams Using Union Models
Authors: Tejush Badal, Sanaa Alwidian
Abstract:
Class diagrams are useful tools within the Unified Modelling Language (UML) to model and visualize the relationships between, and properties of, objects within a system. As a system evolves over time and space (e.g., across products), a series of models with several commonalities and variabilities creates what is known as a model family. In circumstances where there are several versions of a model, examining each model individually becomes expensive in terms of computation resources. To avoid performing redundant operations, this paper proposes an approach for representing a family of class diagrams as a union model, i.e., representing the model family with a single generic model. The paper aims to analyze and reason about a family of class diagrams using union models, as opposed to an individual analysis of each member model in the family. The union algorithm provides a holistic view of the model family that cannot otherwise be obtained from an individual analysis approach; this, in turn, enhances the analysis by speeding up the time needed to analyze a family of models together, as opposed to analyzing individual models one at a time.
Keywords: analysis, class diagram, model family, unified modeling language, union model
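A rough sketch of the union-model idea (not the authors' algorithm) is shown below: two hypothetical class-diagram versions are merged into one structure in which every class and attribute is annotated with the member models it appears in, so the whole family can be analysed in a single pass.
```python
from collections import defaultdict

# Each class diagram is modelled (very roughly) as {class: {attributes}}.
# These two versions are hypothetical, not taken from the paper.
model_v1 = {"Order": {"id", "date"}, "Customer": {"id", "name"}}
model_v2 = {"Order": {"id", "date", "status"}, "Invoice": {"id", "amount"}}

def union_model(models):
    """Merge a family of class diagrams into one annotated union model.

    Every class/attribute is kept once and annotated with the set of member
    models in which it appears, so common and variable parts are explicit.
    """
    union = defaultdict(lambda: defaultdict(set))
    for name, model in models.items():
        for cls, attrs in model.items():
            for attr in attrs:
                union[cls][attr].add(name)
    return union

u = union_model({"v1": model_v1, "v2": model_v2})
for cls, attrs in u.items():
    for attr, present_in in attrs.items():
        kind = "common" if len(present_in) == 2 else "variable"
        print(f"{cls}.{attr}: {kind} (in {sorted(present_in)})")
```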
Procedia PDF Downloads 74
26931 A Nexus between Financial Development and Its Determinants: A Panel Data Analysis from a Global Perspective
Authors: Bilal Ashraf, Qianxiao Zhang
Abstract:
This study empirically investigated the linkage between financial development and its important determinants, such as information and communication technology, natural resource rents, economic growth, current account balance, and gross savings, in 107 economies. The paper employed second-generation unit root tests to handle the issues of slope heterogeneity and cross-sectional dependence in panel data. The Kao, Pedroni, and Westerlund tests confirm the long-run relationships among the variables under study, while the significant findings of the cross-sectionally augmented autoregressive distributed lag (CS-ARDL) model reveal that NRR, CAB, and S negatively affect financial development, while ICT and EG stimulate the process of FD. Further, the application of FGLS in the robustness analysis supports the appropriateness and applicability of CS-ARDL. Finally, the findings of the DH causality analysis endorse bidirectional causality linkages amongst the research factors. Based on the study's outcomes, we offer some policy suggestions that empower the process of financial development globally.
Keywords: determinants of financial developments, CS-ARDL, financial development, global sample, causality analysis
Procedia PDF Downloads 60
26930 A Bibliometric Analysis of the Structural Equation Modeling in Education
Authors: Lim Yi Wei
Abstract:
Structural equation modelling (SEM) is well known in statistics due to its flexibility and accessibility, and it plays an increasingly important role in the development of the education field. The number of research publications using SEM in education has increased in recent decades. However, there is a lack of scientific reviews conducted on SEM in education. The purpose of this study is to investigate research trends related to SEM in education. The researcher will use VOSviewer, Datawrapper, and SciMAT to conduct a bibliometric analysis of 5549 papers published in the Scopus database in the last five years. The results will show the publication trends of the most cited documents and the top contributing authors, countries, institutions, and journals in the research field, and will also show how they relate to each other in terms of co-citation, collaboration, and co-occurrence of keywords. This study will benefit researchers and practitioners by identifying research trends and the current state of SEM in education.
Keywords: structural equation modeling, education, bibliometric analysis, Vosviewer
Procedia PDF Downloads 100
26929 The LMPA/Epoxy Mixture Encapsulation of OLED on Polyimide Substrate
Authors: Chuyi Ye, Minsang Kim, Cheol-Hee Moon
Abstract:
The organic light emitting diode (OLED) is a potential organic optical functional material which is considered the next-generation display technology, with advantages such as an all-solid-state structure, ultra-thin thickness, active luminescence, and flexibility. Due to the development of polymer-inorganic substrates, it has become possible to achieve flexible OLED displays. However, the organic light-emitting material is very sensitive to oxygen and water vapor, and the encapsulation requires a water vapor transmission rate (WVTR) and an oxygen transmission rate (OTR) as low as 10⁻⁶ g/(m²·d) and 10⁻⁵ cm³/(m²·d), respectively. At present, these rigorous WVTR and OTR requirements have restricted the application of OLED displays. Traditional epoxy/getter or glass frit approaches, which have been widely applied to glass-substrate-based devices, are not suitable for transparent flexible organic devices, and mechanically flexible thin-film approaches are required. To ensure the OLED's lifetime, the encapsulation material of the OLED package is very important. In this paper, a low melting point alloy (LMPA)-epoxy mixture is introduced into the encapsulation process. A phase separation occurs when the mixture is heated to the melting point of the LMPA, forming a double-line structure between the two substrates: the alloy barrier has an extremely low WVTR and OTR, and the epoxy fills potential tiny cracks. In our experiment, a PI film is chosen as the flexible transparent substrate, and Mo and Cu are deposited on the PI film successively. The two metal layers are then photolithographically patterned into the sealing line. The Mo acts as a transition layer between the PI film and the Cu, while the Cu has good wettability with the LMPA (Sn-58Bi). Finally, the LMPA layer is printed on the pattern and a voltage is applied; the generated Joule heat melts the LMPA, forming the double-line structure and sealing the OLED package at the same time. In this research, the double-line encapsulating structure of LMPA and epoxy on the PI film is manufactured for flexible OLED encapsulation, and it is investigated whether the encapsulation satisfies the WVTR and OTR requirements for flexible OLEDs.
Keywords: encapsulation, flexible, low melting point alloy, OLED
Procedia PDF Downloads 599
26928 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach
Authors: Eric Mxolisi Mkhondo
Abstract:
Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to the naked eye. They also tend to cause unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in the system is tolerance analysis. This is a quantitative tool for predicting the tolerance variations which are defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts due to the effect of environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system's specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary the components' geometries and sizes while conforming to functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to the effects of operating temperatures. The method is used to evaluate the nominal conditions and the worst-case conditions at the maximum and minimum dimensions of the assembled components. These three conditions are evaluated at specific operating temperatures (-40°C, -18°C, 4°C, 26°C, 48°C, and 70°C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.
Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism
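A minimal sketch of a worst-case tolerance stack-up that also includes thermal expansion over the listed operating temperatures is given below; the two-part shaft/housing stack, dimensions, tolerances, and expansion coefficients are assumptions for illustration, not the zoom lens case study.
```python
from itertools import product

# Hypothetical two-part stack in a miniature drive: a shaft inside a housing.
# Each part: (nominal length mm, tolerance mm, thermal expansion coeff 1/degC).
parts = {
    "housing": (20.000, 0.020, 23e-6),   # contributes +length to the gap
    "shaft":   (19.950, 0.015, 12e-6),   # contributes -length to the gap
}
signs = {"housing": +1, "shaft": -1}
T_REF = 20.0                             # reference (assembly) temperature, degC

def gap(lengths, temperature):
    """Clearance between housing and shaft at a given temperature."""
    total = 0.0
    for name, length in lengths.items():
        alpha = parts[name][2]
        total += signs[name] * length * (1 + alpha * (temperature - T_REF))
    return total

for temp in (-40, -18, 4, 26, 48, 70):
    # Evaluate nominal plus every corner of the tolerance limits (worst case).
    corners = []
    for combo in product((-1, 0, +1), repeat=len(parts)):
        lengths = {name: parts[name][0] + c * parts[name][1]
                   for (name, _), c in zip(parts.items(), combo)}
        corners.append(gap(lengths, temp))
    nominal = gap({n: p[0] for n, p in parts.items()}, temp)
    print(f"{temp:>4} degC: nominal gap = {nominal:.4f} mm, "
          f"worst-case range = [{min(corners):.4f}, {max(corners):.4f}] mm")
```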
Procedia PDF Downloads 165
26927 Governance Challenges for the Management of Water Resources in Agriculture: The Italian Way
Authors: Silvia Baralla, Raffaella Zucaro, Romina Lorenzetti
Abstract:
Water management needs to cope with economic, societal, and environmental changes. This can be guaranteed through 'shifting from government to governance'. In the last decades, this shift was applied in Europe through and within important legislative pillars (the Water Framework Directive and the Common Agricultural Policy) and their measures focused on resilience and adaptation to climate change, with particular attention to the creation of synergies among policies and all the actors involved at different levels. Within the climate change context, the agricultural sector can play, through sustainable water management, a leading role for climate-resilient growth and environmental integrity. A recent analysis of the water management governance of different countries identified some common gaps dealing with administration, policy, information, capacity building, funding, objectives, and accountability. The ability of a country to fill these gaps is an essential requirement for making some of the changes requested by Europe, in particular the improvement of agro-ecosystem resilience to the effects of climate change, the support of green and digital transitions, and sustainable water use. This research aims to contribute by sharing examples of water governance and the related advantages that are useful for filling the highlighted gaps. Italy has developed a strong and exhaustive model of water governance in order to react with strategic and synergic actions, since it is one of the European countries most threatened by climate change and its extreme events (droughts, floods). In particular, the Italian water governance model was able to overcome several gaps, specifically concerning water use in agriculture, by adopting strategies such as a systemic/integrated approach, stakeholder engagement, capacity building, the improvement of planning and monitoring ability, and an adaptive/resilient strategy for funding activities. These were carried out by putting in place regulatory, structural, and management actions. Regulatory actions include both the institution of technical committees grouping together water decision-makers and the elaboration of operative manuals and guidelines by means of a participative and cross-cutting approach. Structural actions deal with the funding of interventions within European and national funds according to the principles of coherence and complementarity. Finally, management actions concern the introduction of operational tools to support decision-makers in order to improve planning and monitoring ability. In particular, two cross-functional and interoperable web databases were introduced: SIGRIAN (National Information System for Water Resources Management in Agriculture) and DANIA (National Database of Investments for Irrigation and the Environment). Their interconnection makes it possible to support sustainable investments, taking into account compliance with the irrigation volumes quantified in SIGRIAN, ensuring a high level of attention to water saving, and monitoring the efficiency of funding. The main positive results of the Italian water governance model are synergic and coordinated work at the national, regional, and local levels among institutions, transparency on water use in agriculture, a deeper understanding on the stakeholders' side of the importance of their roles and of their own potential benefits, and the capacity to guarantee continuity to this model through a sensitization process and the combined use of management operational tools.
Keywords: agricultural sustainability, governance model, water management, water policies
Procedia PDF Downloads 117
26926 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis
Authors: Amir Hajian, Sepehr Damavandinejadmonfared
Abstract:
In this paper, the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel Principal Component Analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions which can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector. This aspect is of importance especially when it comes to real-world applications and the usage of such algorithms, as a fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and extract the features from them. A classifier is then applied to classify the data and make the final decision. We analyze KPCA (with polynomial, Gaussian, and Laplacian kernels) in detail in this paper and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
Keywords: biometrics, finger vein recognition, principal component analysis (PCA), kernel principal component analysis (KPCA)
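A hedged sketch of the kind of kernel/dimension sweep described above follows, using scikit-learn's KernelPCA; since finger vein data are not public, the digits dataset stands in, the Laplacian kernel is supplied as a precomputed matrix (it is not a built-in KernelPCA kernel), and the kernel parameters and candidate dimensions are arbitrary choices.
```python
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import laplacian_kernel
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

# Stand-in data: the digits set replaces finger vein features, which are not public.
X, y = load_digits(return_X_y=True)
Xs = StandardScaler().fit_transform(X)

kernels = {
    "polynomial": dict(kernel="poly", degree=3),
    "gaussian":   dict(kernel="rbf", gamma=1e-3),
    # Laplacian is not a built-in KernelPCA kernel, so pass it precomputed.
    "laplacian":  dict(kernel="precomputed"),
}

for name, params in kernels.items():
    for n_components in (10, 20, 40):            # candidate feature dimensions
        kpca = KernelPCA(n_components=n_components, **params)
        data = laplacian_kernel(Xs, Xs, gamma=1e-3) if name == "laplacian" else Xs
        # For brevity the projection is fitted on the full set before CV.
        features = kpca.fit_transform(data)
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=3),
                              features, y, cv=5).mean()
        print(f"{name:>10} kernel, dim={n_components:>2}: accuracy = {acc:.3f}")
```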
Procedia PDF Downloads 365
26925 Analysis of Bank Characteristics in a Hydrogen Refueling Station
Authors: Bo Hyun Kim, Sarng Woo Karng
Abstract:
In constructing a hydrogen refueling station, minimizing the volume and reducing the number of banks lessen the construction cost. This study aims at performing a dynamic simulation of a 250 kg/day refueling station for light-duty vehicles. The primary compressor boosts hydrogen from a 250 bar tube trailer to 480 bar and stores it in a medium-pressure bank. Additional compression of the hydrogen from 480 to 900 bar is then carried out, and it is stored in a high-pressure bank. An economic analysis was conducted considering the amount of electricity consumed by compression, corresponding to the volume and the number of banks (cascade system) in charging mode. NIST REFPROP was selected as the equation of state in ASPEN HYSYS for the thermodynamic analysis of the tube trailer, the compressors, the chillers, and the banks. Compared to a single 3000 L high-pressure bank system, the volume of the cascade high-pressure banks (bank 1: 250 L and bank 2: 1850 L) was reduced by 30%, and the power consumption of the chiller for precooling was also decreased by 16%.
Keywords: light-duty vehicles, economic analysis, cascade system, hydrogen refueling station
Procedia PDF Downloads 93
26924 Classifying and Analysis 8-Bit to 8-Bit S-Boxes Characteristic Using S-Box Evaluation Characteristic
Authors: Muhammad Luqman, Yusuf Kurniawan
Abstract:
The S-Box is one of the non-linear parts of a cryptographic algorithm; its presence in the algorithm is needed to maintain the non-linearity of the algorithm. Nowadays, modern cryptographic algorithms use an S-Box as part of the algorithm process. Despite the fact that several cryptographic algorithms today reuse theoretically secure and carefully constructed S-Boxes, there are evaluation characteristics that can measure the security properties of S-Boxes and hence of the corresponding primitives. Analysis of an S-Box is usually done using manual mathematical calculation. Several S-Boxes are presented as a truth table without any underlying mathematical algorithm, and it is then rather difficult to determine the strength of a truth-table S-Box without a mathematical algorithm. A comprehensive analysis should be applied to the truth-table S-Box to determine its characteristics. Several important characteristics should be possessed by S-Boxes: nonlinearity, balancedness, algebraic degree, LAT, DAT, differential delta uniformity, correlation immunity, and the global avalanche criterion. A comprehensive tool is then presented to automatically calculate the characteristics of S-Boxes and determine the strength of an S-Box. The comprehensive analysis is done as a deterministic process to produce a sequence of S-Box characteristics and give advice for better S-Box construction.
Keywords: cryptographic properties, Truth Table S-Boxes, S-Boxes characteristic, deterministic process
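As a small illustration, the sketch below computes two of the listed characteristics, balancedness and differential uniformity, directly from a truth table; the S-Box itself is a randomly generated 8-bit permutation, not one taken from a real cipher.
```python
import random

def balancedness(sbox):
    """A bijective 8-bit S-Box maps 256 inputs onto 256 distinct outputs."""
    return len(set(sbox)) == len(sbox)

def differential_uniformity(sbox):
    """Maximum entry of the difference distribution table (excluding dx = 0)."""
    n = len(sbox)
    worst = 0
    for dx in range(1, n):
        counts = [0] * n
        for x in range(n):
            counts[sbox[x] ^ sbox[x ^ dx]] += 1
        worst = max(worst, max(counts))
    return worst

# Hypothetical S-Box: a random 8-bit permutation given as a truth table.
random.seed(0)
sbox = list(range(256))
random.shuffle(sbox)

print("balanced:", balancedness(sbox))
print("differential uniformity:", differential_uniformity(sbox))
```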
Procedia PDF Downloads 363
26923 Evaluation of the Surface Water Quality Using the Water Quality Index and Discriminant Analysis Method
Authors: Lazhar Belkhiri, Ammar Tiri, Lotfi Mouni
Abstract:
The protection and management of water quality present a very important problem for water resources worldwide, given the complexity of water quality data sets. In this study, the water quality index (WQI) and the irrigation water quality index (IWQI) were calculated in order to evaluate the surface water quality for drinking and irrigation purposes based on nine hydrochemical parameters. In order to separate the variables that are the most responsible for the spatial differentiation, discriminant analysis (DA) was applied. The results show that the surface water quality for drinking ranges from poor to very poor based on the WQI values; however, the IWQI values reflect that this water is acceptable for irrigation, with a restriction for sensitive plants. The discriminant analysis (DA) method has shown that the parameters pH, potassium, chloride, sulfate, and bicarbonate provide significant discrimination between the different stations with respect to the spatial variation of the surface water quality. Therefore, the results obtained in this study provide very useful information to decision-makers.
Keywords: surface water quality, drinking and irrigation purposes, water quality index, discriminant analysis
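A minimal sketch of a weighted arithmetic WQI calculation for a single station is given below; the parameter set, measured values, standards, and the classification thresholds in the last line are common textbook choices used here as assumptions, not the study's nine parameters or results.
```python
# Weighted arithmetic WQI for one sampling station. The measured values,
# standards, and ideal values below are hypothetical, not the study's data.
parameters = {
    # name: (measured, permissible standard S_i, ideal value V_ideal)
    "pH":          (8.1,  8.5,  7.0),
    "HCO3 (mg/L)": (250, 120,   0.0),
    "Cl (mg/L)":   (310, 250,   0.0),
    "SO4 (mg/L)":  (280, 250,   0.0),
    "K (mg/L)":    (14,   12,   0.0),
}

k = 1.0 / sum(1.0 / s for _, s, _ in parameters.values())  # proportionality constant
num, den = 0.0, 0.0
for name, (measured, standard, ideal) in parameters.items():
    w = k / standard                                       # unit weight
    q = 100.0 * (measured - ideal) / (standard - ideal)    # quality rating
    num += w * q
    den += w
wqi = num / den
print(f"WQI = {wqi:.1f}")
# Classification thresholds commonly used with the weighted arithmetic WQI.
print("class:", "excellent" if wqi < 50 else "good" if wqi < 100 else
      "poor" if wqi < 200 else "very poor" if wqi < 300 else "unsuitable")
```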
Procedia PDF Downloads 86
26922 The Strengths and Limitations of the Statistical Modeling of Complex Social Phenomenon: Focusing on SEM, Path Analysis, or Multiple Regression Models
Authors: Jihye Jeon
Abstract:
This paper analyzes the conceptual frameworks of three statistical methods: multiple regression, path analysis, and structural equation models. When establishing a research model for the statistical modeling of complex social phenomena, it is important to know the strengths and limitations of these three statistical models. This study explored the character, strengths, and limitations of each modeling approach and suggested some strategies for accurately explaining or predicting the causal relationships among variables. In particular, the common mistakes of research modeling in studies of depression or mental health were discussed.
Keywords: multiple regression, path analysis, structural equation models, statistical modeling, social and psychological phenomenon
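To make the contrast concrete, the sketch below fits the same hypothetical mediation-style data once as a multiple regression and once as a two-equation path model, so the direct and indirect effects can be separated; the variable names and coefficients are invented for illustration.
```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data for a simple mediation-style path model
#   stress -> coping -> depression, plus a direct path stress -> depression.
rng = np.random.default_rng(3)
n = 500
stress = rng.normal(size=n)
coping = -0.5 * stress + rng.normal(scale=0.8, size=n)
depression = 0.4 * stress - 0.6 * coping + rng.normal(scale=0.7, size=n)

# Multiple regression estimates only the direct effects on the outcome.
b_direct = LinearRegression().fit(np.column_stack([stress, coping]), depression).coef_

# Path analysis chains two regressions, so the indirect effect via coping
# (a * b) can be separated from the direct effect of stress.
a = LinearRegression().fit(stress.reshape(-1, 1), coping).coef_[0]
b = b_direct[1]
print(f"direct effect of stress   : {b_direct[0]:.2f}")
print(f"indirect effect via coping: {a * b:.2f}")
print(f"total effect              : {b_direct[0] + a * b:.2f}")
```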
Procedia PDF Downloads 653
26921 Quantification of Site Nonlinearity Based on HHT Analysis of Seismic Recordings
Authors: Ruichong Zhang
Abstract:
This study proposes a recording-based approach to characterize and quantify earthquake-induced site nonlinearity, exemplified as soil nonlinearity and/or liquefaction. As an alternative to Fourier spectral analysis (FSA), the paper introduces time-frequency analysis of earthquake ground motion recordings with the aid of the so-called Hilbert-Huang transform (HHT) and offers justification for the HHT in addressing the nonlinear features shown in the recordings. Using the 2001 Nisqually earthquake recordings, this study shows that the proposed approach is effective in characterizing site nonlinearity and quantifying its influence on seismic ground responses.
Keywords: site nonlinearity, site amplification, site damping, Hilbert-Huang Transform (HHT), liquefaction, 2001 Nisqually Earthquake
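A minimal sketch of the Hilbert spectral part of the HHT, extracting instantaneous amplitude and frequency from a single mono-component signal, is given below; in a full HHT the record would first be decomposed into intrinsic mode functions by empirical mode decomposition, which is omitted here, and the synthetic signal is only a stand-in for the Nisqually recordings.
```python
import numpy as np
from scipy.signal import hilbert

# Hypothetical ground-motion-like signal whose frequency drifts downward in
# time, mimicking the kind of nonlinear site response the HHT is meant to expose.
fs = 100.0                                      # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
signal = np.sin(2 * np.pi * (3.0 - 0.05 * t) * t) * np.exp(-0.05 * t)

# Hilbert spectral analysis of one mono-component signal.
analytic = hilbert(signal)
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency, Hz

print(f"mean instantaneous frequency: {inst_freq.mean():.2f} Hz")
print(f"frequency at start vs end: {inst_freq[:100].mean():.2f} Hz "
      f"-> {inst_freq[-100:].mean():.2f} Hz")
```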
Procedia PDF Downloads 487
26920 Quantitative Structure Activity Relationship and In Silico Docking of Substituted 1,3,4-Oxadiazole Derivatives as Potential Glucosamine-6-Phosphate Synthase Inhibitors
Authors: Suman Bala, Sunil Kamboj, Vipin Saini
Abstract:
A Quantitative Structure Activity Relationship (QSAR) analysis has been developed to relate the antifungal activity of novel substituted 1,3,4-oxadiazoles against Candida albicans and Aspergillus niger using computer-assisted multiple regression analysis. The study has shown a good relationship between the antifungal activities and various descriptors, established by multiple regression analysis. The analysis showed statistically significant correlations, with R² values of 0.932 and 0.782 against Candida albicans and Aspergillus niger, respectively. These derivatives were further subjected to molecular docking studies to investigate the interactions between the target compounds and the amino acid residues present in the active site of glucosamine-6-phosphate synthase. All the synthesized compounds have better docking scores compared to the standard, fluconazole. Our results could be used for the further design and development of optimal and potent antifungal agents.
Keywords: 1,3,4-oxadiazole, QSAR, multiple linear regression, docking, glucosamine-6-phosphate synthase
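A hedged sketch of the multiple-regression step (relating activity to descriptors and reporting R²) follows; the descriptor table and activity values are invented placeholders, not the paper's oxadiazole data set.
```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical QSAR table: rows are oxadiazole derivatives, columns are
# molecular descriptors (e.g. logP, molar refractivity, dipole moment);
# y is the measured antifungal activity (e.g. pMIC). Values are illustrative.
X = np.array([
    [2.1, 45.2, 3.1],
    [2.8, 48.9, 2.7],
    [3.4, 52.3, 2.2],
    [1.9, 44.0, 3.6],
    [2.5, 47.1, 2.9],
    [3.1, 50.8, 2.4],
    [3.7, 54.0, 2.0],
    [2.3, 46.0, 3.3],
])
y = np.array([4.2, 4.6, 5.1, 4.0, 4.4, 4.9, 5.3, 4.3])

model = LinearRegression().fit(X, y)
print("coefficients:", np.round(model.coef_, 3), "intercept:", round(model.intercept_, 3))
print("R^2 =", round(r2_score(y, model.predict(X)), 3))
```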
Procedia PDF Downloads 341
26919 A Study on Characteristics of Hedonic Price Models in Korea Based on Meta-Regression Analysis
Authors: Minseo Jo
Abstract:
The purpose of this paper is to examine the factors in hedonic price models that have a significant impact in determining the prices of apartments. Many variables are employed in hedonic price models, and their effectiveness varies according to the researchers and the regions being analysed. In order to consider various conditions, meta-regression analysis has been selected for this study. In this paper, four meta-independent variables were drawn from the 65 hedonic price models for analysis: the factors that influence the prices of apartments, the regions (divided into two groups across the research performed), the years in which the research was performed, and the coefficients of the functions employed. The covariance between the four meta-variables and the p-values of the coefficients, and between the four meta-variables and the number of data points used in the 65 hedonic price models, has been analyzed in this study. The six factors that are most important in deciding the prices of apartments are the positioning of the apartments, noise, the points of the compass and the views from the apartments, proximity to public transportation, the companies that constructed the apartments, and the social environment (such as schools, etc.).
Keywords: hedonic price model, housing price, meta-regression analysis, characteristics
Procedia PDF Downloads 402
26918 Assessment of Slope Stability by Continuum and Discontinuum Methods
Authors: Taleb Hosni Abderrahmane, Berga Abdelmadjid
Abstract:
The development of numerical analysis and its application to geomechanics problems have provided geotechnical engineers with extremely powerful tools. One of the most important problems in geotechnical engineering is the slope stability assessment. It is a very difficult task due to several aspects, such as the nature of the problem, experimental considerations, monitoring, controlling, and assessment. The main objective of this paper is to perform a comparative numerical study between the following methods: the limit equilibrium method (LEM), the finite element method (FEM), the limit analysis method (LAM), and the distinct element method (DEM). The comparison is conducted in terms of the safety factors and the critical slip surfaces. The results show the feasibility of analysing slope stability by many methods.
Keywords: comparison, factor of safety, geomechanics, numerical methods, slope analysis, slip surfaces
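As a pointer to what a limit-equilibrium safety factor computation looks like, a sketch of the ordinary method of slices (Fellenius) for one trial circular slip surface is given below; the soil properties and slice geometry are assumptions, and the paper's LEM/FEM/LAM/DEM comparison is of course far more involved.
```python
import numpy as np

# Ordinary method of slices (Fellenius) for one trial circular slip surface.
# The soil properties and slice geometry below are hypothetical.
c = 15.0e3                   # effective cohesion, Pa
phi = np.radians(25.0)       # effective friction angle
gamma = 18.0e3               # unit weight, N/m^3

# Per-slice data: width b (m), mean height h (m), base inclination alpha (rad).
b = np.full(8, 1.5)
h = np.array([1.0, 2.2, 3.1, 3.6, 3.7, 3.3, 2.4, 1.2])
alpha = np.radians([-8, -2, 5, 12, 20, 28, 37, 47])

W = gamma * b * h            # slice weights (per metre of slope)
l = b / np.cos(alpha)        # slip-surface length under each slice

resisting = np.sum(c * l + W * np.cos(alpha) * np.tan(phi))
driving = np.sum(W * np.sin(alpha))
print(f"factor of safety = {resisting / driving:.2f}")
```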
Procedia PDF Downloads 533