Search results for: simple random sampling
5017 The Estimation Method of Stress Distribution for Beam Structures Using the Terrestrial Laser Scanning
Authors: Sang Wook Park, Jun Su Park, Byung Kwan Oh, Yousok Kim, Hyo Seon Park
Abstract:
This study proposes a method for estimating the stress distribution of beam structures based on TLS (Terrestrial Laser Scanning). The main components of the method are the creation of lattices of averaged raw TLS data that satisfy suitable conditions and the application of CSSI (Cubic Smoothing Spline Interpolation) to estimate the stress distribution. Estimating the stress distribution of a structural member or a whole structure is one of the important factors in evaluating the safety of the structure. Existing sensors, including the ESG (electric strain gauge) and LVDT (linear variable differential transformer), are contact-type sensors that must be installed on the structural members; they also suffer various limitations, such as the need for separate space in which to install network cables and the difficulty of access for sensor installation in real buildings. To overcome these problems inherent in contact-type sensors, the TLS form of LiDAR (light detection and ranging), which can measure the displacement of a target at long range without influence from the surrounding environment and can also capture the whole shape of the structure, has been applied in the field of structural health monitoring. An important characteristic of TLS measurement is the formation of point clouds containing many points with local coordinates. A point cloud is not a linear distribution but a dispersed set of points, so interpolation is essential for its analysis. Through the formation of averaged lattices and the application of CSSI to the raw data, a method was developed that can estimate the displacement of a simple beam. The method can be extended to calculate strain and, finally, applied to estimate the stress distribution of a structural member. To verify the validity of the method, a loading test on a simple beam was conducted and measured with TLS.
Through a comparison of the estimated stress and a reference stress, the validity of the method was confirmed.
Keywords: structural health monitoring, terrestrial laser scanning, estimation of stress distribution, coordinate transformation, cubic smoothing spline interpolation
Procedia PDF Downloads 433
5016 An Engineer-Oriented Life Cycle Assessment Tool for Building Carbon Footprint: The Building Carbon Footprint Evaluation System in Taiwan
Authors: Hsien-Te Lin
Abstract:
The purpose of this paper is to introduce the BCFES (building carbon footprint evaluation system), an LCA (life cycle assessment) tool developed by the Low Carbon Building Alliance (LCBA) in Taiwan. A qualified BCFES for the building industry should be able to evaluate the carbon footprint throughout all stages in the life cycle of a building project, including the production, transportation and manufacturing of materials, construction, daily energy usage, renovation and demolition. However, many existing BCFESs are too complicated and not very designer-friendly, creating obstacles to the implementation of carbon reduction policies. One of the greatest obstacles is the misapplication of the carbon footprint inventory standards PAS 2050 and ISO 14067, which are designed for mass-produced goods rather than building projects. When these product-oriented rules are applied to building projects, one must compute a tremendous amount of data for raw materials and the transportation of construction equipment throughout the construction period based on purchasing lists and construction logs. This verification method is cumbersome by nature and unhelpful to the promotion of low carbon design. With a view to providing an engineer-oriented BCFES with pre-diagnosis functions, a component input/output (I/O) database system and a scenario simulation method for building energy are proposed herein. Most existing BCFESs base their calculations on a product-oriented carbon database for raw materials like cement, steel, glass, and wood. However, data on raw materials are meaningless for the purpose of encouraging carbon reduction design without a feedback mechanism, because an engineering project is designed not from raw materials but from building components, such as flooring, walls, roofs, ceilings, roads or cabinets. The LCBA Database was compiled from existing carbon footprint databases for raw materials and architectural graphic standards.
Project designers can now use the LCBA Database to conduct low carbon design in a much simpler and more efficient way. Daily energy usage throughout a building's life cycle, including air conditioning, lighting, and electric equipment, is very difficult for the building designer to predict. A good BCFES should provide a simplified and designer-friendly method to overcome this obstacle in predicting energy consumption. In this paper, the author has developed a simplified tool, the dynamic Energy Use Intensity (EUI) method, to predict energy usage accurately with simple multiplications and additions, using EUI data and the designed efficiency levels for the building envelope, AC, lighting and electrical equipment. Remarkably simple to use, it can help designers pre-diagnose hotspots in a building's carbon footprint and further enhance low carbon designs. The BCFES-LCBA offers the advantages of an engineer-friendly component I/O database, simplified energy prediction methods, pre-diagnosis of carbon hotspots and sensitivity to good low carbon designs, making it an increasingly popular carbon management tool in Taiwan. To date, about thirty projects have been awarded BCFES-LCBA certification, and the assessment has become mandatory in some cities.
Keywords: building carbon footprint, life cycle assessment, energy use intensity, building energy
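The "simple multiplications and additions" of the dynamic EUI method can be illustrated with a toy calculation. Every number here (floor area, baseline EUIs, efficiency levels, lifespan, emission factor) is a hypothetical placeholder, not LCBA data.

```python
# Annual energy use approximated per end use as:
#   baseline EUI [kWh/m2/yr] x designed efficiency level x floor area [m2]
floor_area_m2 = 10_000
baseline_eui = {"air_conditioning": 60.0, "lighting": 30.0, "equipment": 40.0}
efficiency_level = {"air_conditioning": 0.85, "lighting": 0.70, "equipment": 1.00}

annual_kwh = sum(baseline_eui[k] * efficiency_level[k] * floor_area_m2
                 for k in baseline_eui)

# Life-cycle carbon from daily energy use: kWh/yr x service life x grid factor.
carbon_kg = annual_kwh * 40 * 0.5   # 40-year life, 0.5 kgCO2e/kWh (assumed)
```

The point of the method is exactly this shape of calculation: a designer can see at a glance that, say, halving the lighting efficiency level removes a known number of kWh and kgCO2e from the total.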
Procedia PDF Downloads 139
5015 The French Ekang Ethnographic Dictionary: The Quantum Approach
Authors: Henda Gnakate Biba, Ndassa Mouafon Issa
Abstract:
Dictionaries modeled on the Western template (for languages with a tonic accent) are not suitable for tonal languages and do not account for them phonologically, which is why the (prosodic and phonological) ethnographic dictionary was designed. It is a glossary that expresses the tones and the rhythm of words. It recreates exactly the speaking or singing of a tonal language and allows a non-speaker of the language to pronounce the words as if they were a native. It is a dictionary adapted to tonal languages. It was built from ethnomusicological theorems and phonological processes, following Jean-Jacques Rousseau's 1776 hypothesis that 'to say and to sing were once the same thing'. Each word in the French dictionary finds its corresponding word in the ekaη language, and each ekaη word is written on a musical staff. This ethnographic dictionary is an inventive, original and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological and linguistic conceptualization of languages, giving rise to interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, machine translation and artificial intelligence. When this theory is applied to any text of a folk song in a tonal language, one reconstructs not only the exact melody, rhythm and harmonies of that song, as if it were known in advance, but also the exact speech of the language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as has one of the greatest cultural equations related to the composition and creation of tonal, polytonal and random music.
The experimentation confirming the theorization led to a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, the author uses music reading and writing software to collect data extracted from his mother tongue, already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: the user types a structured song (chorus-verse) on a computer and asks the machine for a melody in blues, jazz, world music, variety, etc. The software runs, offers a choice of harmonies, and the user then selects a melody.
Keywords: music, language, entanglement, science, research
Procedia PDF Downloads 69
5014 Secure E-Pay System Using Steganography and Visual Cryptography
Authors: K. Suganya Devi, P. Srinivasan, M. P. Vaishnave, G. Arutperumjothi
Abstract:
Today’s internet world is highly prone to various online attacks, of which the most harmful is phishing. Attackers host fake websites that closely resemble the genuine ones. We propose an image-based authentication scheme using steganography and visual cryptography to prevent phishing. This paper presents a secure steganographic technique for true color (RGB) images and uses the discrete cosine transform to compress the images. The proposed method hides the secret data inside a cover image. Visual cryptography is used to preserve the privacy of an image by decomposing the original image into two shares. The original image can be identified only when both qualified shares are simultaneously available; an individual share does not reveal the identity of the original image. Thus, the existence of the secret message is hard to detect by RS steganalysis.
Keywords: image security, random LSB, steganography, visual cryptography
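As a toy illustration of the "random LSB" hiding step (the paper's full scheme also involves DCT compression and visual-cryptographic shares, which are not sketched here), the following hides and recovers a bit string in randomly selected pixel LSBs. The function names and the use of an RNG seed as the shared secret are our own assumptions.

```python
import numpy as np

def embed_lsb(cover, bits, rng):
    """Hide a bit sequence in randomly chosen pixel LSBs ('random LSB')."""
    stego = cover.copy()
    # The RNG seed acts as the shared secret that selects the positions.
    pos = rng.choice(stego.size, size=len(bits), replace=False)
    flat = stego.reshape(-1)                 # view into the copy
    flat[pos] = (flat[pos] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return stego, pos

def extract_lsb(stego, pos):
    """Read the hidden bits back from the agreed positions."""
    return (stego.reshape(-1)[pos] & 1).tolist()

cover = np.random.default_rng(1).integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
msg = [1, 0, 1, 1, 0, 0, 1, 0]
stego, pos = embed_lsb(cover, msg, np.random.default_rng(42))
recovered = extract_lsb(stego, pos)
```

Each embedded bit changes a pixel value by at most 1, which is what keeps the payload visually imperceptible.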
Procedia PDF Downloads 330
5013 Adaptive Conjoint Analysis of Professionals’ Job Preferences
Authors: N. Scheidegger, A. Mueller
Abstract:
Job preferences are a well-developed research field. Many studies analyze preferences using simple ratings with samples of university graduates. The current study analyzes the preferences with a mixed-method approach: a qualitative preliminary study followed by an adaptive conjoint analysis. Preconditions for accepting job offers are clarified for professionals in the industrial sector. It could be shown, for example, that wages above the average are critical and that career opportunities must be understood more broadly than a mere focus on formal personnel development programs. The results suggest that, to be effective in their recruitment efforts, employers must take into account the key desirable job attributes of their target group.
Keywords: conjoint analysis, employer attractiveness, job preferences, personnel marketing
Procedia PDF Downloads 199
5012 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning
Authors: Xingyu Gao, Qiang Wu
Abstract:
Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions, and understanding the key factors that influence patent impact can help researchers better understand the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite extensive research on AI patents, accurately predicting their early impact remains a challenge; traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper used the artificial intelligence patent database of the United States Patent and Trademark Office and the Lens.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, with early indicators of the patents as features, the paper predicted the impact of the patents from three aspects: technical, social, and economic. These aspects include the technical leadership of the patents, the number of citations they receive, and their shared value. The SHAP (Shapley Additive exPlanations) metric was used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance.
Specifically, patent novelty has the greatest influence on predicting the technical impact of patents, with a positive effect. The number of owners, the number of backward citations, and the number of independent claims are also crucial, each with a positive influence on predicted technical impact. In predicting the social impact of patents, the number of applicants is the most critical input variable, but it has a negative effect on social impact, while the number of independent claims, the number of owners, and the number of backward citations are also important predictors with positive effects. For predicting the economic impact of patents, the number of independent claims is the most important factor, with a positive effect; the number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence. The study relies primarily on data from the United States Patent and Trademark Office for artificial intelligence patents; future research could consider more comprehensive data sources on AI patents from a global perspective. While the study takes various factors into account, other important features may remain unconsidered; in the future, factors such as patent implementation and market applications could be considered, as they may affect the influence of patents.
Keywords: patent influence, interpretable machine learning, predictive models, SHAP
Procedia PDF Downloads 50
5011 Denoising of Magnetotelluric Signals by Filtering
Authors: Rodrigo Montufar-Chaveznava, Fernando Brambila-Paz, Ivette Caldelas
Abstract:
In this paper, we present advances in the denoising of magnetotelluric signals using several filters. In particular, we use the most common spatial domain filters, such as the median and mean filters, but we also use the Fourier and wavelet transforms for frequency domain filtering. We employ three datasets obtained at different sampling rates (128, 4096 and 8192 samples per second) and evaluate the mean square error, signal-to-noise ratio, and peak signal-to-noise ratio to compare the kernels and determine the most suitable for each case. The magnetotelluric signals come from earth exploration surveys carried out in the search for water. The objective is to find a denoising strategy different from the one included in the commercial equipment employed for this task.
Keywords: denoising, filtering, magnetotelluric signals, wavelet transform
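A minimal sketch of the spatial-domain branch: a sliding-window median filter and a PSNR comparison on a synthetic signal. The test signal, noise level, and window size are arbitrary choices, not the paper's data.

```python
import numpy as np

def median_filter(x, k=5):
    """Sliding-window median of a 1-D signal (edges padded by reflection)."""
    pad = k // 2
    xp = np.pad(x, pad, mode="reflect")
    windows = np.lib.stride_tricks.sliding_window_view(xp, k)
    return np.median(windows, axis=-1)

def psnr(ref, est):
    """Peak signal-to-noise ratio in dB against a reference signal."""
    mse = np.mean((ref - est) ** 2)
    return 10 * np.log10(ref.max() ** 2 / mse)

t = np.linspace(0, 1, 4096)            # one second at 4096 samples/s
clean = np.sin(2 * np.pi * 8 * t)
noisy = clean + np.random.default_rng(0).normal(0, 0.2, t.size)
denoised = median_filter(noisy, k=9)
```

Comparing `psnr(clean, denoised)` against `psnr(clean, noisy)` is exactly the kind of kernel-by-kernel evaluation the abstract describes; swapping `np.median` for `np.mean` gives the mean-filter variant.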
Procedia PDF Downloads 370
5010 A Study of Anthropometric Correlation between Upper and Lower Limb Dimensions in Sudanese Population
Authors: Altayeb Abdalla Ahmed
Abstract:
Skeletal phenotype is the product of a balanced interaction between genetics and environmental factors throughout different life stages. Therefore, interlimb proportions vary between populations. Although interlimb proportion indices have been used in anthropology to assess the influence of various environmental factors on limbs, an extensive literature review revealed a paucity of published research assessing correlations between limb parts and the possibility of reconstruction. Hence, this study aims to assess the relationships between upper and lower limb parts and to develop regression formulae for reconstructing the parts from one another. The left upper arm length, ulnar length, wrist breadth, hand length, hand breadth, tibial length, bimalleolar breadth, foot length, and foot breadth of 376 right-handed subjects, comprising 187 males and 189 females (aged 25-35 years), were measured. Initially, the data were analyzed using basic univariate analysis and independent t-tests; then sex-specific simple and multiple linear regression models were used to estimate upper limb parts from lower limb parts and vice versa. The results indicated significant sexual dimorphism for all variables and a significant correlation between upper and lower limb parts (p < 0.01). Linear and multiple (stepwise) regression equations were developed to reconstruct a limb part from a single dimension or multiple dimensions of the other limb. Multiple stepwise regression equations produced better reconstructions than simple equations. These results are significant in forensics, as they can aid in the identification of multiple isolated limb parts, particularly during mass disasters and criminal dismemberment. Although DNA analysis is the most reliable tool for identification, its usage has multiple limitations in less developed countries, e.g., cost, facility availability, and trained personnel.
Furthermore, the findings have important implications for plastic and orthopedic reconstructive surgery. This is the only reported study assessing the correlation and prediction capabilities between many of the upper and lower limb dimensions. The present study demonstrates a significant correlation between the interlimb parts in both sexes, which indicates that reconstruction using regression equations is possible.
Keywords: anthropometry, correlation, limb, Sudanese
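The simple-regression step can be sketched as follows on synthetic measurements. The sample size mirrors the male subsample (187), but the distributions and the "true" relation between foot and hand length are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical sketch: estimate hand length from foot length with a
# sex-specific simple linear regression (synthetic data, in cm).
rng = np.random.default_rng(0)
foot = rng.normal(26.0, 1.2, 187)                    # male foot lengths
hand = 0.55 * foot + 5.0 + rng.normal(0, 0.4, 187)   # assumed true relation

slope, intercept = np.polyfit(foot, hand, 1)         # least-squares fit
r = np.corrcoef(foot, hand)[0, 1]                    # correlation coefficient

def reconstruct_hand(foot_length_cm):
    """Reconstruct hand length from an isolated foot measurement."""
    return slope * foot_length_cm + intercept
```

A stepwise multiple regression, as used in the study, would add further lower-limb dimensions as predictors and retain only those that improve the fit.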
Procedia PDF Downloads 295
5009 The Environmental and Socio-Economic Impacts of Mining on Local Livelihood in Cameroon: A Case Study in Bertoua
Authors: Fongang Robert Tichuck
Abstract:
This paper reports the findings of a study undertaken to assess the socio-economic and environmental impacts of mining in Bertoua, Eastern Region of Cameroon. In addition to sampling community perceptions of mining activities, the study prescribes interventions that can assist in mitigating the negative impacts of mining. Marked environmental and interrelated socio-economic improvements can be achieved within regional artisanal gold mines if the government provides technical support to local operators, regulations are improved, and illegal mining activity is reduced.
Keywords: gold mining, socio-economic, mining activities, local people
Procedia PDF Downloads 396
5008 Relevance of Copyright and Trademark in the Gaming Industry
Authors: Deeksha Karunakar
Abstract:
The gaming industry is one of the biggest industries in the world. Video games are interactive works of authorship that require the execution of a computer programme on specialized hardware but that also incorporate a wide variety of other artistic media, such as music, scripts, stories, video, paintings, and characters, into which the player takes an active role. Therefore, video games are not made as singular, simple works but rather as collections of elements, each of which, if it reaches a certain level of originality and creativity, can be copyrighted on its own. A video game is made up of a wide variety of parts, all of which combine to form the overall sensation that we, the players, have while playing. The entirety of the components is implemented in the form of software code, which is then translated into the game's user interface. While copyright protection is already in place for the software code, the work produced by that code can also be protected by copyright, including the game's storyline or narrative, its characters, and even elements of the code on their own. Each sector potentially requires a legal framework, and the gaming industry is no exception; this underscores the importance of intellectual property laws in each sector. This paper will explore the beginnings of video games, the various aspects of game copyrights, and the approach of the courts, including examples from a few different instances. Although the creative arts have always been known to draw inspiration from and build upon the works of others, it has not always been simple to evaluate whether a game has been cloned. The video game business is experiencing growth as it has never seen before. The majority of today's video games are both pieces of software and works of audio-visual art.
Even though the existing legal framework does not have a clause specifically addressing video games, it is clear that there are a great many alternative means by which this protection can be granted. This paper demonstrates the importance of copyright and trademark laws in the gaming industry and its regulations with the help of relevant case laws, using a doctrinal methodology to support its findings. The aim of the paper is to raise awareness of the applicability of intellectual property laws in the gaming industry and of how the justice system is evolving to adapt to such new industries. Furthermore, it provides in-depth knowledge of their relationship with each other.
Keywords: copyright, DMCA, gaming industry, trademark, WIPO
Procedia PDF Downloads 69
5007 Cognitive Performance and Physiological Stress during an Expedition in Antarctica
Authors: Andrée-Anne Parent, Alain-Steve Comtois
Abstract:
The Antarctic environment can be a great challenge for human exploration. Explorers need to be focused on the task and require the physical abilities to succeed and survive in complete autonomy in this hostile environment. The aim of this study was to observe cognitive performance and physiological stress with a biomarker (cortisol) and hand grip strength during an expedition in Antarctica. A total of 6 explorers undertook a completely autonomous exploration of the Forbidden Plateau in Antarctica to reach unknown summits over a 30-day period. The Stroop test, a simple reaction time test, and a mood scale (PANAS) were administered every week during the expedition. Saliva samples were taken before sailing to Antarctica, on the first day on the continent, after the mission on the continent, and on the boat return trip. Furthermore, hair samples were taken before and after the expedition. The results were analyzed with SPSS using repeated-measures ANOVA. The Stroop and mood scale results are presented in the following order: 1) before sailing to Antarctica, 2) the first day on the continent, 3) after the mission on the continent and 4) on the boat return trip. No significant difference was observed in the Stroop (759±166 ms, 850±114 ms, 772±179 ms and 833±105 ms, respectively) or the PANAS (39.5±5.7, 40.5±5, 41.8±6.9, 37.3±5.8 positive emotions, and 17.5±2.3, 18.2±5, 18.3±8.6, 15.8±5.4 negative emotions, respectively) (p>0.05), although there appeared to be an improvement at the end of the second week. Furthermore, the simple reaction time was significantly lower at the end of the second week, a moment when important decisions were taken about the mission, than the week before (416±39 ms vs 459.8±39 ms, respectively; p=0.030). The saliva cortisol was not significantly different (p>0.05), possibly due to important variations, and seemed to reach a peak on the first day on the continent.
However, hair cortisol increased significantly from pre- to post-expedition (2.4±0.5 pg/mg pre-expedition and 16.7±9.2 pg/mg post-expedition, p=0.013), showing important stress during the expedition. Moreover, no significant difference was observed in grip strength except between the measurement after the mission on the continent and the measurement after the boat return trip (91.5±21 kg vs 85±19 kg, p=0.20). In conclusion, cognitive performance does not seem to be affected during the expedition; it even seems to increase for specific important events when the crew focused on the present task. The physiological stress does not seem to change significantly at specific moments; however, a global pre-post mission measure can be important, and for this reason, for long-term missions, a pre-expedition baseline measure is important for crew members.
Keywords: Antarctica, cognitive performance, expedition, physiological adaptation, reaction time
Procedia PDF Downloads 243
5006 The Experience with SiC MOSFET and Buck Converter Snubber Design
Authors: Petr Vaculik
Abstract:
The newest semiconductor devices on the market are MOSFET transistors based on silicon carbide (SiC). This material has exclusive features that make it a better switch than a silicon (Si) semiconductor switch, but there are some special features that need to be understood to use the device to its full potential. The advantages and differences of SiC MOSFETs in comparison with Si IGBT transistors are described in the first part of this article. The second part describes a driver for the SiC MOSFET transistor, and the last part presents the SiC MOSFET in a buck (step-down) converter application together with the design of a simple RC snubber.
Keywords: SiC, Si, MOSFET, IGBT, SBD, RC snubber
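A sketch of the classic two-measurement RC snubber sizing procedure (a textbook method consistent with, but not taken from, the article): measure the switch-node ring frequency, add a known capacitance across the switch until the frequency drops by a chosen factor, then back out the parasitic L and C and size R and C from them.

```python
import math

def snubber_design(f1_hz, f2_hz, c_add_farads):
    """Two-measurement RC snubber sizing.

    f1_hz: original switch-node ring frequency.
    f2_hz: ring frequency after adding c_add_farads across the switch.
    Adding C lowers f by sqrt((C + C_add) / C), which lets us solve
    for the parasitic capacitance and inductance.
    """
    ratio = (f1_hz / f2_hz) ** 2              # = (C + C_add) / C
    c_par = c_add_farads / (ratio - 1)        # parasitic capacitance
    l_par = 1 / ((2 * math.pi * f1_hz) ** 2 * c_par)   # parasitic inductance
    r_snub = math.sqrt(l_par / c_par)         # R matches characteristic impedance
    c_snub = 3 * c_par                        # common 3-4x rule of thumb
    return r_snub, c_snub

# Example (hypothetical numbers): 40 MHz ring halved to 20 MHz by adding 330 pF.
r, c = snubber_design(40e6, 20e6, 330e-12)
```

For these example measurements the procedure gives roughly a 36-ohm, 330 pF snubber; the dissipation in R (C·V²·f_sw) still has to be checked against the resistor's power rating.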
Procedia PDF Downloads 484
5005 The Optical OFDM Equalization Based on the Fractional Fourier Transform
Authors: A. Cherifi, B. S. Bouazza, A. O. Dahman, B. Yagoubi
Abstract:
Transmission over optical channels introduces inter-symbol interference (ISI) as well as inter-channel (or inter-carrier) interference (ICI). To decrease the effects of ICI, this paper proposes an equalizer for the optical OFDM system based on the fractional Fourier transform (FrFT). In this FrFT-OFDM system, the traditional Fourier transform is replaced by the fractional Fourier transform to modulate and demodulate the data symbols. The proposed equalizer samples the received signal at different time instants within each symbol period. Theoretical analysis and numerical simulation are discussed.
Keywords: OFDM, fractional Fourier transform, internet and information technology
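For context, a conventional OFDM round trip with a one-tap frequency-domain equalizer is sketched below; in the FrFT-OFDM scheme the (I)FFT pair would be replaced by a discrete fractional Fourier pair (not implemented here), and the channel taps and block sizes are arbitrary assumptions.

```python
import numpy as np

N, cp = 64, 16                                    # subcarriers, cyclic prefix
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 2 * N)
symbols = (2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)   # QPSK mapping

tx = np.fft.ifft(symbols) * np.sqrt(N)            # OFDM modulation
tx_cp = np.concatenate([tx[-cp:], tx])            # prepend cyclic prefix

h = np.array([1.0, 0.4, 0.2])                     # dispersive channel (causes ISI)
rx = np.convolve(tx_cp, h)[:tx_cp.size]

rx_sym = np.fft.fft(rx[cp:]) / np.sqrt(N)         # strip CP, demodulate
H = np.fft.fft(h, N)                              # channel frequency response
eq = rx_sym / H                                   # one-tap equalizer per subcarrier

bits_hat = np.empty(2 * N, dtype=int)             # QPSK demapping
bits_hat[0::2] = (eq.real > 0).astype(int)
bits_hat[1::2] = (eq.imag > 0).astype(int)
```

The cyclic prefix turns the channel's linear convolution into a circular one, which is what makes the per-subcarrier division by H exact; the fractional-domain equalizer in the paper plays the analogous role for the FrFT pair.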
Procedia PDF Downloads 406
5004 A Rapid Colorimetric Assay for Direct Detection of Unamplified Hepatitis C Virus RNA Using Gold Nanoparticles
Authors: M. Shemis, O. Maher, G. Casterou, F. Gauffre
Abstract:
Hepatitis C virus (HCV) is a major cause of chronic liver disease, with 170 million chronic carriers worldwide at risk of developing liver cirrhosis and/or liver cancer. Egypt reports the highest prevalence of HCV worldwide. Currently, two classes of assays are used in the diagnosis and management of HCV infection. Despite the high sensitivity and specificity of the available diagnostic assays, they are time-consuming, labor-intensive, expensive, and require specialized equipment and highly qualified personnel. It is therefore important, in both clinical and economic terms, to develop a low-tech assay for the direct detection of HCV RNA with acceptable sensitivity and specificity, a short turnaround time, and cost-effectiveness. Such an assay would be critical for controlling HCV in developing countries with limited resources and high infection rates, such as Egypt. The unique optical and physical properties of gold nanoparticles (AuNPs) have allowed their use in simple and rapid colorimetric assays for clinical diagnosis, offering higher sensitivity and specificity than current detection techniques. The current research aims to develop a detection assay for HCV RNA using AuNPs. Methods: 200 anti-HCV positive and 50 anti-HCV negative plasma samples were collected from Egyptian patients. HCV viral load was quantified using the m2000rt (Abbott Molecular Inc., Des Plaines, IL). HCV genotypes were determined using multiplex nested RT-PCR. The assay is based on the aggregation of AuNPs in the presence of the target RNA; aggregation causes a color shift from red to blue. AuNPs were synthesized using the citrate reduction method. Different sets of probes within the 5’ UTR conserved region of the HCV genome were designed, grafted onto AuNPs and optimized for the efficient detection of HCV RNA.
Results: The nano-gold assay could colorimetrically detect HCV RNA down to 125 IU/ml with a sensitivity of 91.1% and a specificity of 93.8%. The turnaround time of the assay is < 30 min. Conclusions: The assay allows sensitive and rapid detection of HCV RNA and represents an inexpensive and simple point-of-care assay for resource-limited settings.
Keywords: HCV, gold nanoparticles, point of care, viral load
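For reference, sensitivity and specificity are computed from the assay's confusion counts as follows; the counts below are hypothetical placeholders chosen for a round example, not the study's actual tallies.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts against the reference assay (not the study's data):
# 182 of 200 positives detected, 45 of 48 evaluable negatives correctly negative.
sens, spec = sensitivity_specificity(tp=182, fn=18, tn=45, fp=3)
```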
Procedia PDF Downloads 206
5003 The Influence of Perceived Quality, Customer Satisfaction and Brand Attitude to Brand Loyalty of Adult Magazine in Indonesia (A Case Study of Maxim Magazine)
Authors: Robert Ab Butarbutar, Sutan Musa Buyana
Abstract:
Purpose: The purpose of this study is to empirically test the correlations between several variables: perceived quality, overall customer satisfaction, and brand attitude to brand loyalty for Maxim magazine in Indonesia. Since the room for adult magazines in Indonesia is restricted, studying this category is particularly interesting for revealing how those variables interact. Design/methodology/approach: A combination of exploratory, descriptive and causal research designs was used in this study. Non-probability sampling, specifically purposive sampling, was used to select 160 respondents. Path analysis was used to examine the contribution of the antecedent variables (perceived quality, overall satisfaction and brand attitude) to brand loyalty. Additional respondents took part in in-depth interviews to enrich the findings from the directly distributed questionnaire. Findings: The research shows that perceived quality contributes positively to overall satisfaction and brand attitude. Overall satisfaction also positively influences brand attitude and brand loyalty. Finally, brand attitude directly impacts brand loyalty. Beyond the hypothesis testing, the qualitative research also reveals specific behaviors of Indonesian customers in consuming adult magazines. Research limitation/implication: This research is limited to adult males (18 years at minimum) who live in big cities such as Jakarta. Broader geographical coverage is advisable for further research. This study also serves as a call for additional empirical research into different product categories targeted at adult males, since research on this segment is quite scarce. Managerial implications: Since the findings show that perceived quality strongly and positively contributes to overall satisfaction and brand attitude, adult magazines should be driven by the quality of their content. The selection of models and information on the current lifestyle of urban males become priorities in developing perceived quality.
Differentiation also emerges as a critical issue, since consumers find it difficult to differentiate one magazine from another. Delivering content through distinctive communication is therefore highly recommended. Furthermore, brand loyalty faces a big challenge: interactivity through events and social media becomes critically important. Originality/value: Perceived quality acts as a prerequisite for developing overall satisfaction and brand attitude. The findings show that customers find it difficult to differentiate among adult magazines; therefore, brand loyalty becomes a big challenge for the company.
Keywords: perceived quality, overall satisfaction, brand attitude, adult magazine
Procedia PDF Downloads 4085002 Application of Strategic Management Tools
Authors: Abenezer Nigussie
Abstract:
Strategic control practice is a critical exercise, as it provides a sturdy influence towards firms or production partners to achieve the full implementation of effective predetermined plans. The importance of strategic control in a company is often measured by observing the relationship between strategic management and organizational performance. The conventional philosophy of strategic control in academia and the industry places significant emphasis on the ability to plan and execute initiatives. In contrast, the same emphasis on strategic management has received less attention in the housing industry. Although the pressures of project performance can often obscure the wider social, economic, and professional context in which strategic management is undertaken, it is these broad contextual areas that make strategic control a vital issue for construction businesses. Rapidly changing social and technological issues are creating an informed environment that will appear very different in the coming decades from what is experienced in today’s companies. Construction project activity is not adequately led by strategic management tools; projects are mostly executed through simple plans and schedules. The issues that this thesis addresses and solves involve the successful accompaniment of the construction project process through these strategic management tools. The second important aspect is an evaluation of project activity, which is mostly done through simple economic and technical valuation. However, during this research, effective strategic management tools are evaluated and suggested for the assessment of project activities. The research introduces a study of the current strategic management practices of construction companies and also presents the concept of strategic management and the areas that companies need to address to compete in the global market. 
A summary of an industry survey is documented along with the historical research that prompted the investigation of these topics with a focus on the implementation of tools. Strategic management is a concept that concerns making decisions and taking corrective actions to achieve the future goals and objectives of a company. The objective of this paper is to review the practice of strategic management in construction companies. Questionnaires were distributed to major construction companies listed under categories of each project capable of specifying the complete expression of strategic management tools. Findings of the research showed that the majority of development companies practice strategic management tools in the process and implementation of each tool.Keywords: strategic management, management, analysis, project management
Procedia PDF Downloads 675001 Leadership Lessons from Female Executives in the South African Oil Industry
Authors: Anthea Carol Nefdt
Abstract:
In this article, observations are drawn from a number of interviews conducted with female executives in the South African Oil Industry in 2017. Globally, the oil industry represents one of the most male-dominated organisational structures as well as cultures in the business world. Some of the remarkable women, who hold upper management positions, have not only emerged from the science and finance spheres (equally gendered organisations) but also navigated their way through an aggressive, patriarchal atmosphere of rivalry and competition. We examine various mythology associated with the industry, such as the cowboy myth, the frontier ideology and the queen bee syndrome directed at female executives. One of the themes to emerge from my interviews was the almost unanimous rejection of the ‘glass ceiling’ metaphor favoured by some Feminists. The women of the oil industry rather affirmed a picture of their rise to leadership positions through a strategic labyrinth of challenges and obstacles both in terms of gender and race. This article aims to share the insights of women leaders in a complex industry through both their reflections and a theoretical Feminist lens. The study is located within the South African context and given our historical legacy, it was optimal to use an intersectional approach which would allow issues of race, gender, ethnicity and language to emerge. A qualitative research methodological approach was employed as well as a thematic interpretative analysis to analyse and interpret the data. This research methodology was used precisely because it encourages and acknowledged the experiences women have and places these experiences at the centre of the research. Multiple methods of recruitment of the research participants was utilised. The initial method of recruitment was snowballing sampling, the second method used was purposive sampling. 
In addition to this, semi-structured interviews gave the participants an opportunity to ask questions, add information and have discussions on issues or aspects of the research area which was of interest to them. One of the key objectives of the study was to investigate if there was a difference in the leadership styles of men and women. Findings show that despite the wealth of literature on the topic, to the contrary some women do not perceive a significant difference in men and women’s leadership style. However other respondents felt that there were some important differences in the experiences of men and women superiors although they hesitated to generalise from these experiences Further findings suggest that although the oil industry provides unique challenges to women as a gendered organization, it also incorporates various progressive initiatives for their advancement.Keywords: petroleum industry, gender, feminism, leadership
Procedia PDF Downloads 1625000 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. 
It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, human observation or reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement
Procedia PDF Downloads 944999 Terrestrial Laser Scans to Assess Aerial LiDAR Data
Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani
Abstract:
The DEMs quality may depend on several factors such as data source, capture method, processing type used to derive them, or the cell size of the DEM. The two most important capture methods to produce regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by the national cartographic agencies through punctual sampling that focused on its vertical component. For this type of evaluation there are standards such as NMAS and ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation by means of a method that takes into account the superficial nature of the DEM and, therefore, its sampling is superficial and not punctual. This work is part of the Research Project "Functional Quality of Digital Elevation Models in Engineering" where it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter to which we call Point Cloud Product (PCpro). In the present work it is described the capture data on the ground and the postprocessing tasks until getting the point cloud that will be used as reference (PCref) to evaluate the PCpro quality. Each PCref consists of a patch 50x50 m size coming from a registration of 4 different scan stations. The area studied was the Spanish region of Navarra that covers an area of 10,391 km2; 30 patches homogeneously distributed were necessary to sample the entire surface. The patches have been captured using a Leica BLK360 terrestrial laser scanner mounted on a pole that reached heights of up to 7 meters; the position of the scanner was inverted so that the characteristic shadow circle does not exist when the scanner is in direct position. 
To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref has been carried out with real-time GNSS, and its accuracy positioning was better than 4 cm; this accuracy is much better than the altimetric mean square error estimated for the PCpro (<15 cm); The kind of DEM of interest is the corresponding to the bare earth, so that it was necessary to apply a filter to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the postprocessing tasks the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud or after a resampling process DEM to DEM.Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy
Procedia PDF Downloads 1014998 The Opinions of Counselor Candidates' regarding Universal Values in Marriage Relationship
Authors: Seval Kizildag, Ozge Can Aran
Abstract:
The effective intervention of counselors’ in conflict between spouses may be effective in increasing the quality of marital relationship. At this point, it is necessary for counselors to consider their own value systems at first and then reflect this correctly to the counseling process. For this reason, it is primarily important to determine the needs of counselors. Starting from this point of view, in this study, it is aimed to reveal the perspective of counselor candidates about the universal values in marriage relation. The study group of the survey was formed by sampling, which is one of the prospective sampling methods. As a criterion being a candidate for counseling area and having knowledge of the concepts of the Marriage and Family Counseling course is based, because, that candidate students have a comprehensive knowledge of the field and that students have mastered the concepts of marriage and family counseling will strengthen the findings of this study. For this reason, 61 counselor candidates, 32 (52%) female and 29 (48%) male counselor candidates, who were about to graduate from a university in south-east Turkey and who took a Marriage and Family Counseling course, voluntarily participated in the study. The average age of counselor candidates’ is 23. At the same time, 70 % of the parents of these candidates brought about their marriage through arranged marriage, 13% through flirting, 8% by relative marriage, 7% through friend circles and 2% by custom. The data were collected through Demographic Information Form and a form titled ‘Universal Values Form in Marriage’ which consists of six questions prepared by researchers. After the data were transferred to the computer, necessary statistical evaluations were made on the data. The qualitative data analysis was used on the data which was obtained in the study. 
The universal values which include six basic values covering trustworthiness, respect, responsibility, fairness, caring, citizenship, determined under the name as ‘six pillar of character’ are used as base and frequency values of the data were calculated trough content analysis. According to the findings of the study, while the value which most students find the most important value in marriage relation is being reliable, the value which they find the least important is to have citizenship consciousness. Also in this study, it is found out that counselor candidates associate the value of being trustworthiness ‘loyalty’ with (33%) as the highest in terms of frequency, the value of being respect ‘No violence’ with (23%), the value of responsibility ‘in the context of gender roles and spouses doing their owns’ with (35%) the value of being fairness ‘impartiality’ with (25%), the value of being caring ‘ being helpful’ with (25%) and finally as to the value of citizenship ‘love of country’ with (14%) and’ respect for the laws ‘ with (14%). It is believed that these results of the study will contribute to the arrangements for the development of counseling skills for counselor candidates regarding value in marriage and family counseling curricula.Keywords: caring, citizenship, counselor candidate, fairness, marriage relationship, respect, responsibility, trustworthiness, value system
Procedia PDF Downloads 2724997 Lotus Mechanism: Validation of Deployment Mechanism Using Structural and Dynamic Analysis
Authors: Parth Prajapati, A. R. Srinivas
Abstract:
The purpose of this paper is to validate the concept of the Lotus Mechanism using Computer Aided Engineering (CAE) tools considering the statics and dynamics through actual time dependence involving inertial forces acting on the mechanism joints. For a 1.2 m mirror made of hexagonal segments, with simple harnesses and three-point supports, the maximum diameter is 400 mm, minimum segment base thickness is 1.5 mm, and maximum rib height is considered as 12 mm. Manufacturing challenges are explored for the segments using manufacturing research and development approaches to enable use of large lightweight mirrors required for the future space system.Keywords: dynamics, manufacturing, reflectors, segmentation, statics
Procedia PDF Downloads 3734996 Using Nonhomogeneous Poisson Process with Compound Distribution to Price Catastrophe Options
Authors: Rong-Tsorng Wang
Abstract:
In this paper, we derive a pricing formula for catastrophe equity put options (or CatEPut) with non-homogeneous loss and approximated compound distributions. We assume that the loss claims arrival process is a nonhomogeneous Poisson process (NHPP) representing the clustering occurrences of loss claims, the size of loss claims is a sequence of independent and identically distributed random variables, and the accumulated loss distribution forms a compound distribution and is approximated by a heavy-tailed distribution. A numerical example is given to calibrate parameters, and we discuss how the value of CatEPut is affected by the changes of parameters in the pricing model we provided.Keywords: catastrophe equity put options, compound distributions, nonhomogeneous Poisson process, pricing model
Procedia PDF Downloads 1674995 Systematic and Simple Guidance for Feed Forward Design in Model Predictive Control
Authors: Shukri Dughman, Anthony Rossiter
Abstract:
This paper builds on earlier work which demonstrated that Model Predictive Control (MPC) may give a poor choice of default feed forward compensator. By first demonstrating the impact of future information of target changes on the performance, this paper proposes a pragmatic method for identifying the amount of future information on the target that can be utilised effectively in both finite and infinite horizon algorithms. Numerical illustrations in MATLAB give evidence of the efficacy of the proposal.Keywords: model predictive control, tracking control, advance knowledge, feed forward
Procedia PDF Downloads 5474994 A Digital Environment for Developing Mathematical Abilities in Children with Autism Spectrum Disorder
Authors: M. Isabel Santos, Ana Breda, Ana Margarida Almeida
Abstract:
Research on academic abilities of individuals with autism spectrum disorder (ASD) underlines the importance of mathematics interventions. Yet the proposal of digital applications for children and youth with ASD continues to attract little attention, namely, regarding the development of mathematical reasoning, being the use of the digital technologies an area of great interest for individuals with this disorder and its use is certainly a facilitative strategy in the development of their mathematical abilities. The use of digital technologies can be an effective way to create innovative learning opportunities to these students and to develop creative, personalized and constructive environments, where they can develop differentiated abilities. The children with ASD often respond well to learning activities involving information presented visually. In this context, we present the digital Learning Environment on Mathematics for Autistic children (LEMA) that was a research project conducive to a PhD in Multimedia in Education and was developed by the Thematic Line Geometrix, located in the Department of Mathematics, in a collaboration effort with DigiMedia Research Center, of the Department of Communication and Art (University of Aveiro, Portugal). LEMA is a digital mathematical learning environment which activities are dynamically adapted to the user’s profile, towards the development of mathematical abilities of children aged 6–12 years diagnosed with ASD. LEMA has already been evaluated with end-users (both students and teacher’s experts) and based on the analysis of the collected data readjustments were made, enabling the continuous improvement of the prototype, namely considering the integration of universal design for learning (UDL) approaches, which are of most importance in ASD, due to its heterogeneity. 
The learning strategies incorporated in LEMA are: (i) provide options to custom choice of math activities, according to user’s profile; (ii) integrates simple interfaces with few elements, presenting only the features and content needed for the ongoing task; (iii) uses a simple visual and textual language; (iv) uses of different types of feedbacks (auditory, visual, positive/negative reinforcement, hints with helpful instructions including math concept definitions, solved math activities using split and easier tasks and, finally, the use of videos/animations that show a solution to the proposed activity); (v) provides information in multiple representation, such as text, video, audio and image for better content and vocabulary understanding in order to stimulate, motivate and engage users to mathematical learning, also helping users to focus on content; (vi) avoids using elements that distract or interfere with focus and attention; (vii) provides clear instructions and orientation about tasks to ease the user understanding of the content and the content language, in order to stimulate, motivate and engage the user; and (viii) uses buttons, familiarly icons and contrast between font and background. Since these children may experience little sensory tolerance and may have an impaired motor skill, besides the user to have the possibility to interact with LEMA through the mouse (point and click with a single button), the user has the possibility to interact with LEMA through Kinect device (using simple gesture moves).Keywords: autism spectrum disorder, digital technologies, inclusion, mathematical abilities, mathematical learning activities
Procedia PDF Downloads 1164993 Understanding the Nature of Student Conceptions of Mathematics: A Study of Mathematics Students in Higher Education
Authors: Priscilla Eng Lian Murphy
Abstract:
This study examines the nature of student conceptions of mathematics in higher education using quantitative research methods. This study validates the Short Form of Conception of Mathematics survey as well as reveals the epistemological nature of student conceptions of mathematics. Using a random sample of mathematics students in Australia and New Zealand (N=274), this paper highlighted three key findings, of relevance to lecturers in higher education. Firstly, descriptive data shows that mathematics students in Australia and New Zealand reported that mathematics is about numbers and components, models and life. Secondly, models conceptions of mathematics predicted strong examination performances using regression analyses; and thirdly, there is a positive correlation between high mathematics examination scores and cohesive conceptions of mathematics.Keywords: higher education, learning mathematics, mathematics performances, student conceptions of mathematics
Procedia PDF Downloads 2644992 Residual Life Estimation of K-out-of-N Cold Standby System
Authors: Qian Zhao, Shi-Qi Liu, Bo Guo, Zhi-Jun Cheng, Xiao-Yue Wu
Abstract:
Cold standby redundancy is considered to be an effective mechanism for improving system reliability and is widely used in industrial engineering. However, because of the complexity of the reliability structure, there is little literature studying on the residual life of cold standby system consisting of complex components. In this paper, a simulation method is presented to predict the residual life of k-out-of-n cold standby system. In practical cases, failure information of a system is either unknown, partly unknown or completely known. Our proposed method is designed to deal with the three scenarios, respectively. Differences between the procedures are analyzed. Finally, numerical examples are used to validate the proposed simulation method.Keywords: cold standby system, k-out-of-n, residual life, simulation sampling
Procedia PDF Downloads 4014991 Design Development and Qualification of a Magnetically Levitated Blower for C0₂ Scrubbing in Manned Space Missions
Authors: Larry Hawkins, Scott K. Sakakura, Michael J. Salopek
Abstract:
The Marshall Space Flight Center is designing and building a next-generation CO₂ removal system, the Four Bed Carbon Dioxide Scrubber (4BCO₂), which will use the International Space Station (ISS) as a testbed. The current ISS CO2 removal system has faced many challenges in both performance and reliability. Given that CO2 removal is an integral Environmental Control and Life Support System (ECLSS) subsystem, the 4BCO2 Scrubber has been designed to eliminate the shortfalls identified in the current ISS system. One of the key required upgrades was to improve the performance and reliability of the blower that provides the airflow through the CO₂ sorbent beds. A magnetically levitated blower, capable of higher airflow and pressure than the previous system, was developed to meet this need. The design and qualification testing of this next-generation blower are described here. The new blower features a high-efficiency permanent magnet motor, a five-axis, active magnetic bearing system, and a compact controller containing both a variable speed drive and a magnetic bearing controller. The blower uses a centrifugal impeller to pull air from the inlet port and drive it through an annular space around the motor and magnetic bearing components to the exhaust port. Technical challenges of the blower and controller development include survival of the blower system under launch random vibration loads, operation in microgravity, packaging under strict size and weight requirements, and successful operation during 4BCO₂ operational changeovers. An ANSYS structural dynamic model of the controller was used to predict response to the NASA defined random vibration spectrum and drive minor design changes. The simulation results are compared to measurements from qualification testing the controller on a vibration table. Predicted blower performance is compared to flow loop testing measurements. 
Dynamic response of the system to valve changeovers is presented and discussed using high bandwidth measurements from dynamic pressure probes, magnetic bearing position sensors, and actuator coil currents. The results presented in the paper show that the blower controller will survive launch vibration levels, the blower flow meets the requirements, and the magnetic bearings have adequate load capacity and control bandwidth to maintain the desired rotor position during the valve changeover transients.Keywords: blower, carbon dioxide removal, environmental control and life support system, magnetic bearing, permanent magnet motor, validation testing, vibration
Procedia PDF Downloads 1364990 Computerized Adaptive Testing for Ipsative Tests with Multidimensional Pairwise-Comparison Items
Authors: Wen-Chung Wang, Xue-Lan Qiu
Abstract:
Ipsative tests have been widely used in vocational and career counseling (e.g., the Jackson Vocational Interest Survey). Pairwise-comparison items are a typical item format of ipsative tests. When the two statements in a pairwise-comparison item measure two different constructs, the item is referred to as a multidimensional pairwise-comparison (MPC) item. A typical MPC item would be: Which activity do you prefer? (A) playing with young children, or (B) working with tools and machines. These two statements aim at the constructs of social interest and investigative interest, respectively. Recently, new item response theory (IRT) models for ipsative tests with MPC items have been developed. Among them, the Rasch ipsative model (RIM) deserves special attention because it has good measurement properties, in which the log-odds of preferring statement A to statement B are defined as a competition between two parts: the sum of a person’s latent trait to which statement A is measuring and statement A’s utility, and the sum of a person’s latent trait to which statement B is measuring and statement B’s utility. The RIM has been extended to polytomous responses, such as preferring statement A strongly, preferring statement A, preferring statement B, and preferring statement B strongly. To promote the new initiatives, in this study we developed computerized adaptive testing algorithms for MFC items and evaluated their performance using simulations and two real tests. Both the RIM and its polytomous extension are multidimensional, which calls for multidimensional computerized adaptive testing (MCAT). A particular issue in MCAT for MPC items is the within-person statement exposure (WPSE); that is, a respondent may keep seeing the same statement (e.g., my life is empty) for many times, which is certainly annoying. In this study, we implemented two methods to control the WPSE rate. 
In the first control method, items would be frozen when their statements had been administered more than a prespecified times. In the second control method, a random component was added to control the contribution of the information at different stages of MCAT. The second control method was found to outperform the first control method in our simulation studies. In addition, we investigated four item selection methods: (a) random selection (as a baseline), (b) maximum Fisher information method without WPSE control, (c) maximum Fisher information method with the first control method, and (d) maximum Fisher information method with the second control method. These four methods were applied to two real tests: one was a work survey with dichotomous MPC items and the other is a career interests survey with polytomous MPC items. There were three dependent variables: the bias and root mean square error across person measures, and measurement efficiency which was defined as the number of items needed to achieve the same degree of test reliability. Both applications indicated that the proposed MCAT algorithms were successful and there was no loss in measurement proficiency when the control methods were implemented, and among the four methods, the last method performed the best.Keywords: computerized adaptive testing, ipsative tests, item response theory, pairwise comparison
Procedia PDF Downloads 2464989 Correlation of Clinical and Sonographic Findings with Cytohistology for Diagnosis of Ovarian Tumours
Authors: Meenakshi Barsaul Chauhan, Aastha Chauhan, Shilpa Hurmade, Rajeev Sen, Jyotsna Sen, Monika Dalal
Abstract:
Introduction: Ovarian masses are common forms of neoplasm in women and represent 2/3rd of gynaecological malignancies. A pre-operative suggestion of malignancy can guide the gynecologist to refer women with suspected pelvic mass to a gynecological oncologist for appropriate therapy and optimized treatment, which can improve survival. In the younger age group preoperative differentiation into benign or malignant pathology can decide for conservative or radical surgery. Imaging modalities have a definite role in establishing the diagnosis. By using International Ovarian Tumor Analysis (IOTA) classification with sonography, costly radiological methods like Magnetic Resonance Imaging (MRI) / computed tomography (CT) scan can be reduced, especially in developing countries like India. Thus, this study is being undertaken to evaluate the role of clinical methods and sonography for diagnosis of the nature of the ovarian tumor. Material And Methods: This prospective observational study was conducted on 40 patients presenting with ovarian masses, in the Department of Obstetrics and Gynaecology, at a tertiary care center in northern India. Functional cysts were excluded. Ultrasonography and color Doppler were performed on all the cases.IOTA rules were applied, which take into account locularity, size, presence of solid components, acoustic shadow, dopper flow etc . Magnetic Resonance Imaging (MRI) / computed tomography (CT) scans abdomen and pelvis were done in cases where sonography was inconclusive. In inoperable cases, Fine needle aspiration cytology (FNAC) was done. The histopathology report after surgery and cytology report after FNAC was correlated statistically with the pre-operative diagnosis made clinically and sonographically using IOTA rules. Statistical Analysis: Descriptive measures were analyzed by using mean and standard deviation and the Student t-test was applied and the proportion was analyzed by applying the chi-square test. 
Inferential measures were analyzed by sensitivity, specificity, negative predictive value, and positive predictive value. Results: Provisional diagnosis of the benign tumor was made in 16(42.5%) and of the malignant tumor was made in 24(57.5%) patients on the basis of clinical findings. With IOTA simple rules on sonography, 15(37.5%) were found to be benign, while 23 (57.5%) were found to be malignant and findings were inconclusive in 2 patients (5%). FNAC/Histopathology reported that benign ovarian tumors were 14 (35%) and 26(65%) were malignant, which was taken as the gold standard. The clinical finding alone was found to have a sensitivity of 66.6% and a specificity of 90.9%. USG alone had a sensitivity of 86% and a specificity of 80%. When clinical findings and IOTA simple rules of sonography were combined (excluding inconclusive masses), the sensitivity and specificity were 83.3% and 92.3%, respectively. While including inconclusive masses, sensitivity came out to be 91.6% and specificity was 89.2. Conclusion: IOTA's simple sonography rules are highly sensitive and specific in the prediction of ovarian malignancy and also easy to use and easily reproducible. Thus, combining clinical examination with USG will help in the better management of patients in terms of time, cost and better prognosis. This will also avoid the need for costlier modalities like CT, and MRI.Keywords: benign, international ovarian tumor analysis classification, malignant, ovarian tumours, sonography
Procedia PDF Downloads 804988 Backstepping Controller for a Variable Wind Speed Energy Conversion System Based on a DFIG
Authors: Sara Mensou, Ahmed Essadki, Issam Minka, Tamou Nasser, Badr Bououlid Idrissi
Abstract:
In this paper we present a contribution for the modeling and control of wind energy conversion system based on a Doubly Fed Induction Generator (DFIG). Since the wind speed is random the system has to produce an optimal electrical power to the Network and ensures important strength and stability. In this work, the Backstepping controller is used to control the generator via two converter witch placed a DC bus capacitor and connected to the grid by a Filter R-L, in order to optimize capture wind energy. All is simulated and presented under MATLAB/Simulink Software to show performance and robustness of the proposed controller.Keywords: wind turbine, doubly fed induction generator, MPPT control, backstepping controller, power converter
Procedia PDF Downloads 189