
Their armies[1] of content moderators[2] are expanding[3]


Every other Tuesday at Facebook, and every Friday at YouTube, executives convene[4] to debate the latest problems with hate speech, misinformation[5] and other disturbing[6] content on their platforms[7], and decide what should be removed or left alone. In San Bruno, Susan Wojcicki, YouTube's boss, personally oversees[8] the exercise[9]. In Menlo Park, lower-level execs[10] run Facebook's "Content Standards Forum".


The forum has become a frequent stop[11] on the company's publicity[12] circuit[13] for journalists. Its working groups[14] recommend new guidelines on what to do about, say, a photo showing Hindu women being beaten in Bangladesh that may be inciting[15] violence offline (take it down[16]), a video of police brutality[17] when race riots[18] are taking place (leave it up[19]), or a photo alleging[20] that Donald Trump wore a Ku Klux Klan[21] uniform in the 1990s (leave it up but reduce distribution of it, and inform users it's a fake). Decisions made at these meetings eventually filter down[22] into instructions[23] for thousands of content reviewers around the world.
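The three rulings above boil down to a small menu of moderation actions: remove, leave up, or leave up with reduced distribution and a corrective label. A minimal Python sketch of how such guideline decisions might be encoded before they reach reviewers (the names and structure are illustrative assumptions, not Facebook's actual tooling):

from enum import Enum
from dataclasses import dataclass

class Action(Enum):
    TAKE_DOWN = "remove the post"
    LEAVE_UP = "leave the post up"
    DOWN_RANK = "leave up, reduce distribution, flag as false"

@dataclass
class Guideline:
    situation: str   # the case the working group debated
    action: Action   # the ruling that filters down to reviewers

# The forum's three examples, encoded as reviewer instructions.
GUIDELINES = [
    Guideline("photo that may incite offline violence", Action.TAKE_DOWN),
    Guideline("video of police brutality during race riots", Action.LEAVE_UP),
    Guideline("doctored photo making a false political claim", Action.DOWN_RANK),
]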


Seeing how each company moderates[24] content is encouraging[25]. The two firms no longer regard making such decisions as a peripheral[26] activity but as core to their business. Each employs executives who are thoughtful about[27] the task of making their platforms less toxic while protecting freedom of speech. But that they do this at all is also cause for concern[28]; they are well on their way to becoming "ministries of truth" for a global audience. Never before has such a small number of firms been able to control what billions can say and see.


Politicians are paying ever more attention to the content these platforms carry, and to the policies they use to evaluate it. On September 5th Sheryl Sandberg, Facebook's number two, and Jack Dorsey, the boss of Twitter, testified[29] before the Senate Select Intelligence Committee[30] on what may be the companies' most notorious foul-up[31]: allowing their platforms to be manipulated by Russian operatives[32] seeking[33] to influence the 2016 presidential election. Mr Dorsey later answered pointed questions[34] from a House committee[35] about content moderation[36]. (In the first set of hearings[37] Alphabet, the parent of Google, which also owns YouTube, was represented by an empty chair[38] after refusing to make Larry Page, its co-founder, available[39].)


Scrutiny[40] of Facebook, Twitter, YouTube et al.[41] has intensified[42] recently. All three faced calls to ban Alex Jones of Infowars, a conspiracy theorist[43]; Facebook and YouTube eventually did so. At the same time the tech platforms have faced accusations of anti-conservative bias for suppressing[44] certain news. Their loudest critic is President Donald Trump, who has threatened (via Twitter) to regulate[45] them. Straight after the hearings, Jeff Sessions, the attorney-general[46], said that he would discuss with states' attorneys-general the "growing concern" that the platforms are hurting competition and stifling[47] the free exchange of ideas.



Protected species 

This turn of events signals the ebbing[48] of a longstanding[49] special legal protection for the companies. Internet firms in America are shielded[50] from legal responsibility for content posted on their services. Section[51] 230 of the Communications Decency Act of 1996[52] treats them as intermediaries[53], not publishers - to protect them from legal jeopardy[54].


When the online industry was limited to young, vulnerable startups this approach was reasonable. A decade ago content moderation was a straightforward[55] job. Only 100m people used Facebook and its community standards fitted[56] on two pages. But today there are 2.2bn monthly users of Facebook and 1.9bn monthly logged-on users of YouTube. They have become central venues[57] for social interaction[58] and for all manner of[59] expression, from lucid[60] debate and cat videos to conspiracy theories and hate speech.


At first[61] social-media platforms failed to adjust to[62] the magnitude[63] and complexity of the problems their growth and power were creating, saying that they did not want to be the "arbiters[64] of truth". Yet repeatedly in recent years[65] the two companies, as well as Twitter, have been caught flat-footed[66] by reports of abuse and manipulation of their platforms by trolls, hate groups, conspiracy theorists, misinformation[67] peddlers[68], election meddlers[69] and propagandists[70]. In Myanmar journalists and human-rights experts found that misinformation on Facebook was inciting[71] violence against the Muslim Rohingya[72]. In the aftermath of[73] a mass shooting at a school in Parkland, Florida[74], searches about the shooting on YouTube surfaced[75] conspiracy videos alleging it was a hoax[76] involving "crisis actors[77]".


In reaction[78], Facebook and YouTube have sharply increased the resources, both human and technological, dedicated to[79] policing[80] their platforms. By the end of[81] this year Facebook will have doubled the number of employees and contractors[82] dedicated to the "safety and security"[83] of the site, to 20,000, including 10,000 content reviewers. YouTube will have 10,000 people working on content moderation in some form. They take down millions of posts every month from each platform, guided by thick instruction manuals - the guidelines for "search quality" evaluators[84] at Google, for example, run to[85] 164 pages.


Although most of the moderators work for third-party firms, the growth in their numbers has already had an impact on[86] the firms' finances[87]. When Facebook posted disappointing quarterly[88] results in July, causing its market capitalization[89] to drop by over $100bn, higher costs for moderation were partly implicated[90]. Mark Zuckerberg, the firm's chief executive, has said that in the long run[91] the problem of content moderation will have to be solved with[92] artificial intelligence (AI). In the first three months of 2018 Facebook took some form of action on 7.8m pieces of content that included graphic violence, hate speech or terrorist propaganda, twice as many as in the previous three months, mostly owing to[93] improvements in automated[94] detection[95]. But moderating content requires wisdom, and an algorithm is only as judicious[96] as the principles with which it is programmed.


At Facebook's headquarters in Menlo Park, executives instinctively[97] resist[98] making new rules restricting[99] content on free-speech grounds. Many kinds of hateful[100], racist comments are allowed, because they are phrased[101] in such a way as to not specifically target a race, religion or other protected group. Or perhaps they are jokes.


Fake news poses different questions[102]. "We don't remove content just for being false," says Monika Bickert, the firm's head of product policy and counterterrorism[103]. What Facebook can do, instead of removing material, she says, is "down-rank" fake news flagged[104] by external fact-checkers[105], meaning it would be viewed by fewer people, and show real information next to it. In hot spots like Myanmar and Sri Lanka, where misinformation has inflamed[106] violence, posts may be taken down.
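Down-ranking, as Ms Bickert describes it, is a scoring intervention rather than a removal. A minimal sketch of the idea, assuming a hypothetical feed score and penalty multiplier (the article gives no actual figures or mechanics):

# Hypothetical down-ranking: a post flagged false by fact-checkers keeps
# circulating, but with a reduced feed score and real information attached.
FLAGGED_PENALTY = 0.2  # assumed multiplier; the real value is not public

def feed_score(base_score: float, flagged_false: bool) -> float:
    """Return the ranking score that decides how widely a post is shown."""
    return base_score * FLAGGED_PENALTY if flagged_false else base_score

def attach_context(post: dict) -> dict:
    """Show real information (fact-check articles) next to a flagged post."""
    if post.get("flagged_false"):
        post["related_articles"] = post.get("fact_checks", [])
    return post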


YouTube's moderation system is similar to Facebook's, with published guidelines for what is acceptable and detailed instructions for human reviewers. Human monitors decide quickly what to do with content that has been flagged, and most such flagging is done via automated detection. Twitter also uses AI to sniff out[107] fake accounts and some inappropriate content, but it relies more heavily on user reports of harassment[108] and bullying[109].
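On both platforms the workflow amounts to automated detection feeding a human review queue. A rough sketch of that division of labour, with every name and threshold assumed for illustration:

from queue import Queue

review_queue: Queue = Queue()

def auto_flag(post: dict, classifier) -> None:
    """Automated detection: score the post and queue likely violations."""
    if classifier(post["text"]) > 0.9:  # assumed confidence threshold
        review_queue.put(post)

def human_review(decide) -> None:
    """Human monitors quickly rule on whatever the detector flagged."""
    while not review_queue.empty():
        decide(review_queue.get())  # take down, leave up or down-rank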


As social-media platforms police themselves, they will change. They used to be, and still see themselves as, lean and mean[110], keeping employees to a minimum. But Facebook, which has about 25,000 people on its payroll, is likely soon to keep more moderators busy than it has engineers. It and Google[111] may be rich enough to absorb[112] the extra costs and still prosper[113]. Twitter, which is financially weaker, will suffer more.


More profound change is also possible. If misinformation, hate speech and offensive content are so pervasive, critics say, it is because of the firms' business model: advertising. To sell more and more ads, Facebook's algorithms, for instance, have favoured "engaging"[114] content, which can often be the bad kind. YouTube keeps users on its site by offering them ever more interesting videos, which can also be ever more extreme ones. In other words, to really solve the challenge of content moderation, the big social-media platforms may have to say goodbye to the business model which made them so successful.


  1. army: [an ~ of] a large group, horde or multitude.
  2. moderator: a person who moderates; a mediator, arbitrator or regulator.
  3. expand: to enlarge or amplify; to develop (a discussion); to broaden (a meaning).
  4. convene: (of members) to assemble; (of a meeting) to be held.
  5. misinformation: wrong information, often given in a deliberate attempt to mislead.
  6. disturbing: causing worry or upset.
  7. platform: a computer system or piece of software that serves as a base for services.
  8. oversee: to supervise (work or an activity).
  9. exercise: an activity or task carried out for a particular purpose.
  10. exec: (informal) a business executive.
  11. stop: a stopping place; here, a regular stop on a route.
  12. publicity: promotion; public-relations activity.
  13. circuit: a circular route; a regular round of visits or events.
  14. working group (BrE also working party): a team set up to study a problem or carry out a task.
  15. incite: ~ sb (to sth) | ~ sth; to stir up or provoke, usually by exciting or angering people.
  16. take sth down: to remove (a structure); here, to remove a post.
  17. police brutality: excessive or cruel force used by police.
  18. race riot: violent fighting between people of different races living in the same community.
  19. leave sth up: here, to leave a post online.
  20. allege: (often passive, formal) to claim without proof.
  21. Ku Klux Klan (KKK): a secret organization of white Protestant men in the southern United States that promotes violence against black people, Jews and other minorities.
  22. filter down: (of information, decisions, resources) to pass slowly down to lower levels of an organization or group.
  23. instructions: directions, orders or recommended rules for guidance or use.
  24. moderate: to make moderate; to regulate or tone down.
  25. encouraging: giving hope or confidence (opp. discouraging).
  26. peripheral: ~ (to sth) (formal) of secondary importance; marginal.
  27. be thoughtful about: to think carefully and seriously about.
  28. cause for concern: a reason for worry or anxiety.
  29. testify: ~ (against/for sb) | ~ (to/about sth); to give evidence, especially in court or at a hearing.
  30. the Senate Select Intelligence Committee: the US Senate's intelligence oversight committee.
  31. foul-up: (informal) a problem caused by careless mistakes.
  32. operative: (especially US) an agent, especially of a government agency.
  33. seek to do: to attempt or try to do.
  34. pointed question: a sharp, penetrating question.
  35. House committee: a committee of the US House of Representatives.
  36. moderation: (computing) the screening of messages or posts, keeping only those that are acceptable.
  37. hearing: a formal session for listening to testimony; an official inquiry.
  38. empty chair: a podium or chair left empty to highlight someone's absence from a debate; (verb, debating) to draw attention to an absent person in this way.
  39. available: willing and able to attend or serve; (of a candidate) willing to stand.
  40. scrutiny: [U] (formal) close, careful examination or observation (syn. inspection).
  41. et al.: and others (from Latin et alii/alia), used especially after names.
  42. intensify: to become stronger or more severe.
  43. conspiracy theorist: a person who promotes conspiracy theories.
  44. suppress: (usually disapproving) to put down (by authority); to prevent from being published or known.
  45. regulate: to control (an activity or process), especially by means of rules.
  46. Attorney General: a country's chief law officer, who advises its government; in the US, the head of the Department of Justice.
  47. stifle: to suppress or hold back (syn. suppress).
  48. ebb: (of the tide) to recede; (of power or fortune) to decline or fade (away).
  49. longstanding: having existed for a long time.
  50. shield: ~ sb/sth (from sb/sth); to protect or screen.
  51. section: a distinct part (§) of a law, book or document.
  52. The Communications Decency Act of 1996 (CDA): the first notable attempt by the United States Congress to regulate pornographic material on the internet; in 1997, in the landmark case of Reno v. ACLU, the Supreme Court struck down the Act's anti-indecency provisions.
  53. intermediary: ~ (between A and B); a go-between (syn. mediator, go-between).
  54. jeopardy: [U] danger or risk of harm, loss or legal liability.
  55. straightforward: simple, uncomplicated (syn. easy).
  56. fit: to match, conform to or be contained within (a size or space).
  57. venue: the scene or site of an event; a meeting place.
  58. social interaction: the mutual relations and influence between individuals and groups.
  59. all manner of sb/sth: many different types of people or things.
  60. lucid: clear, easily understood (syn. clear).
  61. at first: in the beginning (cf. firstly).
  62. adjust to: to adapt to.
  63. magnitude: [U] (formal) great size, scale or importance.
  64. arbiter: ~ (of sth); a person or institution with the power to judge or settle a matter.
  65. in recent years/months/times: lately.
  66. be caught flat-footed by: to be caught unprepared by.
  67. misinformation: [U] wrong information, often deliberately misleading (see note 5).
  68. peddler: a person who spreads (rumours, misinformation).
  69. meddler: (disapproving) a person who interferes (syn. busybody).
  70. propagandist: (formal, usually disapproving) a person who spreads, usually political, propaganda.
  71. incite: to provoke, stir up or instigate (syn. encourage, urge, rouse).
  72. Rohingya: a Muslim people of Rakhine (Arakan), Myanmar.
  73. in the aftermath of: in the period following (a bad event).
  74. Parkland: a city in Broward County, Florida, USA.
  75. surface: to bring to the surface; to cause to appear or become known.
  76. hoax: a trick in which someone tells people a lie, for example that a picture is genuine when it is not; a fabrication.
  77. crisis actor: a person employed to portray a disaster victim during emergency drills.
  78. in reaction to: in response to.
  79. be dedicated to: to be devoted or allocated to.
  80. police: (verb) to monitor or enforce compliance with rules (syn. monitor).
  81. by the end of: by the close of.
  82. contractor: a person or company that does work under contract for others.
  83. safety and security: safety and protection.
  84. evaluator: a person who evaluates; an assessor.
  85. run to: (of an amount or size) to reach or extend to.
  86. have an impact on: to affect.
  87. finances: [pl.] the money resources of a person, organization or state.
  88. quarterly: occurring every three months; four times a year.
  89. market capitalization (market cap): the total market value of a company's shares.
  90. implicate: [often passive] to involve or connect (in an affair); to be a contributing factor in.
  91. in the long run: eventually, over a long period.
  92. be solved with: to be resolved by means of.
  93. owing to: because of.
  94. automated: operating automatically.
  95. detection: [U] the act of discovering or noticing.
  96. judicious: (formal, approving) showing good judgment and sense.
  97. instinctively: by instinct; reflexively.
  98. resist: to oppose or refuse to accept (syn. oppose).
  99. restrict: ~ sth (to sb); to limit or control by rule or law.
  100. hateful: ~ (to sb) arousing or deserving hate; full of hate.
  101. phrase: ~ sth (as sth); to express in a particular way.
  102. pose a question: to raise a question.
  103. counterterrorism: measures taken against terrorism.
  104. flag: (verb) to mark or signal (content) for attention.
  105. fact-checker: a person or team that verifies facts.
  106. inflame: (formal) to provoke strong feeling; to make (a situation) worse.
  107. sniff out: (informal) to find or detect (information about sb/sth).
  108. harassment: [U] persistent tormenting or pestering.
  109. bullying: [U] the mistreatment of weaker people.
  110. lean and mean: efficient and determined, with nothing wasted.
  111. it and Google: a pronoun paired with a proper noun as a compound subject, which can read awkwardly.
  112. absorb: to bear or take on (costs or losses).
  113. prosper: to thrive or flourish (syn. thrive).
  114. engaging: attractive, charming; holding the attention (syn. winning, attractive, charming).