It all started with a pair of sweatpants. They were gray, shapeless, two sizes too large, with my company’s logo printed on the left hip—a gift given out at our 2019 holiday party. I couldn’t resist sinking into their cozy warmth every chance I got. After three months of enjoying them on weekend bodega runs, they became a much more permanent part of my wardrobe when the pandemic hit. With nowhere to go, I couldn’t muster up the energy to wear anything else.
My sleek leather pants and cool vintage Levi’s only reminded me of how life as we knew it was over.

In the grand scheme of things, this was not a problem. I had my health, my family was safe, and I could easily work from home—it was a privilege to have enough time and energy to even briefly think about clothing. But by April 1, 20 whole days into my quarantine, it felt like I’d been wearing sweats forever. That’s when I had an idea.
I’m a novelist, and at that point, I was 60 pages into writing a murder mystery. I abandoned that project—it was too depressing—and cast around for another topic. Sick of my aforementioned sweatpants, I wanted to dive into a glamorous world. I imagined my new protagonist would be a fashion stylist. (This was truly the very first character detail I came up with.) I missed live music and all the other fun New York City has to offer, so I decided her love interest would be a musician, and they’d have date nights at all my favorite restaurants and bars. I craved travel, so there’d be chapters set in Portland and Miami. Most of all, I wanted to hug my grandparents, so I dreamed up a fabulous matriarch. Two years later, that book hit shelves. It’s called Meant to Be Mine, and it’s about a woman who knows the exact day she’ll meet the love of her life, thanks to a prophecy from her eccentric grandmother.
One of my favorite parts of writing the book was constructing a fictionalized version of New York’s fashion industry. I knew the subject fairly well, thanks to years of interning for fashion magazines and a womenswear designer, as well as reporting on fashion week. I’ve spent my entire career as a writer and editor for lifestyle publications in the city—so while I’m not a fashion industry insider, I’m pretty adjacent. Still, I wanted to learn more, so I started my research.
To flesh out stylist Edie Meyer’s world, first I called Audree Kate López, a stylist living in Manhattan, to get the scoop on what her career looks like behind the scenes. We had crossed paths early on in our careers when I was at Seventeen and she was at Redbook. I’ve been a fan of her work ever since. She has such a knack for styling vibrant, fresh, very New York looks that embody the energy I wanted readers to feel while reading my book.
She told me about the time she styled a pop star with such long, unwieldy nails, she couldn’t put on her own underwear. She talked about a gig styling a rapper who insisted on having lobsters delivered to the set of his photoshoot. Off-camera, López cringed as lobster juice dripped all over the expensive pants she was wearing. I couldn’t resist putting both of those stories in the book. She also considered descriptions of my characters and recommended brands they should wear. (For Edie herself, vintage Versace and Valentino from her grandmother’s closet paired with chunky Lulu Frost jewelry.)
I also used my own experiences in magazines as inspiration. Pre-2020, I went to lots of press previews, which strike me as such a quirky element of the industry. The guests were often familiar to me—typically people who held my same job title at other publications. I could count on there being copious amounts of wine and cheese, and I was always tickled by the unusual perks publicists offered to get busy writers and editors in the door. (I’ve received everything from a dance class led by the Rockettes to Beyoncé tickets.)
In Meant to Be Mine, Edie goes to a press preview and air-kisses the guests she knows: fashion editors, Bachelor contestants-turned-influencers, and “Frank, who does not work in fashion (or seem to work at all), and yet somehow makes an appearance at more industry parties than any of us.” (Don’t we all know a Frank?) She enjoys the brand’s signature cocktail, and after viewing the clothes, listens to a “fireside chat by a renowned career coach,” because what else would a brand specializing in great suits do?
Do you remember what you did with your first real-job paycheck? I mean that check that was the answer to all your prayers. You could finally afford rent, groceries, and happy hour. Maybe you were able to put a down payment on your dream car. Maybe you bought your first house or bought that Fendi bag you'd been eyeing since childhood. (Was that just me?)
Spending that first major paycheck is both a moment of celebration and a way to acknowledge your hard work. It's a dream realized. It can also be the biggest mistake of your life. (And that's okay. Life is about making those mistakes, learning the lessons, and moving on to bigger and better things.)
When I got my first nice-sized paycheck that was over $1,500 after taxes, I spent it on an overseas trip. I'd never been allowed to travel abroad in my teens, and in my 20s I spent the bulk of my paychecks on my half of the rent (roommate life, anyone?) and coping mechanisms for burnout. (Think lots of Hennessy, four-day-a-week club nights, seven-nights-a-week eating out, a few emergency room visits, a couple of run-ins with toxic boyfriends, and impulsive shopping at Century 21, Forever 21, and H&M.)
Let's take a look at how our favorite Black women in sports, music, and entertainment spent their first big paychecks, if only to remind ourselves that they, too, are human and have the usual feelings of power and vulnerability when receiving a large lump sum:
Kerry Washington
The UnPrisoned star shared with The Hollywood Reporter that her first purchase from the proceeds of one of her first major acting gigs was a laptop, and that she hoarded the per diem cash she was given while shooting Save The Last Dance, her second movie role ever, under a mattress.
Issa Rae
Actress, producer, and entrepreneur Issa Rae told BuzzFeed Celeb that she bought a Tesla with her first big paycheck after getting a major role. She also told Us Weekly that she got into an accident a month after buying it and was without a car for a whole year.
Kelly Rowland
Kelly Rowland reportedly did what many of us do when we finally get our hands on a nice sum of money: splurge, especially on things we didn't have easy access to in childhood. She told InStyle that she bought groceries her mom used to tell her were "too expensive" and threw a party where everyone enjoyed the food and had fun. (Same, sis. Same.) She also talked about the lessons she learned from buying a five-bedroom house after becoming a millionaire at 20.
Enjoying the fruits of her Destiny's Child labor, she recalled that the home was "too big" and that she was "too young" to buy such a home. She'd later make informed choices about how she spent her money and used credit cards.
Serena Williams
Both Williams sisters have always acknowledged the valuable money lessons they learned early on from their father. When Serena got her first check, she reportedly took it straight to the bank, rolling up to the drive-through (as if it wasn't $1 million!). She also said in an interview that she wouldn't go pick up her check, so the tour directors at the time eventually had to come and give it to her. Talk about discipline!
Regina King
Regina King is the ultimate legendary actress who has range (from the '80s classic 227 to the '90s cult favorite Boyz n the Hood to the western The Harder They Fall) and looks damn good after decades in Hollywood. She told The Hollywood Reporter that when she got her first big paycheck, she invested in something many of us promise ourselves once we've reached a certain salary or status: a car.
On 1 March 2023, OpenAI made an announcement developers were eagerly anticipating: The company launched the ChatGPT API, giving third-party developers access to the AI model that powers ChatGPT and Microsoft’s Bing Chat.
Access alone is enticing, but OpenAI had an ace up its sleeve—the price. Access to the application programming interface (API) costs just US $0.002 per one thousand tokens (roughly equal to 750 words in English). At that rate, one dollar buys the capacity to handle about 375,000 words of English text.
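For a rough sense of scale, the arithmetic behind that figure is simple enough to sanity-check in a few lines. The constants below are just the launch price and the approximate words-per-token ratio quoted above, nothing more.

```python
# Back-of-the-envelope check of the launch pricing quoted above:
# US $0.002 per 1,000 tokens, at roughly 750 English words per 1,000 tokens.
PRICE_PER_1K_TOKENS_USD = 0.002
WORDS_PER_1K_TOKENS = 750  # rough rule of thumb for English text


def words_per_budget(budget_usd: float) -> float:
    """Approximate number of English words a given budget covers."""
    thousands_of_tokens = budget_usd / PRICE_PER_1K_TOKENS_USD
    return thousands_of_tokens * WORDS_PER_1K_TOKENS


print(words_per_budget(1.00))  # -> 375000.0, matching the figure above
```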
“GPT-3.5 Turbo is a huge improvement over the existing GPT-3. It’s extremely rare for a company to release a new version of its API that’s both 10x cheaper and 5x faster,” says Hassan El Mghari, a senior developer advocate at Vercel. “That’s a 50x improvement, unheard of.”
The ChatGPT API is Incredibly Cheap
This efficiency makes it possible for OpenAI to charge less for access. Improved affordability is, of course, a win for developers, but the scale of GPT-3.5 Turbo’s price cut relative to its predecessors is more than a nice discount. It opens opportunities to bring AI features to apps that previously couldn’t even begin to justify the cost.
The ChatGPT API doesn’t provide access to ChatGPT itself but instead to the model it uses: GPT-3.5 Turbo. While the exact differences between GPT-3.5 and GPT-3.5 Turbo are unclear (OpenAI, contrary to its name, doesn’t open-source its models), its use in ChatGPT suggests the model is much more efficient than those previously available.
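As a concrete illustration, a minimal request to the ChatGPT API looked like the sketch below at launch, using the launch-era openai Python package (pre-1.0 versions). The prompt text is only a placeholder, and the snippet assumes an OPENAI_API_KEY environment variable is set.

```python
# Minimal sketch of a ChatGPT API (gpt-3.5-turbo) request with the
# launch-era openai Python package (pre-1.0). Assumes OPENAI_API_KEY
# is set in the environment; the prompt is a placeholder.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what an API token is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```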
“Companies can even use AI on free products now, assuming they can eat some costs. Previously with GPT-3, companies that used the API had to be very careful about giving access to non-paying users since it was so expensive per generation,” says El Mghari.
GPT-3.5 Turbo’s reach extends beyond developers who want to add an AI chatbot to their app or service. OpenAI’s blog post claims that GPT-3.5 Turbo’s low cost and improved performance make it a match for a wide variety of uses, including many previously enabled by GPT-3.5.
“Due to ChatGPT’s rise in popularity because of its chat format, people tend to have a perception that ChatGPT API can only be used in this casual format,” says Chanyeol Choi, the CEO and co-founder of Publishd. “OpenAI now wants its customers to know that ChatGPT API (gpt-3.5-turbo) can be used in a less casual, non-chat format.”
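In practice, “non-chat” use simply means sending a single-turn request whose one message carries the whole task. The hedged sketch below shows a one-shot summarization call; the prompt wording and document text are illustrative, not from the article.

```python
# Hedged sketch: using the chat-format API for a one-shot, non-conversational
# task (summarization). Document text and prompt wording are illustrative.
import openai

document = "Paste the full text to summarize here."

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": f"Summarize the following text in two sentences:\n\n{document}",
        },
    ],
)

summary = response.choices[0].message.content
print(summary)
```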
This connects with two other announcements made alongside the release of the ChatGPT API—longer context windows and the ability to pin the model snapshot.
Longer context windows allow developers to process more tokens, which, in practice, translates to more text. Kyle Shannon, the CEO and founder of Storyvine, says OpenAI’s best dedicated server plans can handle up to 32,000 tokens, which helps developers process much larger chunks of text. The model snapshot, meanwhile, lets developers lock down a version of the model to ensure consistency. “We’ll go from ‘you can perform miracles on some documents’ to ‘perform miracles on any data in any configuration’ within 3 years,” says Shannon.
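For the snapshot pinning Shannon describes, the idea is to request a dated model name rather than the moving "gpt-3.5-turbo" alias. The sketch below assumes the "gpt-3.5-turbo-0301" snapshot available at launch and uses a temperature of 0 as one common way to push toward repeatable output.

```python
# Hedged sketch: pinning the dated "gpt-3.5-turbo-0301" snapshot (the one
# available at launch) instead of the rolling "gpt-3.5-turbo" alias, so
# results stay consistent even after OpenAI updates the default model.
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0301",  # pinned snapshot, not the moving alias
    messages=[
        {
            "role": "user",
            "content": "Label this ticket as billing, bug, or other: 'My invoice is wrong.'",
        },
    ],
    temperature=0,  # reduce sampling variation for more repeatable outputs
)
```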